From paulasan at gmail.com  Thu Nov  1 11:48:29 2018
From: paulasan at gmail.com (Paula Sanematsu)
Date: Thu, 1 Nov 2018 12:48:29 -0400
Subject: [petsc-users] PetscFinalize at root processor only
Message-ID:

Hi,

I have a Fortran code that calls an external function only at the root processor. If this external function returns an error code indicating that it failed, then I call PetscFinalize() only at the root processor. For example:

if(rank==0) then
    metis_call_status = METIS_SetDefaultOptions(opts)
    if( metis_call_status /= 1 ) then
        call PetscFinalize(ierr)
        stop
    end if
end if

I noticed that this causes the other processors to "hang". I don't want to change the METIS call. Do you have a suggestion on how to propagate PetscFinalize to the other processors? Or another solution?

Thank you,

Paula

From bsmith at mcs.anl.gov  Thu Nov  1 12:26:59 2018
From: bsmith at mcs.anl.gov (Smith, Barry F.)
Date: Thu, 1 Nov 2018 17:26:59 +0000
Subject: [petsc-users] PetscFinalize at root processor only
In-Reply-To:
References:
Message-ID: <1C46F0D4-745F-423E-9754-8B5000615396@anl.gov>

   The only way to do it is to put a broadcast after the call to METIS on process 0 and have all the other processes wait for the result; if it is not 1, then all processes call PetscFinalize() at the same time.

   Barry

> On Nov 1, 2018, at 11:48 AM, Paula Sanematsu via petsc-users wrote:
>
> [...]

From jed at jedbrown.org  Thu Nov  1 12:52:31 2018
From: jed at jedbrown.org (Jed Brown)
Date: Thu, 01 Nov 2018 11:52:31 -0600
Subject: [petsc-users] PetscFinalize at root processor only
In-Reply-To:
References:
Message-ID: <87zhusit28.fsf@jedbrown.org>

PetscFinalize is collective, similar to MPI_Finalize.  You could MPI_Bcast the result if you want to cleanly exit.  If you want to raise an error, you could use SETERRA as in other PETSc Fortran examples.

Paula Sanematsu via petsc-users writes:

> [...]
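A minimal sketch of the pattern Barry and Jed describe: rank 0 makes the METIS call, every rank receives the status through a broadcast, and PetscFinalize() is then reached collectively. Paula's snippet is Fortran; this sketch uses the C bindings, and the helper name and stand-alone structure are illustrative, not from the thread:

#include <stdlib.h>
#include <petscsys.h>
#include <metis.h>

/* Sketch: call METIS on rank 0 only, broadcast the status, and
   finalize on ALL ranks if it failed.  Assumes PetscInitialize()
   has been called and opts is a valid METIS options array. */
PetscErrorCode CheckMetisStatus(idx_t *opts)
{
  PetscMPIInt    rank;
  PetscInt       metis_call_status = 1;          /* METIS_OK */
  PetscErrorCode ierr;

  ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&rank);CHKERRQ(ierr);
  if (!rank) metis_call_status = METIS_SetDefaultOptions(opts);
  /* collective: every process learns the result computed on rank 0 */
  ierr = MPI_Bcast(&metis_call_status,1,MPIU_INT,0,PETSC_COMM_WORLD);CHKERRQ(ierr);
  if (metis_call_status != 1) {
    ierr = PetscFinalize();CHKERRQ(ierr);        /* now reached by all ranks */
    exit(0);
  }
  return 0;
}

The same shape works in Fortran: keep the METIS call inside the if(rank==0) block, then call MPI_Bcast(metis_call_status,1,MPI_INTEGER,0,PETSC_COMM_WORLD,ierr) before testing the status on every rank.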
From mfadams at lbl.gov  Thu Nov  1 15:42:22 2018
From: mfadams at lbl.gov (Mark Adams)
Date: Thu, 1 Nov 2018 16:42:22 -0400
Subject: [petsc-users] DIVERGED_NANORING with PC GAMG
In-Reply-To:
References: <34D20379-DEAC-4E98-8FA8-E7EC2831BE6D@ic.ac.uk> <991561e5-1b9a-dbb2-333d-501251967077@imperial.ac.uk>
Message-ID:

On Wed, Oct 31, 2018 at 8:11 PM Smith, Barry F. wrote:

> > On Oct 31, 2018, at 5:39 PM, Appel, Thibaut via petsc-users wrote:
> >
> > Well yes naturally for the residual but adding -ksp_true_residual just gives
> >
> > 0 KSP unpreconditioned resid norm 3.583290589961e+00 true resid norm 3.583290589961e+00 ||r(i)||/||b|| 1.000000000000e+00
> > 1 KSP unpreconditioned resid norm 0.000000000000e+00 true resid norm 3.583290589961e+00 ||r(i)||/||b|| 1.000000000000e+00
> > Linear solve converged due to CONVERGED_ATOL iterations 1
>
>    Very bad stuff is happening in the preconditioner. The preconditioner must have a null space (which it shouldn't have, to be a useful preconditioner).

Yea, you are far away from an optimal preconditioner for this system. Low-frequency (indefinite) Helmholtz is very, very hard. Now, something very bad is going on here, but even if you fix it, standard AMG is not good for these problems. I would use direct solvers or grind away at it with ILU.

> > Mark - if that helps - a Poisson equation is used for the pressure so the Helmholtz is the same as for the velocity in the interior.
> >
> > Thibaut
> >
> >> On 31 Oct 2018, at 21:05, Mark Adams wrote:
> >>
> >> These are indefinite (bad) Helmholtz problems. Right?
> >>
> >> On Wed, Oct 31, 2018 at 2:38 PM Matthew Knepley wrote:
> >> On Wed, Oct 31, 2018 at 2:13 PM Thibaut Appel wrote:
> >> Hi Mark, Matthew,
> >>
> >> Thanks for taking the time.
> >>
> >> 1) You're not suggesting having -fieldsplit_X_ksp_type fgmres for each field, are you?
> >>
> >> 2) No, the matrix has pressure in one of the fields. Here it's a 2D problem (but we're also doing 3D), the unknowns are (p,u,v) and those are my 3 fields. We are dealing with subsonic/transonic flows so it is convection dominated indeed.
> >>
> >> 3) We are in frequency domain with respect to time, i.e. \partial{phi}/\partial{t} = -i*omega*phi.
> >>
> >> 4) Hypre is unfortunately not an option since we are in complex arithmetic.
> >>
> >>> I'm not sure about "-fieldsplit_pc_type gamg". GAMG should work on one block, and hence be a subpc. I'm not up on fieldsplit syntax.
> >>
> >> According to the online manual page this syntax applies the suffix to all the defined fields?
> >>
> >>> Mark is correct. I wanted you to change the smoother. He shows how to change it to Richardson (make sure you add the self-scale option), which is probably the best choice.
> >>>
> >>>   Thanks,
> >>>
> >>>     Matt
> >>
> >> You did tell me to set it to GMRES if I'm not mistaken, that's why I tried "-fieldsplit_mg_levels_ksp_type gmres" (mentioned in the email). Also, it wasn't clear whether these should be applied to each block or the whole system, as the online manual pages + .pdf manual barely mention smoothers and how to manipulate MG objects with KSP/PC, especially with PCFIELDSPLIT where examples are scarce.
> >>
> >> From what I can gather from your suggestions I tried (lines with X are repeated for X={0,1,2})
> >>
> >> This looks good. How can an identically zero vector produce a 0 residual?
> >> You should always monitor with -ksp_monitor_true_residual.
> >>
> >>   Thanks,
> >>
> >>     Matt
> >>
> >> -ksp_view_pre -ksp_monitor -ksp_converged_reason \
> >> -ksp_type fgmres -ksp_rtol 1.0e-8 \
> >> -pc_type fieldsplit \
> >> -pc_fieldsplit_type multiplicative \
> >> -pc_fieldsplit_block_size 3 \
> >> -pc_fieldsplit_0_fields 0 \
> >> -pc_fieldsplit_1_fields 1 \
> >> -pc_fieldsplit_2_fields 2 \
> >> -fieldsplit_X_pc_type gamg \
> >> -fieldsplit_X_ksp_type gmres \
> >> -fieldsplit_X_ksp_rtol 1e-10 \
> >> -fieldsplit_X_mg_levels_ksp_type richardson \
> >> -fieldsplit_X_mg_levels_pc_type sor \
> >> -fieldsplit_X_pc_gamg_agg_nsmooths 0 \
> >> -fieldsplit_X_mg_levels_ksp_richardson_self_scale \
> >> -log_view
> >>
> >> which yields
> >>
> >> KSP Object: 1 MPI processes
> >>   type: fgmres
> >>     restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
> >>     happy breakdown tolerance 1e-30
> >>   maximum iterations=10000, initial guess is zero
> >>   tolerances: relative=1e-08, absolute=1e-50, divergence=10000.
> >>   left preconditioning
> >>   using DEFAULT norm type for convergence test
> >> PC Object: 1 MPI processes
> >>   type: fieldsplit
> >>   PC has not been set up so information may be incomplete
> >>     FieldSplit with MULTIPLICATIVE composition: total splits = 3, blocksize = 3
> >>     Solver info for each split is in the following KSP objects:
> >>   Split number 0 Fields 0
> >>   KSP Object: (fieldsplit_0_) 1 MPI processes
> >>     type: preonly
> >>     maximum iterations=10000, initial guess is zero
> >>     tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
> >>     left preconditioning
> >>     using DEFAULT norm type for convergence test
> >>   PC Object: (fieldsplit_0_) 1 MPI processes
> >>     type not yet set
> >>     PC has not been set up so information may be incomplete
> >>   Split number 1 Fields 1
> >>   KSP Object: (fieldsplit_1_) 1 MPI processes
> >>     type: preonly
> >>     maximum iterations=10000, initial guess is zero
> >>     tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
> >>     left preconditioning
> >>     using DEFAULT norm type for convergence test
> >>   PC Object: (fieldsplit_1_) 1 MPI processes
> >>     type not yet set
> >>     PC has not been set up so information may be incomplete
> >>   Split number 2 Fields 2
> >>   KSP Object: (fieldsplit_2_) 1 MPI processes
> >>     type: preonly
> >>     maximum iterations=10000, initial guess is zero
> >>     tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
> >>     left preconditioning
> >>     using DEFAULT norm type for convergence test
> >>   PC Object: (fieldsplit_2_) 1 MPI processes
> >>     type not yet set
> >>     PC has not been set up so information may be incomplete
> >>   linear system matrix = precond matrix:
> >>   Mat Object: 1 MPI processes
> >>     type: seqaij
> >>     rows=52500, cols=52500
> >>     total: nonzeros=1127079, allocated nonzeros=1128624
> >>     total number of mallocs used during MatSetValues calls =0
> >>       not using I-node routines
> >> 0 KSP Residual norm 3.583290589961e+00
> >> 1 KSP Residual norm 0.000000000000e+00
> >> Linear solve converged due to CONVERGED_ATOL iterations 1
> >>
> >> so something must not be set correctly. The solution is identically zero everywhere.
> >>
> >> Is that option list what you meant? If you could let me know what should be corrected.
> >>
> >> Thanks for your support,
> >>
> >> Thibaut
> >>
> >> On 31/10/2018 16:43, Mark Adams wrote:
> >>>
> >>> On Tue, Oct 30, 2018 at 5:23 PM Appel, Thibaut via petsc-users wrote:
> >>> Dear users,
> >>>
> >>> Following a suggestion from Matthew Knepley I've been trying to apply fieldsplit/gamg for my set of PDEs but I'm still encountering issues despite various tests. pc_gamg simply won't start.
> >>> Note that direct solvers always yield the correct, physical result.
> >>> Removing the fieldsplit to focus on the gamg bit and trying to solve the linear system on a modest size problem still gives, with
> >>>
> >>> '-ksp_monitor -ksp_rtol 1.0e-10 -ksp_gmres_restart 300 -ksp_type gmres -pc_type gamg'
> >>>
> >>> [3]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> >>> [3]PETSC ERROR: Petsc has generated inconsistent data
> >>> [3]PETSC ERROR: Have un-symmetric graph (apparently). Use '-(null)pc_gamg_sym_graph true' to symetrize the graph or '-(null)pc_gamg_threshold -1' if the matrix is structurally symmetric.
> >>>
> >>> And since then, after adding '-pc_gamg_sym_graph true' I have been getting
> >>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> >>> [0]PETSC ERROR: Petsc has generated inconsistent data
> >>> [0]PETSC ERROR: Eigen estimator failed: DIVERGED_NANORINF at iteration
> >>>
> >>> -ksp_chebyshev_esteig_noisy 0/1 does not change anything
> >>>
> >>> Knowing that the Chebyshev eigen estimator needs a positive spectrum I tried '-mg_levels_ksp_type gmres' but iterations would just go on endlessly.
> >>>
> >>> This is OK, but you need to use '-ksp_type fgmres' (this could be why it is failing ...).
> >>>
> >>> It looks like your matrix is 1) just the velocity field and 2) very unsymmetric (eg, convection dominated). I would start with '-mg_levels_ksp_type richardson -mg_levels_pc_type sor'.
> >>>
> >>> I would also start with unsmoothed aggregation: '-pc_gamg_agg_nsmooths 0'
> >>>
> >>> It seems that I have indeed eigenvalues of rather high magnitude in the spectrum of my operator without being able to determine the reason.
> >>> The eigenvectors look like small artifacts at the wall-inflow or wall-outflow corners with zero anywhere else, but I do not know how to interpret this.
> >>> Equations are time-harmonic linearized Navier-Stokes to which a forcing is applied, there's no time-marching.
> >>>
> >>> You mean you are in frequency domain?
> >>>
> >>> Matrix is formed with a MPIAIJ type. The formulation is incompressible, in complex arithmetic and the 2D physical domain is mapped to a logically rectangular,
> >>>
> >>> This kind of messes up the null space that AMG depends on, but AMG theory is gone for NS anyway.
> >>>
> >>> regular collocated grid with a high-order finite difference method.
> >>> I determine the ownership of the rows/degrees of freedom of the matrix with PetscSplitOwnership and I'm not using DMDA.
> >>>
> >>> Our iterative solvers are probably not going to work well on this, but you should test hypre also (-pc_type hypre -pc_hypre_type boomeramg). You need to configure PETSc to download hypre.
> >>>
> >>> Mark
> >>>
> >>> The Fortran application code is memory-leak free and has undergone a strict verification/validation procedure for different variations of the PDEs.
> >>>
> >>> If there's any problem with the matrix, what could help with the diagnostic? At this point I'm running out of ideas so I would really appreciate additional suggestions and discussions.
> >>>
> >>> Thanks for your continued support,
> >>>
> >>> Thibaut
> >>
> >> --
> >> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> >> -- Norbert Wiener
> >>
> >> https://www.cse.buffalo.edu/~knepley/

From t.appel17 at imperial.ac.uk  Thu Nov  1 16:16:02 2018
From: t.appel17 at imperial.ac.uk (Appel, Thibaut)
Date: Thu, 1 Nov 2018 21:16:02 +0000
Subject: [petsc-users] DIVERGED_NANORING with PC GAMG
In-Reply-To:
References: <34D20379-DEAC-4E98-8FA8-E7EC2831BE6D@ic.ac.uk> <991561e5-1b9a-dbb2-333d-501251967077@imperial.ac.uk>
Message-ID: <2272CD83-71BD-4F5E-965C-196EDA611A56@ic.ac.uk>

What does "low frequency (indefinite) Helmholtz" mean exactly? (How is my problem low-frequency Helmholtz?)

Thibaut

On 1 Nov 2018, at 20:42, Mark Adams wrote:

> [...]

From jed at jedbrown.org  Thu Nov  1 17:13:34 2018
From: jed at jedbrown.org (Jed Brown)
Date: Thu, 01 Nov 2018 16:13:34 -0600
Subject: [petsc-users] DIVERGED_NANORING with PC GAMG
In-Reply-To: <2272CD83-71BD-4F5E-965C-196EDA611A56@ic.ac.uk>
References: <34D20379-DEAC-4E98-8FA8-E7EC2831BE6D@ic.ac.uk> <991561e5-1b9a-dbb2-333d-501251967077@imperial.ac.uk> <2272CD83-71BD-4F5E-965C-196EDA611A56@ic.ac.uk>
Message-ID: <87k1lwigz5.fsf@jedbrown.org>

"Appel, Thibaut via petsc-users" writes:

> What does "low frequency (indefinite) Helmholtz" mean exactly? (How is my problem low-frequency Helmholtz?)

What equation are you solving?  Mark probably means "high frequency" because that is the difficult case.  The low frequency limit is not so bad.

From jed at jedbrown.org  Thu Nov  1 18:02:04 2018
From: jed at jedbrown.org (Jed Brown)
Date: Thu, 01 Nov 2018 17:02:04 -0600
Subject: [petsc-users] DIVERGED_NANORING with PC GAMG
In-Reply-To: <373323C3-E788-40D0-833D-67BB9C6FDA6B@imperial.ac.uk>
References: <739CB2F7-9DD2-4E0E-A19A-673524B7BE73@ic.ac.uk> <87y3bv8rvg.fsf@jedbrown.org> <31788D7F-9B66-4170-ADAD-EDD39FFC7E38@imperial.ac.uk> <62703A53-8798-445E-92BA-FC6EA8807F6E@ic.ac.uk> <87h8ihwruf.fsf@jedbrown.org> <3d17a345-bb4f-fec0-aed4-56ba7c1cd89c@imperial.ac.uk> <7DF9863D-945E-4C72-9F70-92336ECBA2A4@ic.ac.uk> <87k1n1ihfo.fsf@jedbrown.org> <373323C3-E788-40D0-833D-67BB9C6FDA6B@imperial.ac.uk>
Message-ID: <87bm78ieqb.fsf@jedbrown.org>

"Appel, Thibaut" writes:

> Jed, Mark,
>
> Attached are the equations I'm solving.
> p is perturbation pressure (unknown)
> \bar{u} is a base flow (KNOWN) that carries the perturbation velocity field we're solving for too: u (unknown)

This looks like a frequency domain problem associated with incompressible flow.  The real-valued part alone is a generalized saddle point problem and not suitable for direct application of algebraic multigrid.  (It's common to use fieldsplit first.)  The imaginary shift is not as bad as a negative shift (the undamped case for Helmholtz).

> -> Unknown vector is (p,u,v)^T
>
> The right hand side intervenes at the wall where we apply some forcing.
>
> What do you mean by low-frequency Helmholtz?

The Helmholtz equation is the frequency domain acoustic wave equation.  High frequency usually means that the number of grid points per wavelength is small; consequently the smallest eigenvalues correspond to relatively oscillatory eigenvectors -- this makes multigrid-type approximation difficult in theory and practice.

> (Please re-cc petsc-users if you wish,
> I was told a long time ago to not include attachments on mailing lists)
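For reference, the textbook relationship behind Jed's explanation, written out in LaTeX (standard material, not taken from the thread): the time-harmonic ansatz turns the acoustic wave equation into the Helmholtz equation, and "points per wavelength" measures how well a grid of spacing h resolves the wavenumber k.

% Wave equation with the ansatz u(x,t) = e^{-i\omega t} v(x):
\[
  u_{tt} = c^2 \Delta u
  \quad\Longrightarrow\quad
  \Delta v + k^2 v = 0, \qquad k = \frac{\omega}{c},
\]
% and a grid of spacing h provides
\[
  \frac{\lambda}{h} = \frac{2\pi}{kh}
\]
% points per wavelength.  "High frequency" means kh = O(1): the modes
% with the smallest eigenvalues are themselves oscillatory, so coarse
% grids cannot represent them -- the hard regime for multigrid.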
From jed at jedbrown.org  Fri Nov  2 09:58:14 2018
From: jed at jedbrown.org (Jed Brown)
Date: Fri, 02 Nov 2018 08:58:14 -0600
Subject: [petsc-users] Assistant Professorship in Computational Earth-Surface Process Modeling at CU Boulder
Message-ID: <87wopvfrw9.fsf@jedbrown.org>

The Institute of Arctic and Alpine Research (INSTAAR) and Community Surface Dynamics Modeling System (CSDMS) at the University of Colorado invite applications for a tenure-track assistant professor position in Computational Earth-Surface Process Modeling, with an August 2019 start. CSDMS is an NSF-sponsored facility that supports modeling and prediction of erosion, transport, and deposition of sediment in landscapes and sedimentary basins over a broad range of environments and time and space scales. The candidate will join a team at INSTAAR at the University of Colorado, Boulder, responsible for promoting the growth of CSDMS by developing multidisciplinary and international collaboration in support of earth surface processes modeling. We seek a scientist who takes advantage of new opportunities in computing and remote sensing to advance understanding of landscape, coastal, and/or seascape geomorphology and sediment dynamics. Disciplinary areas of particular interest include, but are not limited to: modeling of terrestrial, coastal, cryospheric, and/or marine processes; use of high-resolution topographic data to test process models; earth surface interactions with the climate system; modeling human impacts; high-performance computing; and community scientific software development.

CSDMS is hosted at INSTAAR. The new assistant professor would be rostered in the Research & Innovation Office (RIO) and also become a tenure-track professor in one of INSTAAR's affiliated academic departments and programs that best suits their training and interests (Geological Sciences; Civil, Environmental, and Architectural Engineering; Atmospheric and Ocean Sciences; Computer Science). A strong commitment to teaching and mentorship at the undergraduate and graduate levels, and to outreach activities, is expected. The candidate is expected to engage in development of coursework in his or her area of research expertise, and to teach at the undergraduate and graduate level in the home department. Incorporation of computing skills in undergraduate and graduate curricula is desirable.

Applicants should have a Ph.D. in geosciences, engineering, or a related field at the time of appointment. We seek a strong team player with excellent communication and networking skills, who is focused on achieving multidisciplinary research goals. We welcome candidates who will bring diverse intellectual, geographical, gender, and ethnic perspectives to CSDMS and the University of Colorado campus community.
For further information and to apply: https://jobs.colorado.edu/jobs/JobDetail/?jobId=13976

From barrydog505 at gmail.com  Fri Nov  2 09:58:17 2018
From: barrydog505 at gmail.com (陳宗興)
Date: Fri, 2 Nov 2018 22:58:17 +0800
Subject: [petsc-users] How to use PetscViewerVTKGetDM
Message-ID:

Hi,

I have created a DMPlex using DMPlexCreateFromFile, and I used PetscObjectSetName to set the name of the DM. I have run into a problem when I try to use PetscViewerVTKGetDM. This is what I write:

DM dm;
...
PetscObjectSetName((PetscObject) dm, "Mesh");
...
PetscViewerVTKGetDM(viewer, (PetscObject) dm);

And here is the warning output:

warning: passing argument 2 of 'PetscViewerVTKGetDM' from incompatible pointer type [-Wincompatible-pointer-types]
  ierr = PetscViewerVTKGetDM(viewer, (PetscObject) dm);CHKERRQ(ierr);
                                     ^
note: expected 'struct _p_PetscObject **' but argument is of type 'struct _p_PetscObject *'
  PETSC_EXTERN PetscErrorCode PetscViewerVTKGetDM(PetscViewer,PetscObject*);
                              ^~~~~~~~~~~~~~~~~~~

How can I fix it?

Thank you,

Barry

From knepley at gmail.com  Fri Nov  2 11:14:05 2018
From: knepley at gmail.com (Matthew Knepley)
Date: Fri, 2 Nov 2018 12:14:05 -0400
Subject: [petsc-users] How to use PetscViewerVTKGetDM
In-Reply-To:
References:
Message-ID:

On Fri, Nov 2, 2018 at 12:11 PM 陳宗興 via petsc-users wrote:

> [...]

1) GetDM() returns a DM, so you would need PetscViewerVTKGetDM(viewer, (PetscObject *) &dm)

2) However, why are you trying to pull out a DM? It does not look like you want to do this.

  Thanks,

    Matt

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
From wence at gmx.li  Fri Nov  2 11:20:11 2018
From: wence at gmx.li (Lawrence Mitchell)
Date: Fri, 2 Nov 2018 16:20:11 +0000
Subject: [petsc-users] How to use PetscViewerVTKGetDM
In-Reply-To:
References:
Message-ID:

> On 2 Nov 2018, at 14:58, 陳宗興 via petsc-users wrote:
>
> [...]
> PetscViewerVTKGetDM(viewer, (PetscObject) dm);
> [...]

You need to pass a pointer to the PetscObject:

PetscViewerVTKGetDM(viewer, (PetscObject *)&dm);

Cheers,

Lawrence

From barrydog505 at gmail.com  Fri Nov  2 12:05:37 2018
From: barrydog505 at gmail.com (陳宗興)
Date: Sat, 3 Nov 2018 01:05:37 +0800
Subject: [petsc-users] How to use PetscViewerVTKGetDM
In-Reply-To:
References:
Message-ID:

I want to create a VTK file with DMPlex and Vec, and I found these functions: PetscViewerVTKOpen, PetscViewerVTKGetDM, and PetscViewerVTKWriteFunction. Are these what I need, or maybe something else?

Thank you,

Barry

On Sat, 3 Nov 2018 at 12:14 AM, Matthew Knepley wrote:

> [...]

From Tristan.Konolige at Colorado.EDU  Fri Nov  2 13:16:50 2018
From: Tristan.Konolige at Colorado.EDU (Tristan Konolige)
Date: Fri, 2 Nov 2018 12:16:50 -0600
Subject: [petsc-users] Poor Convergence for Non-Linear Least Squares using NTR
Message-ID:

Hello All,

I'm trying to optimize a nonlinear least squares problem using PETSc. I'd like to use Levenberg-Marquardt, but, as it is not available, I am using Tao's NTR with J^T J as the Hessian. The NTR is converging very slowly (compared to an LM solver in Ceres, a nonlinear optimization package). How can I debug this poor performance? Is there something I am missing for solving nonlinear least squares in PETSc?

Thanks,

Tristan Konolige
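One way to provide the Gauss-Newton matrix J^T J that Tristan mentions, without assembling it, is MatCreateNormal(), which wraps a Jacobian in an implicit normal-equations operator. A minimal sketch under that assumption (the function name is hypothetical, and the Tao setup around it is not shown in the thread):

#include <petscmat.h>

/* Sketch: wrap an assembled residual Jacobian J in an implicit
   operator N = J^T J, e.g. for use as a Gauss-Newton Hessian.
   N applies J and then J^T; the product is never stored. */
PetscErrorCode MakeGaussNewtonHessian(Mat J, Mat *N)
{
  PetscErrorCode ierr;
  ierr = MatCreateNormal(J,N);CHKERRQ(ierr);
  return 0;
}

Note that most preconditioners cannot factor an implicit operator, so the usual pairing is a Krylov method with a cheap PC, or a separately assembled approximation for the preconditioner; that alone can explain slow convergence relative to a tuned LM solver.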
From knepley at gmail.com  Fri Nov  2 12:10:01 2018
From: knepley at gmail.com (Matthew Knepley)
Date: Fri, 2 Nov 2018 13:10:01 -0400
Subject: [petsc-users] How to use PetscViewerVTKGetDM
In-Reply-To:
References:
Message-ID:

On Fri, Nov 2, 2018 at 1:06 PM 陳宗興 wrote:

> I want to create a VTK file with DMPlex and Vec, and I found these functions:
> PetscViewerVTKOpen,

Call VTKOpen() to get a PetscViewer object

> PetscViewerVTKGetDM, and PetscViewerVTKWriteFunction

Forget these.

Call VecView(v, viewer) with your Vec and VTK Viewer.

  Thanks,

    Matt

> Are these what I need, or maybe something else?
>
> [...]

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
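A minimal sketch of what Matt describes: open a VTK viewer and hand it the vector. The file and field names are illustrative, not from the thread; for a DMPlex-attached Vec a .vtu file is the natural extension, and PetscViewerVTKOpen() picks the format from it:

#include <petscdmplex.h>

/* Sketch: write a DMPlex-attached global Vec to a VTK (.vtu) file.
   Assumes dm came from DMPlexCreateFromFile() and v from
   DMCreateGlobalVector(dm,&v), as in this thread. */
PetscErrorCode WriteSolutionVTK(DM dm, Vec v)
{
  PetscViewer    viewer;
  PetscErrorCode ierr;

  ierr = PetscObjectSetName((PetscObject)v,"solution");CHKERRQ(ierr); /* field name in the file */
  ierr = PetscViewerVTKOpen(PetscObjectComm((PetscObject)dm),"mesh.vtu",FILE_MODE_WRITE,&viewer);CHKERRQ(ierr);
  ierr = VecView(v,viewer);CHKERRQ(ierr);   /* writes the mesh and the field together */
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
  return 0;
}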
From barrydog505 at gmail.com  Fri Nov  2 12:14:54 2018
From: barrydog505 at gmail.com (陳宗興)
Date: Sat, 3 Nov 2018 01:14:54 +0800
Subject: [petsc-users] How to use PetscViewerVTKGetDM
In-Reply-To:
References:
Message-ID:

I will try it. Thanks a lot.

barry

On Sat, 3 Nov 2018 at 1:10 PM, Matthew Knepley wrote:

> [...]

From Moritz.Huck at isea.rwth-aachen.de  Sat Nov  3 04:56:23 2018
From: Moritz.Huck at isea.rwth-aachen.de (Moritz.Huck at isea.rwth-aachen.de)
Date: Sat, 3 Nov 2018 09:56:23 +0000
Subject: [petsc-users] Solution "jumps" after setting timestep
Message-ID:

Hi,

I am using the IMEX Runge-Kutta scheme to solve an implicit DAE (G=0). When I manually set the time step (TSSetTimeStep) my solution jumps. If I go from a steady state to the same state, it oscillates a few times and comes back to the steady solution.

Can this be prevented?

Thank you,

Moritz

From emconsta at anl.gov  Sat Nov  3 09:42:58 2018
From: emconsta at anl.gov (Constantinescu, Emil M.)
Date: Sat, 3 Nov 2018 14:42:58 +0000
Subject: [petsc-users] Solution "jumps" after setting timestep
In-Reply-To:
References:
Message-ID: <73f49206-20b7-8fb1-5f41-6e920cdae7eb@anl.gov>

On 11/3/18 4:56 AM, Moritz.Huck--- via petsc-users wrote:
> Hi,
>
> I am using the IMEX Runge-Kutta scheme to solve an implicit DAE (G=0).
> When I manually set the time step (TSSetTimeStep) my solution jumps.

When you don't set the time step, does it still oscillate? If not, can you check if it oscillates when reducing the fixed initial time step? Did you turn the adaptivity off (-ts_adapt_type none)?

Emil

> If I go from a steady state to the same state, it oscillates a few times and comes back to the steady solution.
>
> Can this be prevented?
>
> Thank you,
>
> Moritz

From jed at jedbrown.org  Sat Nov  3 09:53:59 2018
From: jed at jedbrown.org (Jed Brown)
Date: Sat, 03 Nov 2018 08:53:59 -0600
Subject: [petsc-users] Solution "jumps" after setting timestep
In-Reply-To:
References:
Message-ID: <87muqqfbzs.fsf@jedbrown.org>

What is the index of your DAE?  Can you show us code that exhibits this behavior?

"Moritz.Huck--- via petsc-users" writes:

> Hi,
>
> I am using the IMEX Runge-Kutta scheme to solve an implicit DAE (G=0).
> When I manually set the time step (TSSetTimeStep) my solution jumps.
>
> If I go from a steady state to the same state, it oscillates a few times and comes back to the steady solution.
>
> Can this be prevented?
>
> Thank you,
>
> Moritz

From Moritz.Huck at isea.rwth-aachen.de  Sun Nov  4 06:19:10 2018
From: Moritz.Huck at isea.rwth-aachen.de (Moritz.Huck at isea.rwth-aachen.de)
Date: Sun, 4 Nov 2018 12:19:10 +0000
Subject: [petsc-users] Solution "jumps" after setting timestep
In-Reply-To: <73f49206-20b7-8fb1-5f41-6e920cdae7eb@anl.gov>
References: <73f49206-20b7-8fb1-5f41-6e920cdae7eb@anl.gov>
Message-ID:

Hi,

I am using the basic adaptor. If I let the adaptor handle everything, it does not oscillate. I don't mean setting the timestep at the start (which does not produce the problem); I need to set it during runtime, between two steps.

Best Regards,
Moritz

-----Original Message-----
From: Constantinescu, Emil M.
Sent: Saturday, 3 November 2018 15:43
To: Huck, Moritz; petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] Solution "jumps" after setting timestep

> [...]
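A sketch of the situation Moritz describes: forcing a new step size between two steps while an adaptor is active. Everything here (names, the mid-run condition) is illustrative, not from the thread; it assumes ts is a fully configured IMEX integrator:

#include <petscts.h>

/* Sketch: advance step by step and override the step size at some
   point during the run, which is the pattern under discussion. */
PetscErrorCode AdvanceWithManualStep(TS ts, PetscReal new_dt, PetscInt nsteps)
{
  PetscInt       i;
  PetscErrorCode ierr;

  for (i = 0; i < nsteps; i++) {
    ierr = TSStep(ts);CHKERRQ(ierr);
    if (i == nsteps/2) {                      /* hypothetical runtime condition */
      ierr = TSSetTimeStep(ts,new_dt);CHKERRQ(ierr);
      /* The basic adaptor may immediately rescale this value; to keep
         it, disable adaptivity (-ts_adapt_type none, or TSGetAdapt()
         followed by TSAdaptSetType(adapt,TSADAPTNONE)). */
    }
  }
  return 0;
}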
From yjwu16 at gmail.com  Sun Nov  4 08:45:08 2018
From: yjwu16 at gmail.com (Yingjie Wu)
Date: Sun, 4 Nov 2018 22:45:08 +0800
Subject: [petsc-users] Some Problems about SNES EX19.c
Message-ID:

Dear PETSc developers:
Hi,

Recently I have become very interested in the ex19 example in SNES, which uses the NGS method to solve the nonlinear equations; this may be the method I need to use in the future. I have some doubts about the program.

1.

82: typedef struct {
83:   PetscScalar u,v,omega,temp;
84: } Field
86: PetscErrorCode FormFunctionLocal(DMDALocalInfo*,Field**,Field**,void*);
...
150: DMDASNESSetFunctionLocal(da,INSERT_VALUES,(PetscErrorCode (*)(DMDALocalInfo*,void*,void*,void*))FormFunctionLocal,&user);

I looked at the PETSc manual page:

For PetscErrorCode (*func)(DMDALocalInfo *info,void *x, void *f, void *ctx),
  info - DMDALocalInfo defining the subdomain to evaluate the residual on
  x - dimensional pointer to state at which to evaluate residual (e.g. PetscScalar *x or **x or ***x)
  f - dimensional pointer to residual, write the residual here (e.g. PetscScalar *f or **f or ***f)
  ctx - optional context passed above

In the function FormFunctionLocal, the second and third parameters should be pointers to PetscScalar, but pointers to Field are used directly. Why can we do that here? Although I know that there are four degrees of freedom in the DM object, how can I ensure that the program correctly corresponds to the variables in Field?

2.

In the NGS subroutine,

530: dfudu = 2.0*(hydhx + hxdhy);

But in the residual function:

526: u   = x[j][i].u;
527: uxx = (2.0*u - x[j][i-1].u - x[j][i+1].u)*hydhx;
528: uyy = (2.0*u - x[j-1][i].u - x[j+1][i].u)*hxdhy;
529: Fu  = uxx + uyy - .5*(x[j+1][i].omega - x[j-1][i].omega)*hx - bjiu;

/* invert the system:
572:    [ dfu/du     0        0        0   ][yu] = [fu]
573:    [   0      dfv/dv     0        0   ][yv]   [fv]
574:    [ dfo/du   dfo/dv   dfo/do     0   ][yo]   [fo]
575:    [ dft/du   dft/dv     0      dft/dt][yt]   [ft]
576:    by simple back-substitution
577: */

It is known that the residual function fu[j][i] is a function of five variables (x[j][i].u, x[j][i-1].u, x[j][i+1].u, x[j-1][i].u, x[j+1][i].u), and the same holds for the analytic Jacobian matrix. But in this program, only the central grid point is used for the partial derivative. Why do we choose to do so? In my understanding, the sub-matrix dfu/du is a five-diagonal matrix, but it is treated as a diagonal matrix in the program. Will this affect the accuracy of the solution?

Thanks for your continued help,
Yingjie

From jed at jedbrown.org  Sun Nov  4 09:31:07 2018
From: jed at jedbrown.org (Jed Brown)
Date: Sun, 04 Nov 2018 08:31:07 -0700
Subject: [petsc-users] Some Problems about SNES EX19.c
In-Reply-To:
References:
Message-ID: <875zxcg8qs.fsf@jedbrown.org>

Yingjie Wu via petsc-users writes:

> [...]
> In the function FormFunctionLocal, the second and third parameters should
> be pointers to PetscScalar, but pointers to Field are used directly.
> Why can we do that here? [...]

DMDA "interlaces" the fields, so this memory layout is guaranteed.
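An illustration of the interlaced layout Jed means — not from ex19 itself, just the equivalence between the typed view and the flat view of the same DMDA array (i and j are the usual local grid indices):

/* With dof=4, a DMDA stores the fields interlaced, so a Field** view
 * and a PetscScalar** view describe the same memory:
 *
 *   Field       **x;   x[j][i].u,    x[j][i].v,    x[j][i].omega, x[j][i].temp
 *   PetscScalar **s;   s[j][4*i+0],  s[j][4*i+1],  s[j][4*i+2],   s[j][4*i+3]
 *
 * e.g. x[j][i].omega == s[j][4*i+2].  Casting the callback from the
 * generic pointer types to one taking Field** is therefore safe,
 * provided the struct lists the fields in the DMDA's dof order. */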
> 2.
>
> In the NGS subroutine,
> [...]
> In my understanding, the sub-matrix dfu/du is a five-diagonal matrix, but
> it is treated as a diagonal matrix in the program. Will this affect the
> accuracy of the solution?

This may be a misunderstanding.  The residual calculates a vector as a nonlinear function of the input.  The Jacobian matrix is not assembled in this example, but can be computed efficiently by finite differencing using coloring.  Each row has 5*4=20 formal nonzeros and computing the matrix by coloring requires somewhere around that number of residual evaluations.

From bsmith at mcs.anl.gov  Sun Nov  4 10:58:23 2018
From: bsmith at mcs.anl.gov (Smith, Barry F.)
Date: Sun, 4 Nov 2018 16:58:23 +0000
Subject: [petsc-users] Some Problems about SNES EX19.c
In-Reply-To:
References:
Message-ID: <3AFA87D5-A78E-453F-B5AC-79162FFE4B44@anl.gov>

> On Nov 4, 2018, at 8:45 AM, Yingjie Wu via petsc-users wrote:
>
> [...]
> In my understanding, the sub-matrix dfu/du is a five-diagonal matrix, but it is treated as a diagonal matrix in the program. Will this affect the accuracy of the solution?
   NonlinearGS() does iterations of point-block nonlinear Gauss-Seidel (which is used as the smoother for nonlinear multigrid). Thus at each grid point it does Newton's method (this is the inner loop: for (l = 0; l < max_its && !ptconverged; l++) {). In nonlinear Gauss-Seidel smoothing only the diagonal block of the Jacobian is needed; the off-diagonal blocks are never needed.

   Barry

> Thanks for your continued help,
> Yingjie

From tempohoper at gmail.com  Mon Nov  5 03:46:32 2018
From: tempohoper at gmail.com (Sal Am)
Date: Mon, 5 Nov 2018 09:46:32 +0000
Subject: [petsc-users] Vec, Mat and binary files
Message-ID:

Hi,

I am trying to solve a complex system Ax=b. The vector b and the "matrix" A are both binary files NOT created by PETSc, so I keep getting error messages that they are not in the correct format when I read the files with PetscViewerBinaryOpen; after some digging it seems that one cannot just read a binary file that was created by other software. How would I go about solving this problem?

More info and trials: the "matrix" A consists of two files, one that contains row/column index numbers and one that contains the non-zero values. So what I would have to do is multiply the last term in a+b with PETSC_i to get a real + imaginary vector A.

The vector b is in binary, so what I have done so far (not sure if it works) is:

std::ifstream input("Vector_b.bin", std::ios::binary);
while (input.read(reinterpret_cast<char*>(&v), sizeof(float)))
    ierr = VecSetValues(u,1,&iglobal,&v,INSERT_VALUES);CHKERRQ(ierr);

where v is a PetscScalar.

Once I am able to read both matrices I think I can figure out the solvers to solve the system.

All the best,
S
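A sketch of the pattern Sal Am is after, in C rather than the C++ stream above: read raw (non-PETSc) binary floats and insert them into a Vec, then assemble. The file name matches his snippet; the storage assumptions (consecutive native-endian floats, real part only) are mine, not from the thread:

#include <stdio.h>
#include <petscvec.h>

/* Sketch: fill a Vec from a raw float file written by other software
   (no PETSc binary header).  Each rank seeks to its own owned range;
   complex values would additionally combine a second file scaled by
   PETSC_i, as described in the thread. */
PetscErrorCode ReadRawVector(const char *path, Vec u)
{
  FILE          *fp;
  PetscInt       i, lo, hi;
  PetscErrorCode ierr;

  fp = fopen(path,"rb");
  if (!fp) SETERRQ(PETSC_COMM_SELF,PETSC_ERR_FILE_OPEN,"cannot open input file");
  ierr = VecGetOwnershipRange(u,&lo,&hi);CHKERRQ(ierr);
  fseek(fp,(long)lo*(long)sizeof(float),SEEK_SET);  /* skip other ranks' entries */
  for (i = lo; i < hi; i++) {
    float       f;
    PetscScalar val;
    if (fread(&f,sizeof(float),1,fp) != 1) break;
    val  = (PetscScalar)f;
    ierr = VecSetValues(u,1,&i,&val,INSERT_VALUES);CHKERRQ(ierr);
  }
  fclose(fp);
  ierr = VecAssemblyBegin(u);CHKERRQ(ierr);
  ierr = VecAssemblyEnd(u);CHKERRQ(ierr);
  return 0;
}

For the matrix, the analogous loop over the (row, column, value) triplet files feeds MatSetValues before MatAssemblyBegin/End.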
From kandanovian at gmail.com  Mon Nov  5 07:29:00 2018
From: kandanovian at gmail.com (Tim Steinhoff)
Date: Mon, 5 Nov 2018 14:29:00 +0100
Subject: [petsc-users] PetscViewerFileSetName() wrong error code in Fortran
Message-ID:

Dear PETSc Team,

I am having the issue that petscviewerfilesetname_() and petscviewerfilegetname_() in src\sys\classes\viewer\impls\ascii\ftn-custom\zfilevf.c (Fortran) do not return proper error codes, as those are not checked immediately after the actual call and will be overwritten afterwards within the macro FREECHAR() or by other subsequent calls, respectively. It would be very helpful if you could fix that.

Thanks and kind regards,

Volker

From inz.karolewandowski at gmail.com  Mon Nov  5 07:40:14 2018
From: inz.karolewandowski at gmail.com (Karol Lewandowski)
Date: Mon, 5 Nov 2018 13:40:14 +0000
Subject: [petsc-users] Force SNES diverge
Message-ID: <98E27DD4-6CAA-4A97-9668-7A796957B9EB@gmail.com>

Hi,

I am solving a highly nonlinear problem using the SNES solver. Under certain conditions during the iterations I already know that the step will diverge (in the next few iterations). Is there a clever way to tell SNES to diverge immediately so I can start a new step with a reduced step size?

Thank you,

Karol
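One mechanism that fits what Karol asks for is SNESSetFunctionDomainError(): calling it inside the residual callback marks the current evaluation as infeasible and makes the solve stop with a divergence reason instead of iterating on. A minimal sketch (the failure test is a hypothetical placeholder):

#include <petscsnes.h>

/* Sketch: abort the nonlinear solve early from inside the residual
   callback once the step is already known to be hopeless. */
PetscErrorCode FormFunction(SNES snes, Vec x, Vec f, void *ctx)
{
  PetscBool      hopeless = PETSC_FALSE;
  PetscErrorCode ierr;

  /* ... evaluate f(x) and set `hopeless` from your own divergence
     criterion (placeholder, not from the thread) ... */
  if (hopeless) {
    ierr = SNESSetFunctionDomainError(snes);CHKERRQ(ierr);
    /* SNES stops with SNES_DIVERGED_FUNCTION_DOMAIN */
  }
  return 0;
}

After SNESSolve() returns, SNESGetConvergedReason() reports the failure, so the caller can cut the step size and retry.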
        Factored matrix follows:
          Mat Object: 1 MPI processes
            type: seqaij
            rows=961, cols=961
            package used to perform factorization: petsc
            total: nonzeros=4625, allocated nonzeros=4625
            total number of mallocs used during MatSetValues calls =0
              not using I-node routines
    linear system matrix followed by preconditioner matrix:
    Mat Object: 1 MPI processes
      type: mffd
      rows=961, cols=961
      Matrix-free approximation:
        err=1.49012e-08 (relative error in function evaluation)
        The compute h routine has not yet been set
    Mat Object: 1 MPI processes
      type: seqaij
      rows=961, cols=961
      total: nonzeros=4625, allocated nonzeros=4625
      total number of mallocs used during MatSetValues calls =0
        not using I-node routines

It seems that there are entries on the diagonal that are not filled, but I
don't understand what went wrong. Part of my program's code is below; the
complete code is attached. I have tested that when I assign the identity to
the entire Jacobian matrix directly (no sub-matrices, operating on the global
preconditioning matrix), the program runs normally. However, since the
problems developed later may be more complex and involve multiple physical
fields, it would be easier to work with sub-matrices.

......
/* set DMComposite */
ierr = DMDACreate2d(PETSC_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,20,24,PETSC_DECIDE,PETSC_DECIDE,1,1,0,0,&user.da1);CHKERRQ(ierr);
ierr = DMSetFromOptions(user.da1);CHKERRQ(ierr);
ierr = DMSetUp(user.da1);CHKERRQ(ierr);
ierr = DMCompositeAddDM(user.packer,user.da1);CHKERRQ(ierr);
ierr = DMDACreate2d(PETSC_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,20,24,PETSC_DECIDE,PETSC_DECIDE,1,1,0,0,&user.da2);CHKERRQ(ierr);
ierr = DMSetFromOptions(user.da2);CHKERRQ(ierr);
ierr = DMSetUp(user.da2);CHKERRQ(ierr);
ierr = DMCompositeAddDM(user.packer,user.da2);CHKERRQ(ierr);
ierr = DMRedundantCreate(PETSC_COMM_WORLD,0,1,&user.red1);CHKERRQ(ierr);
ierr = DMCompositeAddDM(user.packer,user.red1);CHKERRQ(ierr);
......
/* in FormJacobian(SNES snes,Vec U,Mat J,Mat B,void *ctx) */
ierr = DMCompositeGetLocalVectors(user->packer,&vphi1,&vphi2,&vlambda);CHKERRQ(ierr);
ierr = DMCompositeScatter(user->packer,U,vphi1,vphi2,vlambda);CHKERRQ(ierr);

ierr = VecGetArray(vlambda,&lambda);CHKERRQ(ierr);
ierr = DMDAVecGetArray(user->da1,vphi1,&phi1);CHKERRQ(ierr);
ierr = DMDAVecGetArray(user->da2,vphi2,&phi2);CHKERRQ(ierr);

ierr = DMCompositeGetLocalISs(user->packer,&is);CHKERRQ(ierr);
ierr = MatGetLocalSubMatrix(B,is[0],is[0],&B11);CHKERRQ(ierr);
ierr = MatGetLocalSubMatrix(B,is[1],is[1],&B22);CHKERRQ(ierr);

for (j=ys; j<ys+ym; j++) {
  for (i=xs; i<xs+xm; i++) {
    row.j = j; row.i = i;
    unit = 1.0;
    ierr = MatSetValuesStencil(B11,1,&row,1,&row,&unit,INSERT_VALUES);CHKERRQ(ierr);
  }
}

for (j=ys; j<ys+ym; j++) {
  for (i=xs; i<xs+xm; i++) {
    row.j = j; row.i = i;
    unit = 1.0;
    ierr = MatSetValuesStencil(B22,1,&row,1,&row,&unit,INSERT_VALUES);CHKERRQ(ierr);
  }
}

ierr = MatRestoreLocalSubMatrix(B,is[0],is[0],&B11);CHKERRQ(ierr);
ierr = MatRestoreLocalSubMatrix(B,is[1],is[1],&B22);CHKERRQ(ierr);

unit = 1.0;
row1 = 960;//last row global index
ierr = MatSetValues(B,1,&row1,1,&row1,&unit,INSERT_VALUES);CHKERRQ(ierr);

ierr = VecRestoreArray(vlambda,&lambda);CHKERRQ(ierr);
ierr = DMDAVecRestoreArray(user->da1,vphi1,&phi1);CHKERRQ(ierr);
ierr = DMDAVecRestoreArray(user->da2,vphi2,&phi2);CHKERRQ(ierr);
......

Thanks,
Yingjie
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
-------------- next part --------------
static const char help[] = "Solves PDE optimization problem using full-space method, treats state and adjoint variables separately.\n\n";

/*T
   requires: !single
T*/

#include <petscdm.h>
#include <petscdmda.h>
#include <petscdmredundant.h>
#include <petscdmcomposite.h>
#include <petscviewer.h>
#include <petscsnes.h>

/*
   w      - design variables (what we change to get an optimal solution)
   u      - state variables (i.e. the PDE solution)
   lambda - the Lagrange multipliers

   U = (w u lambda)

   fu, fw, flambda contain the gradient of L(w,u,lambda)

   FU = (fw fu flambda)

   In this example the PDE is
                        Uxx = 2
                        u(0) = w(0), thus this is the free parameter
                        u(1) = 0
   the function we wish to minimize is
                        \integral u^{2}

   The exact solution for u is given by u(x) = x*x - 1.25*x + .25

   Use the usual centered finite differences.
   Note we treat the problem as non-linear, though it happens to be linear.

   See ex22.c for the same code, but that interlaces the u and the lambda.
*/

typedef struct {
  DM red1,da1,da2;
  DM packer;
} UserCtx;

extern PetscErrorCode FormInitialGuess(UserCtx*,Vec,Mat,Mat);
extern PetscErrorCode FormFunction(SNES,Vec,Vec,void*);
extern PetscErrorCode FormJacobian(SNES,Vec,Mat,Mat,void*);
extern PetscErrorCode Monitor(SNES,PetscInt,PetscReal,void*);

int main(int argc,char **argv)
{
  PetscErrorCode ierr;
  PetscInt       its;
  Vec            U,FU,vlambda,vphi1,vphi2;
  Mat            A1,A2;  /* matrices that store the initial guess */
  Mat            B;
  SNES           snes;
  UserCtx        user;
  PetscViewer    viewer;
  char           file[PETSC_MAX_PATH_LEN];  /* input file name */
  PetscBool      flg;

  ierr = PetscInitialize(&argc,&argv,(char*)0,help);if (ierr) return ierr;

  /* Read the file and store them in the matrix */
  ierr = PetscOptionsGetString(NULL,NULL,"-f",file,PETSC_MAX_PATH_LEN,&flg);CHKERRQ(ierr);
  if (!flg) {
    SETERRQ(PETSC_COMM_WORLD,1,"Must indicate binary file with the -f option");
  }
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,file,FILE_MODE_READ,&viewer);CHKERRQ(ierr);
  ierr = MatCreate(PETSC_COMM_WORLD,&A1);CHKERRQ(ierr);
  ierr = MatSetFromOptions(A1);CHKERRQ(ierr);
  ierr = MatCreate(PETSC_COMM_WORLD,&A2);CHKERRQ(ierr);
  ierr = MatSetFromOptions(A2);CHKERRQ(ierr);
  ierr = MatLoad(A1,viewer);CHKERRQ(ierr);
  ierr = MatLoad(A2,viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

  /* Create a global vector that includes a single redundant array and two da arrays */
  ierr = DMCompositeCreate(PETSC_COMM_WORLD,&user.packer);CHKERRQ(ierr);
  /* Set neutron flux DM */
  ierr = DMDACreate2d(PETSC_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,20,24,PETSC_DECIDE,PETSC_DECIDE,1,1,0,0,&user.da1);CHKERRQ(ierr);
  ierr = DMSetFromOptions(user.da1);CHKERRQ(ierr);
  ierr = DMSetUp(user.da1);CHKERRQ(ierr);
  ierr = DMCompositeAddDM(user.packer,user.da1);CHKERRQ(ierr);
  ierr = DMDACreate2d(PETSC_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,20,24,PETSC_DECIDE,PETSC_DECIDE,1,1,0,0,&user.da2);CHKERRQ(ierr);
  ierr = DMSetFromOptions(user.da2);CHKERRQ(ierr);
  ierr = DMSetUp(user.da2);CHKERRQ(ierr);
  ierr = DMCompositeAddDM(user.packer,user.da2);CHKERRQ(ierr);
  ierr = DMRedundantCreate(PETSC_COMM_WORLD,0,1,&user.red1);CHKERRQ(ierr);
  ierr = DMCompositeAddDM(user.packer,user.red1);CHKERRQ(ierr);
  ierr = DMDASetFieldName(user.da1,0,"phi1");CHKERRQ(ierr);
  ierr = DMDASetFieldName(user.da2,0,"phi2");CHKERRQ(ierr);
  //ierr = DMDASetFieldName(user.red1,0,"lambda");CHKERRQ(ierr);
  ierr = DMCreateGlobalVector(user.packer,&U);CHKERRQ(ierr);
  ierr = VecDuplicate(U,&FU);CHKERRQ(ierr);

  /* Form initial guess lambda, phi1, phi2 */
  ierr = FormInitialGuess(&user,U,A1,A2);CHKERRQ(ierr);

  /* create nonlinear solver */
  ierr = SNESCreate(PETSC_COMM_WORLD,&snes);CHKERRQ(ierr);
  ierr = DMCreateMatrix(user.packer,&B);CHKERRQ(ierr);
  /*
     This example does not correctly allocate off-diagonal blocks.
     These options allow new nonzeros (slow).
  */
  ierr = MatSetOption(B,MAT_NEW_NONZERO_LOCATION_ERR,PETSC_FALSE);CHKERRQ(ierr);
  ierr = MatSetOption(B,MAT_NEW_NONZERO_ALLOCATION_ERR,PETSC_FALSE);CHKERRQ(ierr);
  ierr = SNESSetFunction(snes,FU,FormFunction,&user);CHKERRQ(ierr);
  ierr = SNESSetJacobian(snes,NULL,B,FormJacobian,&user);CHKERRQ(ierr);
  ierr = SNESSetFromOptions(snes);CHKERRQ(ierr);
  ierr = SNESMonitorSet(snes,Monitor,&user,0);CHKERRQ(ierr);
  ierr = SNESSetDM(snes,user.packer);CHKERRQ(ierr);
  ierr = SNESSolve(snes,NULL,U);CHKERRQ(ierr);
  ierr = SNESGetIterationNumber(snes,&its);CHKERRQ(ierr);

  /* Output the data */
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"wu_testtwogroups",FILE_MODE_WRITE,&viewer);CHKERRQ(ierr);
  ierr = PetscViewerPushFormat(viewer,PETSC_VIEWER_BINARY_MATLAB);CHKERRQ(ierr);
  ierr = DMCompositeGetLocalVectors(user.packer,&vlambda,&vphi1,&vphi2);CHKERRQ(ierr);
  ierr = DMCompositeScatter(user.packer,U,vlambda,vphi1,vphi2);CHKERRQ(ierr);
  ierr = VecView(vlambda,viewer);CHKERRQ(ierr);
  ierr = VecView(vphi1,viewer);CHKERRQ(ierr);
  ierr = VecView(vphi2,viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

  ierr = SNESDestroy(&snes);CHKERRQ(ierr);
  ierr = DMDestroy(&user.red1);CHKERRQ(ierr);
  ierr = DMDestroy(&user.da1);CHKERRQ(ierr);
  ierr = DMDestroy(&user.da2);CHKERRQ(ierr);
  ierr = DMDestroy(&user.packer);CHKERRQ(ierr);
  ierr = VecDestroy(&U);CHKERRQ(ierr);
  ierr = VecDestroy(&FU);CHKERRQ(ierr);
  ierr = VecDestroy(&vlambda);CHKERRQ(ierr);
  ierr = VecDestroy(&vphi1);CHKERRQ(ierr);
  ierr = VecDestroy(&vphi2);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

PetscErrorCode FormInitialGuess(UserCtx *user,Vec U,Mat A1,Mat A2)
{
  PetscErrorCode ierr;
  PetscInt       xs,xm,i,N;
  PetscInt       ys,ym,j,M;  /* y direction index */
  PetscInt       id;
  PetscScalar    *lambda,**phi1,**phi2;
  PetscScalar    *initial_phi1,*initial_phi2;
  Vec            vlambda,vphi1,vphi2;

  PetscFunctionBeginUser;
  ierr = MatSeqAIJGetArray(A1,&initial_phi1);CHKERRQ(ierr);
  ierr = MatSeqAIJGetArray(A2,&initial_phi2);CHKERRQ(ierr);
  ierr = DMCompositeGetLocalVectors(user->packer,&vphi1,&vphi2,&vlambda);CHKERRQ(ierr);
  ierr = DMCompositeScatter(user->packer,U,vphi1,vphi2,vlambda);CHKERRQ(ierr);
  ierr = DMDAGetCorners(user->da1,&xs,&ys,NULL,&xm,&ym,NULL);CHKERRQ(ierr);//get two direction index
  ierr = DMDAGetInfo(user->da1,0,&N,&M,0,0,0,0,0,0,0,0,0,0);CHKERRQ(ierr);
  ierr = VecGetArray(vlambda,&lambda);CHKERRQ(ierr);
  ierr = DMDAVecGetArray(user->da1,vphi1,&phi1);CHKERRQ(ierr);
  ierr = DMDAVecGetArray(user->da2,vphi2,&phi2);CHKERRQ(ierr);
  if (xs == 0 && ys == 0) lambda[0] = 1.0;
  for ( j=ys ; j < ys+ym ; j++ ){
    for ( i=xs ; i < xs+xm ; i++ ){
      id = j*xm + i ;
      phi1[j][i] = initial_phi1[id] ;
      phi2[j][i] = initial_phi2[id] ;
    }
  }
  ierr = VecRestoreArray(vlambda,&lambda);CHKERRQ(ierr);
  ierr = DMDAVecRestoreArray(user->da1,vphi1,&phi1);CHKERRQ(ierr);
  ierr = DMDAVecRestoreArray(user->da2,vphi2,&phi2);CHKERRQ(ierr);
  ierr = DMCompositeGather(user->packer,INSERT_VALUES,U,vphi1,vphi2,vlambda);CHKERRQ(ierr);
  ierr = DMCompositeRestoreLocalVectors(user->packer,&vphi1,&vphi2,&vlambda);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

/* Evaluates FU = Gradient(L(w,u,lambda)) */
PetscErrorCode FormFunction(SNES snes,Vec U,Vec FU,void *dummy)
{
  UserCtx        *user = (UserCtx*)dummy;
  PetscErrorCode ierr;
  PetscInt       xs,xm,i,N;
  PetscInt       ys,ym,j,M;  /* y direction index */
  PetscInt       xints,xinte,yints,yinte;
  PetscScalar    *lambda,**phi1,**phi2,*flambda,**fphi1,**fphi2;
  PetscScalar    dx,dy;
  PetscScalar    dotphi;  //wu-define
  PetscScalar    rightside,e,b,c,dd,a;
  PetscReal      phi1boundary,phi2boundary;
  Vec            vlambda,vphi1,vphi2,vflambda,vfphi1,vfphi2;
  PetscFunctionBeginUser;
  ierr = DMCompositeGetLocalVectors(user->packer,&vphi1,&vphi2,&vlambda);CHKERRQ(ierr);
  ierr = DMCompositeGetLocalVectors(user->packer,&vfphi1,&vfphi2,&vflambda);CHKERRQ(ierr);
  ierr = DMCompositeScatter(user->packer,U,vphi1,vphi2,vlambda);CHKERRQ(ierr);
  ierr = DMDAGetCorners(user->da1,&xs,&ys,NULL,&xm,&ym,NULL);CHKERRQ(ierr);//get two direction index
  ierr = DMDAGetInfo(user->da1,0,&N,&M,0,0,0,0,0,0,0,0,0,0);CHKERRQ(ierr);
  ierr = VecGetArray(vlambda,&lambda);CHKERRQ(ierr);
  ierr = VecGetArray(vflambda,&flambda);CHKERRQ(ierr);
  ierr = DMDAVecGetArray(user->da1,vphi1,&phi1);CHKERRQ(ierr);
  ierr = DMDAVecGetArray(user->da1,vfphi1,&fphi1);CHKERRQ(ierr);
  ierr = DMDAVecGetArray(user->da2,vphi2,&phi2);CHKERRQ(ierr);
  ierr = DMDAVecGetArray(user->da2,vfphi2,&fphi2);CHKERRQ(ierr);

  dx = 5.0; dy = 5.0;
  phi1boundary = 0.0;
  phi2boundary = 0.0;

  /* residual f_lambda */
  if (xs == 0 && ys == 0 ) { /* only first processor computes this */
    dotphi = 0.0;
    for ( j=ys ; j < ys+ym ; j++ ){
      for ( i=xs ; i < xs+xm ; i++ ){
        dotphi += phi1[j][i]*phi1[j][i] + phi2[j][i]*phi2[j][i] ;
      }
    }
    flambda[0] = dotphi - 1.0 ;
  }

  /* bottom boundary */
  if (ys == 0) {
    j     = 0;
    yints = ys + 1;
    /* bottom edge */
    for (i=xs; ida1,vphi1,&phi1);CHKERRQ(ierr);
  ierr = DMDAVecRestoreArray(user->da1,vfphi1,&fphi1);CHKERRQ(ierr);
  ierr = DMDAVecRestoreArray(user->da2,vphi2,&phi2);CHKERRQ(ierr);
  ierr = DMDAVecRestoreArray(user->da2,vfphi2,&fphi2);CHKERRQ(ierr);
  ierr = DMCompositeGather(user->packer,INSERT_VALUES,FU,vfphi1,vfphi2,vflambda);CHKERRQ(ierr);
  ierr = DMCompositeRestoreLocalVectors(user->packer,&vphi1,&vphi2,&vlambda);CHKERRQ(ierr);
  ierr = DMCompositeRestoreLocalVectors(user->packer,&vfphi1,&vfphi2,&vflambda);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

PetscErrorCode FormJacobian(SNES snes,Vec U,Mat J,Mat B,void *ctx)
{
  UserCtx        *user = (UserCtx*)ctx;
  PetscErrorCode ierr;
  PetscInt       xs,xm,i,N;
  PetscInt       ys,ym,j,M;
  PetscInt       row1,col1;
  MatStencil     col[5],row;
  Mat            Bll,B11,B22,B12,B21;
  IS             *is;
  PetscScalar    *lambda,**phi1,**phi2;
  PetscScalar    unit;
  Vec            vlambda,vphi1,vphi2;

  PetscFunctionBeginUser;
  // unit = 1.0;
  // row1 = 0;
  // for (int i = 0; i < 961; i++)
  // {
  //   ierr = MatSetValues(B,1,&row1,1,&row1,&unit,INSERT_VALUES);CHKERRQ(ierr);
  //   row1++;
  // }
  ierr = DMDAGetCorners(user->da1,&xs,&ys,NULL,&xm,&ym,NULL);CHKERRQ(ierr);//get two direction index
  ierr = DMDAGetInfo(user->da1,0,&N,&M,0,0,0,0,0,0,0,0,0,0);CHKERRQ(ierr);
  ierr = DMCompositeGetLocalVectors(user->packer,&vphi1,&vphi2,&vlambda);CHKERRQ(ierr);
  ierr = DMCompositeScatter(user->packer,U,vphi1,vphi2,vlambda);CHKERRQ(ierr);
  ierr = VecGetArray(vlambda,&lambda);CHKERRQ(ierr);
  ierr = DMDAVecGetArray(user->da1,vphi1,&phi1);CHKERRQ(ierr);
  ierr = DMDAVecGetArray(user->da2,vphi2,&phi2);CHKERRQ(ierr);
  ierr = DMCompositeGetLocalISs(user->packer,&is);CHKERRQ(ierr);
  ierr = MatGetLocalSubMatrix(B,is[0],is[0],&B11);CHKERRQ(ierr);
  ierr = MatGetLocalSubMatrix(B,is[1],is[1],&B22);CHKERRQ(ierr);
  // ierr = MatGetLocalSubMatrix(B,is[2],is[2],&Bll);CHKERRQ(ierr);
  for (j=ys; j<ys+ym; j++) {
    for (i=xs; i<xs+xm; i++) {
      row.j = j; row.i = i;
      unit = 1.0;
      ierr = MatSetValuesStencil(B11,1,&row,1,&row,&unit,INSERT_VALUES);CHKERRQ(ierr);
    }
  }
  for (j=ys; j<ys+ym; j++) {
    for (i=xs; i<xs+xm; i++) {
      row.j = j; row.i = i;
      unit = 1.0;
      ierr = MatSetValuesStencil(B22,1,&row,1,&row,&unit,INSERT_VALUES);CHKERRQ(ierr);
    }
  }
  ierr = MatRestoreLocalSubMatrix(B,is[0],is[0],&B11);CHKERRQ(ierr);
  ierr = MatRestoreLocalSubMatrix(B,is[1],is[1],&B22);CHKERRQ(ierr);
  unit = 1.0;
  row1 = 960;//last row global index
  ierr = MatSetValues(B,1,&row1,1,&row1,&unit,INSERT_VALUES);CHKERRQ(ierr);
  ierr = VecRestoreArray(vlambda,&lambda);CHKERRQ(ierr);
  ierr = DMDAVecRestoreArray(user->da1,vphi1,&phi1);CHKERRQ(ierr);
  ierr = DMDAVecRestoreArray(user->da2,vphi2,&phi2);CHKERRQ(ierr);
  ierr = ISDestroy(&is[0]);CHKERRQ(ierr);
  ierr = ISDestroy(&is[1]);CHKERRQ(ierr);
  ierr = ISDestroy(&is[2]);CHKERRQ(ierr);
  ierr = PetscFree(is);CHKERRQ(ierr);
  ierr = DMCompositeRestoreLocalVectors(user->packer,&vlambda,&vphi1,&vphi2);CHKERRQ(ierr);
  ierr = MatAssemblyBegin(B,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd (B,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  if (J != B) {
    ierr =
 MatAssemblyBegin(J,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd (J,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  }
  PetscFunctionReturn(0);
}

PetscErrorCode Monitor(SNES snes,PetscInt its,PetscReal fnorm,void *dummy)
{
  UserCtx        *user = (UserCtx*)dummy;
  PetscErrorCode ierr;
  Vec            vlambda,vphi1,vphi2,U;
  PetscReal      *lambda;
  PetscReal      keff;

  PetscFunctionBeginUser;
  ierr = PetscPrintf(PETSC_COMM_WORLD,"iter = %D, SNES Function norm %g\n",its,(double)fnorm);CHKERRQ(ierr);
  ierr = SNESGetSolution(snes,&U);CHKERRQ(ierr);
  ierr = DMCompositeGetLocalVectors(user->packer,&vphi1,&vphi2,&vlambda);CHKERRQ(ierr);
  ierr = DMCompositeScatter(user->packer,U,vphi1,vphi2,vlambda);CHKERRQ(ierr);
  ierr = VecGetArray(vlambda,&lambda);CHKERRQ(ierr);
  keff = 1.0 / lambda[0] ;
  ierr = VecRestoreArray(vlambda,&lambda);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD,"iter = %D, Keff ======= %g\n",its,(double)keff);CHKERRQ(ierr);
  //ierr = VecView(vphi1,PETSC_VIEWER_DRAW_WORLD);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}
-------------- next part --------------
A non-text attachment was scrubbed...
Name: log
Type: application/octet-stream
Size: 2556 bytes
Desc: not available
URL:
-------------- next part --------------
A non-text attachment was scrubbed...
Name: wu-readtwogroups
Type: application/octet-stream
Size: 12000 bytes
Desc: not available
URL:

From mfadams at lbl.gov  Mon Nov  5 09:22:29 2018
From: mfadams at lbl.gov (Mark Adams)
Date: Mon, 5 Nov 2018 10:22:29 -0500
Subject: [petsc-users] Problems about Assemble DMComposite Precondition Matrix
In-Reply-To: References: Message-ID:

DMComposite is not very mature (the last time I checked, and I don't know of
anyone having worked on it recently), and it is probably not what you want
anyway. FieldSplit is most likely what you want.

What are your equations and discretization? E.g., Stokes with cell-centered
pressure? There are probably examples that are close to what you want, and it
would not be hard to move your code over.

Mark

On Mon, Nov 5, 2018 at 10:00 AM Yingjie Wu via petsc-users wrote:

> [...]
From yjwu16 at gmail.com  Mon Nov  5 09:37:01 2018
From: yjwu16 at gmail.com (Yingjie Wu)
Date: Mon, 5 Nov 2018 23:37:01 +0800
Subject: [petsc-users] Problems about Assemble DMComposite Precondition Matrix
In-Reply-To: References: Message-ID:

Thank you very much for your reply.
My equation is a neutron diffusion equation with an eigenvalue, which is why I
use DMComposite: there is a single non-physical variable, the eigenvalue.
I am not very familiar with FieldSplit; I will read up on it first.
It does not seem to be a problem with DMComposite, because the error says that
diagonal entries of the preconditioning matrix are zero.
My requirement is to split the preconditioning matrix into sub-matrices,
because I have neutrons of multiple energy groups (each group is a physical
field). I want to fill only the diagonal block matrices, preferably using
MatSetValuesStencil (which simplifies assembling the five-diagonal matrices).

Thanks,
Yingjie

On Mon, Nov 5, 2018 at 11:22 PM, Mark Adams wrote:
> [...]
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From mfadams at lbl.gov  Mon Nov  5 09:59:18 2018
From: mfadams at lbl.gov (Mark Adams)
Date: Mon, 5 Nov 2018 10:59:18 -0500
Subject: [petsc-users] Problems about Assemble DMComposite Precondition Matrix
In-Reply-To: References: Message-ID:

On Mon, Nov 5, 2018 at 10:37 AM Yingjie Wu wrote:

> My equation is a neutron diffusion equation with an eigenvalue, which is why
> I use DMComposite: there is a single non-physical variable, the eigenvalue.

OK, DMComposite might be your best choice.

> [...] My requirement is to split the preconditioning matrix into
> sub-matrices, because I have neutrons of multiple energy groups (each group
> is a physical field). I want to fill only the diagonal block matrices,
> preferably using MatSetValuesStencil (which simplifies assembling the
> five-diagonal matrices).
So you are trying to create the identity matrix with MatSetValuesStencil but
you are getting zeros on the diagonal. I would debug this by looking at the
matrix in matlab (or Octave). You can use code like this:

  if (PETSC_TRUE) {
    PetscViewer viewer;
    ierr = PetscViewerASCIIOpen(comm, "Amat.m", &viewer);CHKERRQ(ierr);
    ierr = PetscViewerPushFormat(viewer, PETSC_VIEWER_ASCII_MATLAB);CHKERRQ(ierr);
    ierr = MatView(Amat,viewer);CHKERRQ(ierr);
    ierr = PetscViewerPopFormat(viewer);CHKERRQ(ierr);
    ierr = PetscViewerDestroy(&viewer);
  }

Test on a small, serial problem and see where your data is actually going, if
not on the diagonal.

> [...]
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From jed at jedbrown.org  Mon Nov  5 10:02:54 2018
From: jed at jedbrown.org (Jed Brown)
Date: Mon, 05 Nov 2018 09:02:54 -0700
Subject: [petsc-users] Problems about Assemble DMComposite Precondition Matrix
In-Reply-To: References: Message-ID: <87woprpl5d.fsf@jedbrown.org>

Yingjie Wu via petsc-users writes:

> [...] It does not seem to be a problem with DMComposite, because the error
> says that diagonal entries of the preconditioning matrix are zero.

Which entries are zero?  You can view the matrix to find out if it has
the structure you intend.  Make sure the index sets is[] are what you
intend.

> My requirement is to split the preconditioning matrix into sub-matrices,
> because I have neutrons of multiple energy groups (each group is a physical
> field). I want to fill only the diagonal block matrices, preferably using
> MatSetValuesStencil (which simplifies assembling the five-diagonal matrices).
Splitting all of those apart with DMComposite is clumsy and may deliver
poor memory performance.  Consider DMDASetBlockFills if you want to
allocate a matrix that has sparsity within blocks.
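For example, with both groups held as two dof on a single DMDA, a sketch
(the fill pattern here is made up; adapt it to your actual coupling):

    PetscInt dfill[4] = {1, 1,   /* couple the two fields at a grid point */
                         1, 1};
    PetscInt ofill[4] = {1, 0,   /* only field-to-itself coupling to neighbors */
                         0, 1};
    ierr = DMDASetBlockFills(da,dfill,ofill);CHKERRQ(ierr);

DMCreateMatrix() on that DMDA then allocates only those entries, and you
can still fill the diagonal blocks with MatSetValuesStencil().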
From jed at jedbrown.org  Mon Nov  5 10:48:57 2018
From: jed at jedbrown.org (Jed Brown)
Date: Mon, 05 Nov 2018 09:48:57 -0600
Subject: [petsc-users] Vec, Mat and binaryfiles.
In-Reply-To: References: Message-ID: <87r2fzpj0m.fsf@jedbrown.org>

Sal Am via petsc-users writes:

> I am trying to solve an Ax=b complex system. The vector b and "matrix" A are
> both binary files NOT created by PETSc, so I keep getting error messages
> that they are not in the correct format when I read the files with
> PetscViewerBinaryOpen [...]

Yes, of course the formats would have to match.  I would recommend
writing the files in an existing format such as PETSc's binary format.
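For example, a one-time serial converter along these lines (an untested
sketch; the vector length n and the output file name are placeholders you
would fill in, and it assumes the raw file is a plain stream of floats in
machine byte order):

    #include <petscvec.h>
    #include <stdio.h>

    int main(int argc,char **argv)
    {
      PetscErrorCode ierr;
      Vec            b;
      PetscViewer    viewer;
      FILE           *fp;
      float          vraw;
      PetscInt       i = 0,n = 0;   /* set n to the number of entries in Vector_b.bin */

      ierr = PetscInitialize(&argc,&argv,NULL,NULL);if (ierr) return ierr;
      ierr = VecCreateSeq(PETSC_COMM_SELF,n,&b);CHKERRQ(ierr);
      fp   = fopen("Vector_b.bin","rb");
      while (fread(&vraw,sizeof(float),1,fp) == 1) {   /* raw floats, as in your loop */
        ierr = VecSetValue(b,i++,(PetscScalar)vraw,INSERT_VALUES);CHKERRQ(ierr);
      }
      fclose(fp);
      ierr = VecAssemblyBegin(b);CHKERRQ(ierr);
      ierr = VecAssemblyEnd(b);CHKERRQ(ierr);
      /* write PETSc binary; VecLoad() can later read this, in parallel, in your solver */
      ierr = PetscViewerBinaryOpen(PETSC_COMM_SELF,"b.petsc",FILE_MODE_WRITE,&viewer);CHKERRQ(ierr);
      ierr = VecView(b,viewer);CHKERRQ(ierr);
      ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
      ierr = VecDestroy(&b);CHKERRQ(ierr);
      ierr = PetscFinalize();
      return ierr;
    }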
While the method you describe can be made to work, it will be more work
to make it parallel.

> How would I go on to solve this problem?
> [...]

From bsmith at mcs.anl.gov  Mon Nov  5 10:49:09 2018
From: bsmith at mcs.anl.gov (Smith, Barry F.)
Date: Mon, 5 Nov 2018 16:49:09 +0000
Subject: [petsc-users] PetscViewerFileSetName() wrong error code in Fortran
In-Reply-To: References: Message-ID:

   Fixed in the branch barry/fix-fortran-petscfileviewersetname; it will soon
be merged to maint and included in the next patch release of PETSc.

   Thanks for the report,

   Barry

> On Nov 5, 2018, at 7:29 AM, Tim Steinhoff via petsc-users wrote:
> [...]

From t.appel17 at imperial.ac.uk  Mon Nov  5 11:49:22 2018
From: t.appel17 at imperial.ac.uk (Thibaut Appel)
Date: Mon, 5 Nov 2018 17:49:22 +0000
Subject: [petsc-users] DIVERGED_NANORING with PC GAMG
In-Reply-To: References: Message-ID: <16cb9915-2b01-00ae-5060-966c0b4b48d3@imperial.ac.uk>

Hi Mark,

Yes, it doesn't seem to be usable. Unfortunately we're aiming to do 3D, so
direct solvers are not a viable solution, PETSc's ILU is not parallel, and we
can't use HYPRE (complex arithmetic).
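Out of curiosity, and this is an assumption on my part about what the fallback
would look like: would it then be domain-decomposed ILU through block Jacobi,
e.g. '-ksp_type fgmres -pc_type bjacobi -sub_pc_type ilu', rather than a true
parallel ILU?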
> > > > > > Mark - if that helps - a Poisson equation is used for the > pressure so the Helmholtz is the same as for the velocity in the > interior. > > > > Thibaut > > > >> Le 31 oct. 2018 ? 21:05, Mark Adams > a ?crit : > >> > >> These are indefinite (bad) Helmholtz problems. Right? > >> > >> On Wed, Oct 31, 2018 at 2:38 PM Matthew Knepley > > wrote: > >> On Wed, Oct 31, 2018 at 2:13 PM Thibaut Appel > > wrote: > >> Hi Mark, Matthew, > >> > >> Thanks for taking the time. > >> > >> 1) You're not suggesting having -fieldsplit_X_ksp_type fgmres > for each field, are you? > >> > >> 2) No, the matrix has pressure in one of the fields. Here it's > a 2D problem (but we're also doing 3D), the unknowns are (p,u,v) > and those are my 3 fields. We are dealing with subsonic/transsonic > flows so it is convection dominated indeed. > >> > >> 3) We are in frequency domain with respect to time, i.e. > \partial{phi}/\partial{t} = -i*omega*phi. > >> > >> 4) Hypre is unfortunately not an option since we are in complex > arithmetic. > >> > >> > >> > >>> I'm not sure about "-fieldsplit_pc_type gamg" GAMG should work > on one block, and hence be a subpc. I'm not up on fieldsplit syntax. > >> According to the online manual page this syntax applies the > suffix to all the defined fields? > >> > >> > >> > >>> Mark is correct. I wanted you to change the smoother. He shows > how to change it to Richardson (make sure you add the self-scale > option), which is probably the best choice. > >>> > >>>? ?Thanks, > >>> > >>>? ? ? Matt > >> > >> You did tell me to set it to GMRES if I'm not mistaken, that's > why I tried "-fieldsplit_mg_levels_ksp_type gmres" (mentioned in > the email). Also, it wasn't clear whether these should be applied > to each block or the whole system, as the online manual pages + > .pdf manual barely mention smoothers and how to manipulate MG > objects with KSP/PC, this especially with PCFIELDSPLIT where > examples are scarce. > >> > >> From what I can gather from your suggestions I tried (lines > with X are repeated for X={0,1,2}) > >> > >> This looks good. How can an identically zero vector produce a 0 > residual? You should always monitor with > >> > >>? ?-ksp_monitor_true_residual. > >> > >>? ? Thanks, > >> > >>? ? ?Matt > >> -ksp_view_pre -ksp_monitor -ksp_converged_reason \ > >> -ksp_type fgmres -ksp_rtol 1.0e-8 \ > >> -pc_type fieldsplit \ > >> -pc_fieldsplit_type multiplicative \ > >> -pc_fieldsplit_block_size 3 \ > >> -pc_fieldsplit_0_fields 0 \ > >> -pc_fieldsplit_1_fields 1 \ > >> -pc_fieldsplit_2_fields 2 \ > >> -fieldsplit_X_pc_type gamg \ > >> -fieldsplit_X_ksp_type gmres \ > >> -fieldsplit_X_ksp_rtol 1e-10 \ > >> -fieldsplit_X_mg_levels_ksp_type richardson \ > >> -fieldsplit_X_mg_levels_pc_type sor \ > >> -fieldsplit_X_pc_gamg_agg_nsmooths 0 \ > >> -fieldsplit_X_mg_levels_ksp_richardson_self_scale \ > >> -log_view > >> > >> which yields > >> > >> KSP Object: 1 MPI processes > >>? ?type: fgmres > >>? ? ?restart=30, using Classical (unmodified) Gram-Schmidt > Orthogonalization with no iterative refinement > >>? ? ?happy breakdown tolerance 1e-30 > >>? ?maximum iterations=10000, initial guess is zero > >>? ?tolerances:? relative=1e-08, absolute=1e-50, divergence=10000. > >>? ?left preconditioning > >>? ?using DEFAULT norm type for convergence test > >> PC Object: 1 MPI processes > >>? ?type: fieldsplit > >>? ?PC has not been set up so information may be incomplete > >>? ? ?FieldSplit with MULTIPLICATIVE composition: total splits = > 3, blocksize = 3 > >>? ? 
> >>>
> >>> And since then, after adding '-pc_gamg_sym_graph true' I have been
> >>> getting
> >>> [0]PETSC ERROR: --------------------- Error Message
> >>> --------------------------------------------------------------
> >>> [0]PETSC ERROR: Petsc has generated inconsistent data
> >>> [0]PETSC ERROR: Eigen estimator failed: DIVERGED_NANORINF at iteration
> >>>
> >>> -ksp_chebyshev_esteig_noisy 0/1 does not change anything
> >>>
> >>> Knowing that Chebyshev eigen estimator needs a positive spectrum I
> >>> tried '-mg_levels_ksp_type gmres' but iterations would just go on
> >>> endlessly.
> >>>
> >>> This is OK, but you need to use '-ksp_type fgmres' (this could be why
> >>> it is failing ...).
> >>>
> >>> It looks like your matrix is 1) just the velocity field and 2) very
> >>> unsymmetric (eg, convection dominated). I would start with
> >>> '-mg_levels_ksp_type richardson -mg_levels_pc_type sor'.
> >>>
> >>> I would also start with unsmoothed aggregation: '-pc_gamg_nsmooths 0'
> >>>
> >>> It seems that I have indeed eigenvalues of rather high magnitude in
> >>> the spectrum of my operator without being able to determine the
> >>> reason.
> >>> The eigenvectors look like small artifacts at the wall-inflow or
> >>> wall-outflow corners with zero anywhere else but I do not know how to
> >>> interpret this.
> >>> Equations are time-harmonic linearized Navier-Stokes to which a
> >>> forcing is applied, there's no time-marching.
> >>>
> >>> You mean you are in frequency domain?
> >>>
> >>> Matrix is formed with a MPIAIJ type. The formulation is
> >>> incompressible, in complex arithmetic and the 2D physical domain is
> >>> mapped to a logically rectangular,
> >>>
> >>> This kind of messes up the null space that AMG depends on but AMG
> >>> theory is gone for NS anyway.
> >>>
> >>> regular collocated grid with a high-order finite difference method.
> >>> I determine the ownership of the rows/degrees of freedom of the
> >>> matrix with PetscSplitOwnership and I'm not using DMDA.
> >>>
> >>> Our iterative solvers are probably not going to work well on this but
> >>> you should test hypre also (-pc_type hypre -pc_hypre_type boomeramg).
> >>> You need to configure PETSc to download hypre.
> >>>
> >>> Mark
> >>>
> >>> The Fortran application code is memory-leak free and has undergone a
> >>> strict verification/validation procedure for different variations of
> >>> the PDEs.
> >>>
> >>> If there's any problem with the matrix what could help for the
> >>> diagnostic? At this point I'm running out of ideas so I would really
> >>> appreciate additional suggestions and discussions.
> >>>
> >>> Thanks for your continued support,
> >>>
> >>> Thibaut
> >>
> >> --
> >> What most experimenters take for granted before they begin their
> >> experiments is infinitely more interesting than any results to which
> >> their experiments lead.
> >> -- Norbert Wiener
> >>
> >> https://www.cse.buffalo.edu/~knepley/

From mfadams at lbl.gov Mon Nov 5 14:12:10 2018
From: mfadams at lbl.gov (Mark Adams)
Date: Mon, 5 Nov 2018 15:12:10 -0500
Subject: [petsc-users] DIVERGED_NANORING with PC GAMG
In-Reply-To: <16cb9915-2b01-00ae-5060-966c0b4b48d3@imperial.ac.uk>
References: <34D20379-DEAC-4E98-8FA8-E7EC2831BE6D@ic.ac.uk> <991561e5-1b9a-dbb2-333d-501251967077@imperial.ac.uk> <16cb9915-2b01-00ae-5060-966c0b4b48d3@imperial.ac.uk>
Message-ID: 

On Mon, Nov 5, 2018 at 12:50 PM Thibaut Appel wrote:

> Hi Mark,
>
> Yes it doesn't seem to be usable.
> Unfortunately we're aiming to do 3D, so direct solvers are not a viable
> solution, PETSc's ILU is not parallel, and we can't use HYPRE (complex
> arithmetic).

I think SuperLU has a parallel ILU, but in my opinion parallel ILU is not
a big deal. Neither is optimal, and the math win (faster convergence) of
the parallel version is offset by the cost of synchronization, in some
form, for a true parallel ILU. So I think the PETSc default
gmres/(local)ILU is your best option.

> Thibaut
> On 01/11/2018 20:42, Mark Adams wrote:
> [...]

From jychang48 at gmail.com Mon Nov 5 14:22:27 2018
From: jychang48 at gmail.com (Justin Chang)
Date: Mon, 5 Nov 2018 13:22:27 -0700
Subject: [petsc-users] TAOIPM for AC optimal power flow
Message-ID: 

Hi everyone,

I am working on a generic AC optimal power flow solver, and I hope to use
DMNetwork's data structure and TAO's optimization solvers for this
purpose. Last time I inquired about IPM (maybe 3-4 years ago) I was told
that it's not suitable for large-scale networks, which is the direction
we're hoping to go here at NREL. I see right now from the documentation
website:

"This algorithm is more of a place-holder for future constrained
optimization algorithms and should not yet be used for large problems or
production code."

Are there any plans at the moment to have a high-performance
implementation of IPM in TAO? It would be nice to have a purely PETSc code
instead of interfacing to something like IPOPT for the optimization.

Thanks,
Justin

From knepley at gmail.com Mon Nov 5 14:47:29 2018
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 5 Nov 2018 15:47:29 -0500
Subject: [petsc-users] TAOIPM for AC optimal power flow
In-Reply-To: 
References: 
Message-ID: 

On Mon, Nov 5, 2018 at 3:23 PM Justin Chang via petsc-users
<petsc-users at mcs.anl.gov> wrote:

> [...]
Preliminary question. Are there reasons we expect IPM to be better than
active sets?

  Thanks,

    Matt

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
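
For context, the basic TAO driver that either algorithm would sit behind
looks roughly like this. This is a minimal sketch only: SolveOPF and
FormFunctionGradient are illustrative placeholders, not PETSc or DMNetwork
API, and a real IPM run would additionally register the inequality/equality
constraint callbacks.

#include <petsctao.h>

/* User-supplied objective/gradient callback: a placeholder name. */
extern PetscErrorCode FormFunctionGradient(Tao,Vec,PetscReal*,Vec,void*);

PetscErrorCode SolveOPF(Vec x, void *user)
{
  Tao            tao;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = TaoCreate(PETSC_COMM_WORLD,&tao);CHKERRQ(ierr);
  ierr = TaoSetType(tao,TAOIPM);CHKERRQ(ierr);   /* switchable at runtime with -tao_type */
  ierr = TaoSetInitialVector(tao,x);CHKERRQ(ierr);
  ierr = TaoSetObjectiveAndGradientRoutine(tao,FormFunctionGradient,user);CHKERRQ(ierr);
  /* IPM also needs TaoSetInequalityConstraintsRoutine() and the matching
     Jacobian routines; omitted here for brevity. */
  ierr = TaoSetFromOptions(tao);CHKERRQ(ierr);
  ierr = TaoSolve(tao);CHKERRQ(ierr);
  ierr = TaoDestroy(&tao);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

Because the type is set before TaoSetFromOptions(), IPM and any future
constrained algorithm can be compared from the command line without
recompiling.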

From t.appel17 at imperial.ac.uk Mon Nov 5 15:10:40 2018
From: t.appel17 at imperial.ac.uk (Appel, Thibaut)
Date: Mon, 5 Nov 2018 21:10:40 +0000
Subject: [petsc-users] DIVERGED_NANORING with PC GAMG
In-Reply-To: 
References: <34D20379-DEAC-4E98-8FA8-E7EC2831BE6D@ic.ac.uk> <991561e5-1b9a-dbb2-333d-501251967077@imperial.ac.uk> <16cb9915-2b01-00ae-5060-966c0b4b48d3@imperial.ac.uk>
Message-ID: 

"Local" as in serial?

Thibaut

On 5 Nov 2018, at 20:12, Mark Adams wrote:

> [...] So I think the PETSc default gmres/(local)ILU is your best option.
> [...]

From jychang48 at gmail.com Mon Nov 5 15:43:54 2018
From: jychang48 at gmail.com (Justin Chang)
Date: Mon, 5 Nov 2018 14:43:54 -0700
Subject: [petsc-users] TAOIPM for AC optimal power flow
In-Reply-To: 
References: 
Message-ID: 

I don't know them off the top of my head. The power systems guys I'm
working with are still formulating the math for their ACOPF framework, but
all I know is that it's going to eventually need optimization solvers that
handle not only simple bound constraints but inequality/equality
constraints as well. At this point I don't really care whether it's IPM or
active-set.

On Mon, Nov 5, 2018 at 1:47 PM Matthew Knepley wrote:

> Preliminary question. Are there reasons we expect IPM to be better than
> active sets?
> [...]

From mfadams at lbl.gov Mon Nov 5 15:50:12 2018
From: mfadams at lbl.gov (Mark Adams)
Date: Mon, 5 Nov 2018 16:50:12 -0500
Subject: [petsc-users] DIVERGED_NANORING with PC GAMG
In-Reply-To: 
References: <34D20379-DEAC-4E98-8FA8-E7EC2831BE6D@ic.ac.uk> <991561e5-1b9a-dbb2-333d-501251967077@imperial.ac.uk> <16cb9915-2b01-00ae-5060-966c0b4b48d3@imperial.ac.uk>
Message-ID: 

On Mon, Nov 5, 2018 at 4:11 PM Appel, Thibaut wrote:

> "Local" as in serial?

Block Jacobi with ILU as the solver on each block. Each block corresponds
to an MPI process by default. So it is completely parallel; it is just not
a true ILU. In the limit of one equation per processor it is just (point)
Jacobi.

> Thibaut
> [...]
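
Spelled out as command-line options, that combination is just the parallel
default. A sketch (the executable name and process count are placeholders,
and -sub_pc_factor_levels 0 restates the ILU(0) default):

mpiexec -n 16 ./mysolver \
  -ksp_type gmres \
  -pc_type bjacobi \
  -sub_pc_type ilu \
  -sub_pc_factor_levels 0 \
  -ksp_monitor_true_residual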

From francesco_magaletti at fastwebnet.it Tue Nov 6 04:11:15 2018
From: francesco_magaletti at fastwebnet.it (Francesco Magaletti)
Date: Tue, 6 Nov 2018 11:11:15 +0100
Subject: [petsc-users] need help with vector interpolation on nonuniform DMDA grids
Message-ID: <27356998-760A-431D-BB28-9FA41CA0ED40@fastwebnet.it>

Dear all,

I would like to ask you if there is an easy and efficient way in parallel
(by using PETSc functions) to interpolate a DMDA vector associated with a
nonuniform 1D grid to another DMDA vector with the same length but
associated with a different nonuniform grid.

Let me rephrase it to be as clear as I can:

I have two structured nonuniform 1D grids with coordinate vectors x[i] and
y[i]. Both domains have been discretized with the same number of points,
but the coordinate vectors x and y are different. I have a discretized
field u[i] = u(x[i]) and I would like to use these point values to
evaluate the values u(y[i]) in the points of the second grid.

I read on the manual pages that functions like DMCreateInterpolation or
similar work only with different but uniform DMDAs. Did I understand
correctly?

A naive approach, with a serial code, could be to find the points x[i] and
x[i+1] that surround the point y[j] for every j and then simply linearly
interpolate the values u[i] and u[i+1].
I suspect that this is not the most efficient way to do it. Moreover, it
won't work in parallel since, in principle, I do not know beforehand how
many ghost nodes could be necessary to perform all the interpolations.

Thank you in advance for your help!

Francesco

From imilian.hartig at gmail.com Tue Nov 6 04:14:08 2018
From: imilian.hartig at gmail.com (Maximilian Hartig)
Date: Tue, 6 Nov 2018 11:14:08 +0100
Subject: [petsc-users] Segmentation violation in DMPlexDistribute on MacOS
Message-ID: <7F9D2B7C-9C21-40AF-A4E5-7A1C844B0F30@gmail.com>

Hello petsc team,

I've run into an issue using DMPlexDistribute on Mac OS 10.14.1. Here's my
dilemma:
- compiling with petsc in debug mode (--with-debugging=true), the problem
  does not occur
- on two other (linux) machines, the error does not occur
- valgrind is not available for Mac OS Mojave at this point in time and it
  is unclear until when it will be
- the SEGV occurs both with a mesh from a gmsh file and with
  DMPlexCreateBoxMesh copied from
  https://www.mcs.anl.gov/petsc/petsc-current/src/dm/impls/plex/examples/tutorials/ex1.c.html
- I use the latest petsc on the maint branch from git
- petsc itself compiles and runs its tests (make [...] check) without
  problems on one and two processors.

Here's the code I run for testing:

#include <petscsys.h>
#include <petscdmplex.h>

int main(int argc, char **args)
{
  PetscErrorCode   ierr;
  PetscPartitioner part;
  DM               dm, distributedDm;
  const char       help[] = "no help";

  PetscInitialize(&argc,&args,(char*)0,help);
  /* Read the mesh from a gmsh file, with interpolation of intermediate
     entities enabled (PETSC_TRUE) */
  ierr = DMPlexCreateFromFile(PETSC_COMM_WORLD,"cylinder_smallring.msh",PETSC_TRUE,&dm); CHKERRQ(ierr);
  // ierr = DMPlexCreateBoxMesh(PETSC_COMM_WORLD, 2, PETSC_TRUE, NULL, NULL, NULL, NULL, PETSC_TRUE, &dm); CHKERRQ(ierr);
  ierr = DMPlexGetPartitioner(dm, &part); CHKERRQ(ierr);
  ierr = PetscPartitionerSetFromOptions(part); CHKERRQ(ierr);
  // ierr = PetscPartitionerSetType(part, PETSCPARTITIONERPARMETIS); CHKERRQ(ierr);
  /* Distribute with zero overlap; the SEGV occurs in this call */
  ierr = DMPlexDistribute(dm, 0, NULL, &distributedDm); CHKERRQ(ierr);
  if (distributedDm) {
    ierr = DMDestroy(&dm); CHKERRQ(ierr);
    dm = distributedDm;
  }
  ierr = DMDestroy(&dm); CHKERRQ(ierr);
  PetscFinalize();
  return 0;
}

Here's the error message:

scc-wkit-clx-237-216:cimply_playground maximilianhartig$ ~/petsc/arch-darwin-c/bin/mpiexec -np 2 ./meshTest
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
[0]PETSC ERROR: to get more information on the crash.
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Signal received
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.10.2, unknown
[0]PETSC ERROR: ./meshTest on a arch-darwin-c named Maximilians-MacBook.local by maximilianhartig Tue Nov  6 11:02:14 2018
[0]PETSC ERROR: Configure options PETSC_ARCH=arch-darwin-c --with-debugging=false --download-mumps --download-chacoa --download-scalapack --download-mpich --with-fc=gfortran-mp-6 --download-triangle --with-cc=gcc-mp-6 --with-cxx=g++-mp-6 --download-fblaspack --download-parmetis --download-metis --download-hdf5 --download chaco
[0]PETSC ERROR: #1 User provided function() line 0 in unknown file
application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0

I don't know how to proceed with this and would be thankful for a pointer
in the right direction.

Thanks,
Max

From stefano.zampini at gmail.com Tue Nov 6 04:23:40 2018
From: stefano.zampini at gmail.com (Stefano Zampini)
Date: Tue, 6 Nov 2018 13:23:40 +0300
Subject: [petsc-users] Segmentation violation in DMPlexDistribute on MacOS
In-Reply-To: <7F9D2B7C-9C21-40AF-A4E5-7A1C844B0F30@gmail.com>
References: <7F9D2B7C-9C21-40AF-A4E5-7A1C844B0F30@gmail.com>
Message-ID: 

> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger

Have you tried what PETSc suggests here (-on_error_attach_debugger)?
You may need to specify "-on_error_attach_debugger lldb" on MacOS.

--
Stefano

From knepley at gmail.com Tue Nov 6 08:14:59 2018
From: knepley at gmail.com (Matthew Knepley)
Date: Tue, 6 Nov 2018 09:14:59 -0500
Subject: [petsc-users] need help with vector interpolation on nonuniform DMDA grids
In-Reply-To: <27356998-760A-431D-BB28-9FA41CA0ED40@fastwebnet.it>
References: <27356998-760A-431D-BB28-9FA41CA0ED40@fastwebnet.it>
Message-ID: 

On Tue, Nov 6, 2018 at 5:11 AM Francesco Magaletti via petsc-users
<petsc-users at mcs.anl.gov> wrote:

> Dear all,
>
> I would like to ask you if there is an easy and efficient way in parallel
> [...]
> Thank you in advance for your help!

This has not been written, but is not that hard. You would first bin the
points into cells, and then use the interpolant from the discretization.
This is how we do it in the unstructured case. Actually, if you wrote your
nonuniform 1D meshes as DMPlexes, I think it would work right now (have
not tested).

  Thanks,

    Matt

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
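
As an illustration of the binning idea in serial, here is a sketch (not
PETSc API): it assumes both coordinate vectors are sorted ascending and
that the y points stay inside [x[0], x[n-1]]; a parallel version would
additionally need enough ghost values of x and u around each subdomain.

#include <petscsys.h>

/* Evaluate the piecewise-linear interpolant of (x[i], u[i]) at the points
   y[j]. Since x and y are both sorted, the bin index i only ever moves
   forward, so the whole sweep is O(n). Illustrative sketch only. */
static void InterpolateLinear1D(PetscInt n, const PetscReal x[], const PetscScalar u[],
                                const PetscReal y[], PetscScalar v[])
{
  PetscInt  i = 0, j;
  PetscReal t;

  for (j = 0; j < n; ++j) {
    while (i < n - 2 && x[i+1] < y[j]) ++i;  /* bin y[j] into the cell [x[i], x[i+1]] */
    t    = (y[j] - x[i]) / (x[i+1] - x[i]);
    v[j] = (1.0 - t) * u[i] + t * u[i+1];    /* linear interpolant on that cell */
  }
}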

From knepley at gmail.com Tue Nov 6 08:21:33 2018
From: knepley at gmail.com (Matthew Knepley)
Date: Tue, 6 Nov 2018 09:21:33 -0500
Subject: [petsc-users] Segmentation violation in DMPlexDistribute on MacOS
In-Reply-To: 
References: <7F9D2B7C-9C21-40AF-A4E5-7A1C844B0F30@gmail.com>
Message-ID: 

On Tue, Nov 6, 2018 at 5:24 AM Stefano Zampini via petsc-users
<petsc-users at mcs.anl.gov> wrote:

> Have you tried what PETSc suggests here (-on_error_attach_debugger)?
> You may need to specify "-on_error_attach_debugger lldb" on MacOS.

1) So that error occurs with the smallest BoxMesh? That would be 6 tets.

2) I would really need a stack to tell you anything. You can get this by
running in the debugger.

This is something we run every night (exactly this), so I suspect it's
some inconsistency in that build.

  Thanks,

    Matt

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

From imilian.hartig at gmail.com Tue Nov 6 08:37:00 2018
From: imilian.hartig at gmail.com (Maximilian Hartig)
Date: Tue, 6 Nov 2018 15:37:00 +0100
Subject: [petsc-users] Segmentation violation in DMPlexDistribute on MacOS
In-Reply-To: 
References: <7F9D2B7C-9C21-40AF-A4E5-7A1C844B0F30@gmail.com>
Message-ID: <4F0544B5-C0BA-466E-AD27-018A7F045EED@gmail.com>

lldb returns the following:

(lldb) process attach --pid 1082
Process 1082 stopped
* thread #1, queue = 'com.apple.main-thread', stop reason = signal SIGSTOP
    frame #0: 0x00007fff5aa5c876 libsystem_kernel.dylib`__semwait_signal + 10
libsystem_kernel.dylib`__semwait_signal:
->  0x7fff5aa5c876 <+10>: jae    0x7fff5aa5c880            ; <+20>
    0x7fff5aa5c878 <+12>: movq   %rax, %rdi
    0x7fff5aa5c87b <+15>: jmp    0x7fff5aa58e31            ; cerror
    0x7fff5aa5c880 <+20>: retq
Target 0: (meshTest) stopped.

Executable module set to "/Users/maximilianhartig/cimply_playground/./meshTest".
Architecture set to: x86_64h-apple-macosx.

Unfortunately, I do not know what to make of this. Do I have a memory jump
that's causing an error here? Google did not provide any useful results
(at least to my untrained eye). Using gdb I usually step through the
program and print out variables until I've located the error. But this
does not seem an option here. Please excuse my ignorance regarding
debuggers.

Thanks,
Max

> On 6. Nov 2018, at 11:23, Stefano Zampini wrote:
> [...]

From wence at gmx.li Tue Nov 6 08:39:45 2018
From: wence at gmx.li (Lawrence Mitchell)
Date: Tue, 6 Nov 2018 14:39:45 +0000
Subject: [petsc-users] Segmentation violation in DMPlexDistribute on MacOS
In-Reply-To: <4F0544B5-C0BA-466E-AD27-018A7F045EED@gmail.com>
References: <7F9D2B7C-9C21-40AF-A4E5-7A1C844B0F30@gmail.com> <4F0544B5-C0BA-466E-AD27-018A7F045EED@gmail.com>
Message-ID: 

> On 6 Nov 2018, at 14:37, Maximilian Hartig via petsc-users wrote:
>
> lldb returns the following:
> [...]
> I do unfortunately not know what to make of this.

PETSc uses a signal handler to attach the debugger. You need to go up the
stack until you get to the point where the signal handler was invoked (via
CHKERRQ or similar). Then you should be able to inspect the relevant
variables.

Cheers,

Lawrence

From knepley at gmail.com Tue Nov 6 08:42:55 2018
From: knepley at gmail.com (Matthew Knepley)
Date: Tue, 6 Nov 2018 09:42:55 -0500
Subject: [petsc-users] Segmentation violation in DMPlexDistribute on MacOS
In-Reply-To: <4F0544B5-C0BA-466E-AD27-018A7F045EED@gmail.com>
References: <7F9D2B7C-9C21-40AF-A4E5-7A1C844B0F30@gmail.com> <4F0544B5-C0BA-466E-AD27-018A7F045EED@gmail.com>
Message-ID: 

On Tue, Nov 6, 2018 at 9:38 AM Maximilian Hartig via petsc-users
<petsc-users at mcs.anl.gov> wrote:

> lldb returns the following:
> [...]

Okay, at the prompt you type 'bt'. This gives you the backtrace.

  Thanks,

    Matt

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
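
Putting Matt's and Lawrence's suggestions together, a typical session at
that point looks like this (the commands are standard lldb; the frame
number is made up for illustration, and the trailing remarks are
annotations, not part of the commands):

(lldb) bt                  # print the full backtrace of the stopped thread
(lldb) frame select 7      # jump to a frame of interest, e.g. the first PETSc/user frame
(lldb) frame variable      # inspect the local variables in that frame
(lldb) up                  # walk one frame further towards the caller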
>> You may need to specify "-on_error_attach_debugger lldb" on a MacOS
>>
>> --
>> Stefano

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

From francesco_magaletti at fastwebnet.it  Tue Nov 6 08:45:58 2018
From: francesco_magaletti at fastwebnet.it (Francesco Magaletti)
Date: Tue, 6 Nov 2018 15:45:58 +0100
Subject: [petsc-users] need help with vector interpolation on nonuniform DMDA grids
In-Reply-To:
References: <27356998-760A-431D-BB28-9FA41CA0ED40@fastwebnet.it>
Message-ID: <669D57CD-9870-4A79-94C9-5258442FEF4E@fastwebnet.it>

Hi Matt,

thanks for the reply, but I didn't catch exactly your point. What do you mean with "first bin the points into cells, and then use the interpolant from the discretization"? To me it sounds like my "naive" approach, but maybe I missed something in your suggestion.

My problem comes from the solution of a 1D time-evolving pde coupled with a hand-made mesh adapting algorithm. Therefore I'm already using a DMDA to manage all the communication issues among processors. Every n timesteps I move the grid, and then I need to evaluate the old solution on the new grid points; this is why I need an efficient way to do it. Is it feasible to let DMDA's objects speak with DMPlexes, in order to keep the previous code structure unaltered and use DMPlex only for the interpolation routine?

Thanks

> On 6 Nov 2018, at 15:14, Matthew Knepley wrote:
>
> On Tue, Nov 6, 2018 at 5:11 AM Francesco Magaletti via petsc-users <petsc-users at mcs.anl.gov> wrote:
>
>> Dear all,
>>
>> I would like to ask you if there is an easy and efficient way in parallel (by using PETSc functions) to interpolate a DMDA vector associated with a nonuniform 1D grid to another DMDA vector with the same length but associated with a different nonuniform grid.
>>
>> Let me rephrase it to be as clear as I can:
>>
>> I have two structured nonuniform 1D grids with coordinate vectors x[i] and y[i]. Both the domains have been discretized with the same number of points, but the coordinate vectors x and y are different. I have a discretized field u[i] = u(x[i]) and I would like to use these point values to evaluate the values u(y[i]) in the points of the second grid.
>>
>> I read on the manual pages that functions like DMCreateInterpolation or similar work only with different but uniform DMDAs. Did I understand correctly?
>>
>> A naive approach, with a serial code, could be to find the points x[i] and x[i+1] that surround the point y[j] for every j and then simply linearly interpolate the values u[i] and u[i+1]. I suspect that this is not the most efficient way to do it. Moreover it won't work in parallel since, in principle, I do not know beforehand how many ghost nodes could be necessary to perform all the interpolations.
>>
>> Thank you in advance for your help!
>
> This has not been written, but is not that hard. You would first bin the points into cells, and then use the interpolant from the discretization. This is how we do it in the unstructured case. Actually, if you wrote your nonuniform 1D meshes as DMPlexes, I think it would work right now (have not tested).
>
> Thanks,
>
>    Matt
>
>> Francesco
>
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
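(Since the "naive" serial version keeps coming up in this thread, here is roughly what it looks like -- a sketch only, with a made-up function name, assuming n >= 2, both x[] and y[] sorted ascending, and every y[j] inside [x[0], x[n-1]]:

#include <petscsys.h>

/* Evaluate the piecewise-linear interpolant of (x[i], u[i]) at the points y[j]. */
static void Interp1DLinear(PetscInt n, const PetscReal x[], const PetscScalar u[],
                           const PetscReal y[], PetscScalar v[])
{
  PetscInt i = 0, j;

  for (j = 0; j < n; ++j) {
    /* advance the bracket so that x[i] <= y[j] <= x[i+1] */
    while (i < n - 2 && x[i + 1] < y[j]) ++i;
    {
      const PetscReal t = (y[j] - x[i]) / (x[i + 1] - x[i]);
      v[j] = (1.0 - t) * u[i] + t * u[i + 1];
    }
  }
}

Because both arrays are sorted, the bracket index i only moves forward, so the whole pass is O(n). The parallel difficulty discussed in this thread is not this loop but getting each y[j] onto the rank that owns its bracketing interval; see the DMSwarm suggestion further down.)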
From edoardo.alinovi at gmail.com  Tue Nov 6 08:46:34 2018
From: edoardo.alinovi at gmail.com (Edoardo alinovi)
Date: Tue, 6 Nov 2018 15:46:34 +0100
Subject: [petsc-users] Correct use of PCFactorSetMatOrderingType
Message-ID:

Dear users,

I have a question about the correct use of the option "PCFactorSetMatOrderingType" in PETSc.

I am solving a problem with 2.5M unknowns distributed over 16 processors, and I am using the block Jacobi preconditioner and the MPIAIJ matrix format. I cannot figure out whether the above option can be useful in decreasing the computational time. Any suggestions or tips?

Thank you very much for the kind help

------

Edoardo Alinovi, Ph.D.

DICCA, Scuola Politecnica
Universita' di Genova
1, via Montallegro
16145 Genova, Italy

email: edoardo.alinovi at dicca.unige.it
Tel: +39 010 353 2540

From imilian.hartig at gmail.com  Tue Nov 6 08:50:23 2018
From: imilian.hartig at gmail.com (Maximilian Hartig)
Date: Tue, 6 Nov 2018 15:50:23 +0100
Subject: [petsc-users] Segmentation violation in DMPlexDistribute on MacOS
In-Reply-To:
References: <7F9D2B7C-9C21-40AF-A4E5-7A1C844B0F30@gmail.com> <4F0544B5-C0BA-466E-AD27-018A7F045EED@gmail.com>
Message-ID:

> On 6. Nov 2018, at 15:42, Matthew Knepley wrote:
>
> On Tue, Nov 6, 2018 at 9:38 AM Maximilian Hartig via petsc-users <petsc-users at mcs.anl.gov> wrote:
>
>> lldb returns the following:
>>
>> (lldb) process attach --pid 1082
>> Process 1082 stopped
>> * thread #1, queue = 'com.apple.main-thread', stop reason = signal SIGSTOP
>>     frame #0: 0x00007fff5aa5c876 libsystem_kernel.dylib`__semwait_signal + 10
>> libsystem_kernel.dylib`__semwait_signal:
>> ->  0x7fff5aa5c876 <+10>: jae 0x7fff5aa5c880 ; <+20>
>>     0x7fff5aa5c878 <+12>: movq %rax, %rdi
>>     0x7fff5aa5c87b <+15>: jmp 0x7fff5aa58e31 ; cerror
>>     0x7fff5aa5c880 <+20>: retq
>> Target 0: (meshTest) stopped.
>>
>> Executable module set to "/Users/maximilianhartig/cimply_playground/./meshTest".
>> Architecture set to: x86_64h-apple-macosx.
>
> Okay, at the prompt you type 'bt'. This gives you the backtrace.

Thanks for this hint. Would've had to search some time. What I get is the following:

(lldb) process attach --pid 1265
Process 1265 stopped
* thread #1, queue = 'com.apple.main-thread', stop reason = signal SIGSTOP
    frame #0: 0x00007fff74371876 libsystem_kernel.dylib`__semwait_signal + 10
libsystem_kernel.dylib`__semwait_signal:
->  0x7fff74371876 <+10>: jae 0x7fff74371880 ; <+20>
    0x7fff74371878 <+12>: movq %rax, %rdi
    0x7fff7437187b <+15>: jmp 0x7fff7436de31 ; cerror
    0x7fff74371880 <+20>: retq
Target 0: (meshTest) stopped.

Executable module set to "/Users/maximilianhartig/cimply_playground/./meshTest".
Architecture set to: x86_64h-apple-macosx.
(lldb) bt
* thread #1, queue = 'com.apple.main-thread', stop reason = signal SIGSTOP
  * frame #0: 0x00007fff74371876 libsystem_kernel.dylib`__semwait_signal + 10
    frame #1: 0x00007fff742fc830 libsystem_c.dylib`nanosleep + 199
    frame #2: 0x00007fff742fc692 libsystem_c.dylib`sleep + 41
    frame #3: 0x000000010e969fde libpetsc.3.10.dylib`PetscSleep(s=) at psleep.c:50
    frame #4: 0x000000010e9e2546 libpetsc.3.10.dylib`PetscAttachDebugger at adebug.c:390
    frame #5: 0x000000010e9e31ba libpetsc.3.10.dylib`PetscAttachDebuggerErrorHandler(comm=, line=, fun=, file=, num=, p=, mess=, ctx=0x0000000000000000) at adebug.c:452
    frame #6: 0x000000010e9e389f libpetsc.3.10.dylib`PetscError(comm=1140850689, line=0, func="User provided function", file=" unknown file", n=59, p=PETSC_ERROR_INITIAL, mess=0x0000000000000000) at err.c:367
    frame #7: 0x000000010e9e78f0 libpetsc.3.10.dylib`PetscSignalHandlerDefault(sig=, ptr=) at signal.c:152
    frame #8: 0x000000010e9e79b6 libpetsc.3.10.dylib`PetscSignalHandler_Private(sig=11) at signal.c:43
    frame #9: 0x00007fff7441eb3d libsystem_platform.dylib`_sigtramp + 29
    frame #10: 0x00007fff74220e97 libdyld.dylib`stack_not_16_byte_aligned_error + 1
(lldb)

Thanks,
Max

>   Thanks,
>
>     Matt
>
>> I do unfortunately not know what to make of this. Do I have a memory jump that's causing an error here? Google did not provide any useful results (at least to my untrained eye). Using gdb I usually step through the program and print out variables until I've located the error. But this does not seem an option here. Please excuse my ignorance regarding debuggers.
>>
>> Thanks,
>> Max
>>
>>> On 6. Nov 2018, at 11:23, Stefano Zampini wrote:
>>>
>>>> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
>>>
>>> Have you tried what PETSc suggests here (-on_error_attach_debugger)?
>>> You may need to specify "-on_error_attach_debugger lldb" on a MacOS
>>>
>>> --
>>> Stefano
>
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
From balay at mcs.anl.gov  Tue Nov 6 08:53:44 2018
From: balay at mcs.anl.gov (Balay, Satish)
Date: Tue, 6 Nov 2018 14:53:44 +0000
Subject: [petsc-users] Segmentation violation in DMPlexDistribute on MacOS
In-Reply-To: <7F9D2B7C-9C21-40AF-A4E5-7A1C844B0F30@gmail.com>
References: <7F9D2B7C-9C21-40AF-A4E5-7A1C844B0F30@gmail.com>
Message-ID:

> --with-cc=gcc-mp-6 --with-cxx=g++-mp-6

Is it possible to use xcode compilers? I've seen weird errors with mpich/openmpi and gcc on OSX - don't know how reliable they are on OSX.

It would be a good test to check if the same error is reproducible with them..

Satish

On Tue, 6 Nov 2018, Maximilian Hartig via petsc-users wrote:

> Hello petsc team,
>
> I've run into an issue using DMPlexDistribute on Mac OS 10.14.1. Here's my dilemma:
> - Compiling petsc in debug mode (--with-debugging=true); the problem does not occur
> - On two other (linux) machines, the error does not occur
> - valgrind is not available for Mac OS Mojave at this point in time and it is unclear until when it will be
> - SEGV occurs with a mesh from a gmsh file and with DMPlexCreateBoxMesh copied from https://www.mcs.anl.gov/petsc/petsc-current/src/dm/impls/plex/examples/tutorials/ex1.c.html
> - I use the latest petsc on the maint branch from git
> - petsc itself compiles and runs its tests (make [?] check) without problems on one and two processors.
>
> Here's the code I run for testing:
>
> #include <petscdmplex.h>
> #include <petscsf.h>
>
> int main(int argc, char **args) {
>   PetscErrorCode ierr;
>   PetscPartitioner part;
>   DM dm, distributedDm;
>   const char help[] = "no help";
>
>   PetscInitialize(&argc,&args,(char*)0,help);
>
>   ierr = DMPlexCreateFromFile(PETSC_COMM_WORLD,"cylinder_smallring.msh",PETSC_TRUE,&dm); CHKERRQ(ierr);
>   // ierr = DMPlexCreateBoxMesh(PETSC_COMM_WORLD, 2, PETSC_TRUE, NULL, NULL, NULL, NULL, PETSC_TRUE, &dm); CHKERRQ(ierr);
>   ierr = DMPlexGetPartitioner(dm, &part); CHKERRQ(ierr);
>   ierr = PetscPartitionerSetFromOptions(part); CHKERRQ(ierr);
>   // ierr = PetscPartitionerSetType(part, PETSCPARTITIONERPARMETIS); CHKERRQ(ierr);
>   ierr = DMPlexDistribute(dm, 0, NULL, &distributedDm); CHKERRQ(ierr);
>   if(distributedDm){
>     ierr = DMDestroy(&dm); CHKERRQ(ierr);
>     dm = distributedDm;
>   }
>   ierr = DMDestroy(&dm); CHKERRQ(ierr);
>
>   PetscFinalize();
>   return 0;
> }
>
> Here's the error message:
>
> scc-wkit-clx-237-216:cimply_playground maximilianhartig$ ~/petsc/arch-darwin-c/bin/mpiexec -np 2 ./meshTest
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
> [0]PETSC ERROR: to get more information on the crash.
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Signal received
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.10.2, unknown
> [0]PETSC ERROR: ./meshTest on a arch-darwin-c named Maximilians-MacBook.local by maximilianhartig Tue Nov 6 11:02:14 2018
> [0]PETSC ERROR: Configure options PETSC_ARCH=arch-darwin-c --with-debugging=false --download-mumps --download-chacoa --download-scalapack --download-mpich --with-fc=gfortran-mp-6 --download-triangle --with-cc=gcc-mp-6 --with-cxx=g++-mp-6 --download-fblaspack --download-parmetis --download-metis --download-hdf5 --download chaco
> [0]PETSC ERROR: #1 User provided function() line 0 in unknown file
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
>
> I don't know how to proceed with this and would be thankful for a pointer in the right direction.
> Thanks,
>
> Max
From knepley at gmail.com  Tue Nov 6 09:01:04 2018
From: knepley at gmail.com (Matthew Knepley)
Date: Tue, 6 Nov 2018 10:01:04 -0500
Subject: [petsc-users] need help with vector interpolation on nonuniform DMDA grids
In-Reply-To: <669D57CD-9870-4A79-94C9-5258442FEF4E@fastwebnet.it>
References: <27356998-760A-431D-BB28-9FA41CA0ED40@fastwebnet.it> <669D57CD-9870-4A79-94C9-5258442FEF4E@fastwebnet.it>
Message-ID:

On Tue, Nov 6, 2018 at 9:45 AM Francesco Magaletti <francesco_magaletti at fastwebnet.it> wrote:

> Hi Matt,
> thanks for the reply, but I didn't catch exactly your point. What do you mean with "first bin the points into cells, and then use the interpolant from the discretization"? To me it sounds like my "naive" approach, but maybe I missed something in your suggestion.

I did not read the whole mail before. Your naive approach is exactly right. However, in parallel, you need to keep track of the bounds of each process, send the points to the right process, and then do local location and interpolation. It is some bookkeeping.

Explaining this gave me another idea. I think that DMSwarm might do this automatically. You create a DMSwarm, stick in all the points you need to interpolate at, and set the cellDM to the DMDA used for interpolation. Then you call DMSwarmMigrate() to put the points on the correct process, and then interpolate. Tell me if this does not work. I am Cc'ing the author Dave May in case I have made an error.

  Thanks,

    Matt

> My problem comes from the solution of a 1D time-evolving pde coupled with a hand-made mesh adapting algorithm. Therefore I'm already using a DMDA to manage all the communication issues among processors. Every n timesteps I move the grid, and then I need to evaluate the old solution on the new grid points; this is why I need an efficient way to do it. Is it feasible to let DMDA's objects speak with DMPlexes, in order to keep the previous code structure unaltered and use DMPlex only for the interpolation routine?
>
> Thanks
>
>> On 6 Nov 2018, at 15:14, Matthew Knepley wrote:
>>
>> On Tue, Nov 6, 2018 at 5:11 AM Francesco Magaletti via petsc-users <petsc-users at mcs.anl.gov> wrote:
>>
>>> Dear all,
>>>
>>> I would like to ask you if there is an easy and efficient way in parallel (by using PETSc functions) to interpolate a DMDA vector associated with a nonuniform 1D grid to another DMDA vector with the same length but associated with a different nonuniform grid.
>>>
>>> Let me rephrase it to be as clear as I can:
>>>
>>> I have two structured nonuniform 1D grids with coordinate vectors x[i] and y[i]. Both the domains have been discretized with the same number of points, but the coordinate vectors x and y are different. I have a discretized field u[i] = u(x[i]) and I would like to use these point values to evaluate the values u(y[i]) in the points of the second grid.
>>>
>>> I read on the manual pages that functions like DMCreateInterpolation or similar work only with different but uniform DMDAs. Did I understand correctly?
>>>
>>> A naive approach, with a serial code, could be to find the points x[i] and x[i+1] that surround the point y[j] for every j and then simply linearly interpolate the values u[i] and u[i+1]. I suspect that this is not the most efficient way to do it. Moreover it won't work in parallel since, in principle, I do not know beforehand how many ghost nodes could be necessary to perform all the interpolations.
>>>
>>> Thank you in advance for your help!
>>
>> This has not been written, but is not that hard. You would first bin the points into cells, and then use the interpolant from the discretization. This is how we do it in the unstructured case. Actually, if you wrote your nonuniform 1D meshes as DMPlexes, I think it would work right now (have not tested).
>>
>> Thanks,
>>
>>    Matt
>>
>>> Francesco

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
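(A sketch of the DMSwarm route described above, for readers who want to try it. The calls -- DMSwarmSetCellDM(), DMSwarmMigrate(), and friends -- are the real API, but this exact sequence is untested here; 'da' is assumed to be the DMDA holding the old grid, 'y' the local array of new coordinates, and error checking is elided:

  DM         sw;                  /* particle swarm holding the target points y[j] */
  PetscReal *coor;
  PetscInt   j, ny_local;         /* number of new grid points this rank starts with */

  DMCreate(PETSC_COMM_WORLD, &sw);
  DMSetType(sw, DMSWARM);
  DMSetDimension(sw, 1);
  DMSwarmSetType(sw, DMSWARM_PIC);
  DMSwarmSetCellDM(sw, da);                      /* da = the DMDA you interpolate from */
  DMSwarmFinalizeFieldRegister(sw);
  DMSwarmSetLocalSizes(sw, ny_local, 0);
  DMSwarmGetField(sw, DMSwarmPICField_coor, NULL, NULL, (void **)&coor);
  for (j = 0; j < ny_local; ++j) coor[j] = y[j]; /* load the new grid coordinates */
  DMSwarmRestoreField(sw, DMSwarmPICField_coor, NULL, NULL, (void **)&coor);
  DMSwarmMigrate(sw, PETSC_TRUE);                /* ship each point to the rank owning its cell */

After the migration, each rank holds exactly the points that fall inside its cells, so the local interpolation needs no guessing about ghost widths.)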
From hzhang at mcs.anl.gov  Tue Nov 6 09:04:57 2018
From: hzhang at mcs.anl.gov (Zhang, Hong)
Date: Tue, 6 Nov 2018 15:04:57 +0000
Subject: [petsc-users] Correct use of PCFactorSetMatOrderingType
In-Reply-To:
References:
Message-ID:

Edoardo:
You can test the runtime option '-sub_pc_factor_mat_ordering_type' and use '-log_view' to get performance numbers for the different orderings, e.g., with petsc/src/ksp/ksp/examples/tutorials/ex2.c:

mpiexec -n 2 ./ex2 -ksp_view -sub_pc_type ilu -sub_pc_factor_mat_ordering_type nd

I do not think the ordering inside the block would affect ilu performance much. Let us know what you get.
Hong

> Dear users,
>
> I have a question about the correct use of the option "PCFactorSetMatOrderingType" in PETSc.
>
> I am solving a problem with 2.5M unknowns distributed over 16 processors, and I am using the block Jacobi preconditioner and the MPIAIJ matrix format. I cannot figure out whether the above option can be useful in decreasing the computational time. Any suggestions or tips?
>
> Thank you very much for the kind help
>
> ------
>
> Edoardo Alinovi, Ph.D.
>
> DICCA, Scuola Politecnica
> Universita' di Genova
> 1, via Montallegro
> 16145 Genova, Italy
>
> email: edoardo.alinovi at dicca.unige.it
> Tel: +39 010 353 2540

From knepley at gmail.com  Tue Nov 6 09:03:48 2018
From: knepley at gmail.com (Matthew Knepley)
Date: Tue, 6 Nov 2018 10:03:48 -0500
Subject: [petsc-users] Correct use of PCFactorSetMatOrderingType
In-Reply-To:
References:
Message-ID:

On Tue, Nov 6, 2018 at 9:47 AM Edoardo alinovi via petsc-users <petsc-users at mcs.anl.gov> wrote:

> Dear users,
>
> I have a question about the correct use of the option "PCFactorSetMatOrderingType" in PETSc.
>
> I am solving a problem with 2.5M unknowns distributed over 16 processors, and I am using the block Jacobi preconditioner and the MPIAIJ matrix format. I cannot figure out whether the above option can be useful in decreasing the computational time. Any suggestions or tips?

It depends on where the matrix comes from. If the initial ordering is bad, it can make a big difference. If not, then it won't.

  Thanks,

    Matt

> Thank you very much for the kind help
>
> ------
>
> Edoardo Alinovi, Ph.D.
>
> DICCA, Scuola Politecnica
> Universita' di Genova
> 1, via Montallegro
> 16145 Genova, Italy
>
> email: edoardo.alinovi at dicca.unige.it
> Tel: +39 010 353 2540

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
From edoardo.alinovi at gmail.com  Tue Nov 6 09:28:46 2018
From: edoardo.alinovi at gmail.com (Edoardo alinovi)
Date: Tue, 6 Nov 2018 16:28:46 +0100
Subject: [petsc-users] Correct use of PCFactorSetMatOrderingType
In-Reply-To:
References:
Message-ID:

Dear Hong and Matt,

thank you for your kind reply. I have just tested your suggestions and applied "-sub_pc_type ilu -sub_pc_factor_mat_ordering_type nd/rcm" and, in both cases, I have found a deterioration of performance with respect to doing nothing (thus just using the default PCBJACOBI). Is that normal? However, I guess this is very problem dependent.

------

Edoardo Alinovi, Ph.D.

DICCA, Scuola Politecnica
Universita' di Genova
1, via Montallegro
16145 Genova, Italy

email: edoardo.alinovi at dicca.unige.it
Tel: +39 010 353 2540
On Tue, Nov 6, 2018 at 4:04 PM Zhang, Hong wrote:

> Edoardo:
> You can test the runtime option '-sub_pc_factor_mat_ordering_type' and use '-log_view' to get performance numbers for the different orderings, e.g., with petsc/src/ksp/ksp/examples/tutorials/ex2.c:
>
> mpiexec -n 2 ./ex2 -ksp_view -sub_pc_type ilu -sub_pc_factor_mat_ordering_type nd
>
> I do not think the ordering inside the block would affect ilu performance much. Let us know what you get.
> Hong
>
>> Dear users,
>>
>> I have a question about the correct use of the option "PCFactorSetMatOrderingType" in PETSc.
>>
>> I am solving a problem with 2.5M unknowns distributed over 16 processors, and I am using the block Jacobi preconditioner and the MPIAIJ matrix format. I cannot figure out whether the above option can be useful in decreasing the computational time. Any suggestions or tips?
>>
>> Thank you very much for the kind help
>>
>> ------
>>
>> Edoardo Alinovi, Ph.D.
>>
>> DICCA, Scuola Politecnica
>> Universita' di Genova
>> 1, via Montallegro
>> 16145 Genova, Italy
>>
>> email: edoardo.alinovi at dicca.unige.it
>> Tel: +39 010 353 2540

From hzhang at mcs.anl.gov  Tue Nov 6 10:15:20 2018
From: hzhang at mcs.anl.gov (Zhang, Hong)
Date: Tue, 6 Nov 2018 16:15:20 +0000
Subject: [petsc-users] Correct use of PCFactorSetMatOrderingType
In-Reply-To:
References:
Message-ID:

Edoardo:
Interesting. I thought it would not affect performance much. What happens if you use '-sub_pc_type lu'?
Hong

> Dear Hong and Matt,
>
> thank you for your kind reply. I have just tested your suggestions and applied "-sub_pc_type ilu -sub_pc_factor_mat_ordering_type nd/rcm" and, in both cases, I have found a deterioration of performance with respect to doing nothing (thus just using the default PCBJACOBI). Is that normal? However, I guess this is very problem dependent.
>
> ------
>
> Edoardo Alinovi, Ph.D.
>
> DICCA, Scuola Politecnica
> Universita' di Genova
> 1, via Montallegro
> 16145 Genova, Italy
>
> email: edoardo.alinovi at dicca.unige.it
> Tel: +39 010 353 2540
>
> On Tue, Nov 6, 2018 at 4:04 PM Zhang, Hong <hzhang at mcs.anl.gov> wrote:
>
>> Edoardo:
>> You can test the runtime option '-sub_pc_factor_mat_ordering_type' and use '-log_view' to get performance numbers for the different orderings, e.g., with petsc/src/ksp/ksp/examples/tutorials/ex2.c:
>>
>> mpiexec -n 2 ./ex2 -ksp_view -sub_pc_type ilu -sub_pc_factor_mat_ordering_type nd
>>
>> I do not think the ordering inside the block would affect ilu performance much. Let us know what you get.
>> Hong
>>
>>> Dear users,
>>>
>>> I have a question about the correct use of the option "PCFactorSetMatOrderingType" in PETSc.
>>>
>>> I am solving a problem with 2.5M unknowns distributed over 16 processors, and I am using the block Jacobi preconditioner and the MPIAIJ matrix format. I cannot figure out whether the above option can be useful in decreasing the computational time. Any suggestions or tips?
>>>
>>> Thank you very much for the kind help
>>>
>>> ------
>>>
>>> Edoardo Alinovi, Ph.D.
>>>
>>> DICCA, Scuola Politecnica
>>> Universita' di Genova
>>> 1, via Montallegro
>>> 16145 Genova, Italy
>>>
>>> email: edoardo.alinovi at dicca.unige.it
>>> Tel: +39 010 353 2540
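(For the archive: the option being tested above can also be set from code. A minimal sketch, assuming a KSP whose operators are already set and which uses block Jacobi; the sub-KSPs only exist after KSPSetUp(), and error checking is abbreviated:

  KSP      ksp, *subksp;
  PC       pc, subpc;
  PetscInt nlocal, first;
  PetscErrorCode ierr;

  ierr = KSPGetPC(ksp, &pc); CHKERRQ(ierr);
  ierr = PCSetType(pc, PCBJACOBI); CHKERRQ(ierr);
  ierr = KSPSetUp(ksp); CHKERRQ(ierr);                /* creates the sub-KSPs */
  ierr = PCBJacobiGetSubKSP(pc, &nlocal, &first, &subksp); CHKERRQ(ierr);
  ierr = KSPGetPC(subksp[0], &subpc); CHKERRQ(ierr);  /* ordering applies per local block */
  ierr = PCSetType(subpc, PCILU); CHKERRQ(ierr);
  ierr = PCFactorSetMatOrderingType(subpc, MATORDERINGND); CHKERRQ(ierr); /* or MATORDERINGRCM */

This mirrors '-sub_pc_type ilu -sub_pc_factor_mat_ordering_type nd' from the options database; with '-sub_pc_type lu' the same PCFactorSetMatOrderingType() call applies to the LU factorization instead.)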
From imilian.hartig at gmail.com  Tue Nov 6 12:02:56 2018
From: imilian.hartig at gmail.com (Maximilian Hartig)
Date: Tue, 6 Nov 2018 19:02:56 +0100
Subject: [petsc-users] Segmentation violation in DMPlexDistribute on MacOS
In-Reply-To:
References: <7F9D2B7C-9C21-40AF-A4E5-7A1C844B0F30@gmail.com>
Message-ID: <4726571E-24BB-4D03-B0F6-3C11AC391B70@gmail.com>

> On 6. Nov 2018, at 15:53, Balay, Satish wrote:
>
>> --with-cc=gcc-mp-6 --with-cxx=g++-mp-6
>
> Is it possible to use xcode compilers? I've seen weird errors with mpich/openmpi and gcc on OSX - don't know how reliable they are on OSX.

Yes, using clang (Xcode) with gfortran from macports did indeed resolve the issue. Thank you.

Best,
Max

> It would be a good test to check if the same error is reproducible with them..
>
> Satish
>
> On Tue, 6 Nov 2018, Maximilian Hartig via petsc-users wrote:
>
>> Hello petsc team,
>>
>> I've run into an issue using DMPlexDistribute on Mac OS 10.14.1. Here's my dilemma:
>> - Compiling petsc in debug mode (--with-debugging=true); the problem does not occur
>> - On two other (linux) machines, the error does not occur
>> - valgrind is not available for Mac OS Mojave at this point in time and it is unclear until when it will be
>> - SEGV occurs with a mesh from a gmsh file and with DMPlexCreateBoxMesh copied from https://www.mcs.anl.gov/petsc/petsc-current/src/dm/impls/plex/examples/tutorials/ex1.c.html
>> - I use the latest petsc on the maint branch from git
>> - petsc itself compiles and runs its tests (make [?] check) without problems on one and two processors.
>>
>> Here's the code I run for testing:
>>
>> #include <petscdmplex.h>
>> #include <petscsf.h>
>>
>> int main(int argc, char **args) {
>>   PetscErrorCode ierr;
>>   PetscPartitioner part;
>>   DM dm, distributedDm;
>>   const char help[] = "no help";
>>
>>   PetscInitialize(&argc,&args,(char*)0,help);
>>
>>   ierr = DMPlexCreateFromFile(PETSC_COMM_WORLD,"cylinder_smallring.msh",PETSC_TRUE,&dm); CHKERRQ(ierr);
>>   // ierr = DMPlexCreateBoxMesh(PETSC_COMM_WORLD, 2, PETSC_TRUE, NULL, NULL, NULL, NULL, PETSC_TRUE, &dm); CHKERRQ(ierr);
>>   ierr = DMPlexGetPartitioner(dm, &part); CHKERRQ(ierr);
>>   ierr = PetscPartitionerSetFromOptions(part); CHKERRQ(ierr);
>>   // ierr = PetscPartitionerSetType(part, PETSCPARTITIONERPARMETIS); CHKERRQ(ierr);
>>   ierr = DMPlexDistribute(dm, 0, NULL, &distributedDm); CHKERRQ(ierr);
>>   if(distributedDm){
>>     ierr = DMDestroy(&dm); CHKERRQ(ierr);
>>     dm = distributedDm;
>>   }
>>   ierr = DMDestroy(&dm); CHKERRQ(ierr);
>>
>>   PetscFinalize();
>>   return 0;
>> }
>>
>> Here's the error message:
>>
>> scc-wkit-clx-237-216:cimply_playground maximilianhartig$ ~/petsc/arch-darwin-c/bin/mpiexec -np 2 ./meshTest
>> [0]PETSC ERROR: ------------------------------------------------------------------------
>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
>> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
>> [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
>> [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
>> [0]PETSC ERROR: to get more information on the crash.
>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>> [0]PETSC ERROR: Signal received
>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>> [0]PETSC ERROR: Petsc Release Version 3.10.2, unknown
>> [0]PETSC ERROR: ./meshTest on a arch-darwin-c named Maximilians-MacBook.local by maximilianhartig Tue Nov 6 11:02:14 2018
>> [0]PETSC ERROR: Configure options PETSC_ARCH=arch-darwin-c --with-debugging=false --download-mumps --download-chacoa --download-scalapack --download-mpich --with-fc=gfortran-mp-6 --download-triangle --with-cc=gcc-mp-6 --with-cxx=g++-mp-6 --download-fblaspack --download-parmetis --download-metis --download-hdf5 --download chaco
>> [0]PETSC ERROR: #1 User provided function() line 0 in unknown file
>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
>>
>> I don't know how to proceed with this and would be thankful for a pointer in the right direction.
>> Thanks,
>>
>> Max
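(For anyone who lands on this thread with the same crash: the working combination reported above was Xcode clang plus MacPorts gfortran. A configure line along these lines is a guess reconstructed from the options quoted above, not the verbatim command that was actually used:

./configure PETSC_ARCH=arch-darwin-c --with-cc=clang --with-cxx=clang++ \
    --with-fc=gfortran-mp-6 --with-debugging=false --download-mpich \
    --download-fblaslapack --download-metis --download-parmetis \
    --download-triangle --download-chaco --download-scalapack \
    --download-mumps --download-hdf5

Note that the quoted report also contains two mistyped options, "--download-chacoa" and a stray "--download chaco", as well as "--download-fblaspack"; the documented spellings are --download-chaco and --download-fblaslapack, as used above.)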
https://cdghlmnz.info/?cdghlmnz cdghlmnz https://bcgijmow.net/?bcgijmow bcgijmow http://moruwx.ua/?moruwx moruwx https://bhjnrt.ru/?bhjnrt bhjnrt https://eimoptu.ua/?eimoptu eimoptu https://achptv.info/?achptv achptv https://boptux.biz/?boptux boptux http://acmovz.ua/?acmovz acmovz https://abfmnpw.info/?abfmnpw abfmnpw https://afiknvz.biz/?afiknvz afiknvz http://ehnps.ua/?ehnps ehnps https://dnpuw.biz/?dnpuw dnpuw https://abflmns.biz/?abflmns abflmns https://fmuyz.biz/?fmuyz fmuyz http://bejkvz.info/?bejkvz bejkvz https://dfkmnyz.ua/?dfkmnyz dfkmnyz http://cfgnsvx.biz/?cfgnsvx cfgnsvx http://abdjsw.ru/?abdjsw abdjsw https://abelrvx.biz/?abelrvx abelrvx https://abjmoy.ru/?abjmoy abjmoy http://hmnqz.net/?hmnqz hmnqz https://dhkntv.ru/?dhkntv dhkntv http://bgqsyz.biz/?bgqsyz bgqsyz http://eghkmsux.net/?eghkmsux eghkmsux https://acehlrsv.info/?acehlrsv acehlrsv https://lnpruvy.com/?lnpruvy lnpruvy http://cefmtu.ua/?cefmtu cefmtu https://hjnqrt.net/?hjnqrt hjnqrt http://ceiktvw.ru/?ceiktvw ceiktvw http://cfjmu.info/?cfjmu cfjmu https://cfhmn.com/?cfhmn cfhmn http://cdfqr.net/?cdfqr cdfqr http://adelmvz.ru/?adelmvz adelmvz http://abflnsvx.com/?abflnsvx abflnsvx http://bjpqv.biz/?bjpqv bjpqv https://fjpqux.biz/?fjpqux fjpqux http://fpqtvyz.net/?fpqtvyz fpqtvyz http://cdflstwz.net/?cdflstwz cdflstwz https://egjostux.ua/?egjostux egjostux https://cjlnpy.ua/?cjlnpy cjlnpy https://bcfgmz.com/?bcfgmz bcfgmz http://dfnouz.ua/?dfnouz dfnouz http://cfikqv.net/?cfikqv cfikqv https://dehkqw.net/?dehkqw dehkqw http://fhkqs.com/?fhkqs fhkqs http://dijkl.com/?dijkl dijkl http://aeglu.com/?aeglu aeglu http://dgikmpy.info/?dgikmpy dgikmpy https://brsuz.info/?brsuz brsuz http://acfglopu.ru/?acfglopu acfglopu https://cdjqx.net/?cdjqx cdjqx https://aefimst.biz/?aefimst aefimst http://bfikmuw.net/?bfikmuw bfikmuw http://abhioqvw.ua/?abhioqvw abhioqvw http://abdilmo.net/?abdilmo abdilmo https://bhlovx.info/?bhlovx bhlovx https://fgvwxz.info/?fgvwxz fgvwxz http://afhjpqsv.biz/?afhjpqsv afhjpqsv http://deilrsv.com/?deilrsv deilrsv http://cijlmnt.ru/?cijlmnt cijlmnt https://bfkmtw.ua/?bfkmtw bfkmtw https://adkxy.ru/?adkxy adkxy https://blpqxy.com/?blpqxy blpqxy https://efgilor.net/?efgilor efgilor https://abcdky.info/?abcdky abcdky https://befgikqu.ru/?befgikqu befgikqu https://ipsuvwy.com/?ipsuvwy ipsuvwy https://fjlsu.biz/?fjlsu fjlsu https://kpquw.com/?kpquw kpquw https://bfjkoqvy.net/?bfjkoqvy bfjkoqvy http://dijopsz.biz/?dijopsz dijopsz http://acfhrsz.ua/?acfhrsz acfhrsz https://bfhrstvy.ua/?bfhrstvy bfhrstvy http://afglw.info/?afglw afglw https://bcjqrvw.net/?bcjqrvw bcjqrvw https://hiotz.ru/?hiotz hiotz https://abefmswx.biz/?abefmswx abefmswx http://acgkopyz.ua/?acgkopyz acgkopyz https://fijln.com/?fijln fijln http://befmquwz.info/?befmquwz befmquwz https://hijory.ua/?hijory hijory http://cdfhr.info/?cdfhr cdfhr https://ahiksx.ua/?ahiksx ahiksx https://aghmxy.ru/?aghmxy aghmxy https://hnopqt.net/?hnopqt hnopqt https://fiklnrsw.ru/?fiklnrsw fiklnrsw http://hknpv.ru/?hknpv hknpv https://abdfmotw.net/?abdfmotw abdfmotw http://bcfhpvx.biz/?bcfhpvx bcfhpvx http://beimrty.net/?beimrty beimrty http://dgnsvwy.ua/?dgnsvwy dgnsvwy https://aghjstvw.com/?aghjstvw aghjstvw http://gpqsuxy.ru/?gpqsuxy gpqsuxy https://finop.info/?finop finop http://bcdhsy.biz/?bcdhsy bcdhsy http://cfhmnvy.net/?cfhmnvy cfhmnvy https://hjopuw.net/?hjopuw hjopuw https://akmou.biz/?akmou akmou https://fgkntuw.biz/?fgkntuw fgkntuw https://adfjsw.com/?adfjsw adfjsw https://bdjmrsty.biz/?bdjmrsty bdjmrsty http://grtuz.ru/?grtuz grtuz 
https://cgiqw.info/?cgiqw cgiqw http://acenpvw.ru/?acenpvw acenpvw https://dmnrz.ru/?dmnrz dmnrz http://deglqtu.com/?deglqtu deglqtu http://ilpqrtuw.com/?ilpqrtuw ilpqrtuw https://gnoqvxz.com/?gnoqvxz gnoqvxz https://hijkmsu.ru/?hijkmsu hijkmsu http://cistxyz.biz/?cistxyz cistxyz https://adklmns.ru/?adklmns adklmns https://hmqrwy.ua/?hmqrwy hmqrwy https://crsuv.ru/?crsuv crsuv http://abejkmoz.ua/?abejkmoz abejkmoz http://abjpqu.ua/?abjpqu abjpqu https://ehjkopuz.info/?ehjkopuz ehjkopuz http://iovwz.ua/?iovwz iovwz https://emntvwx.net/?emntvwx emntvwx http://eghksx.info/?eghksx eghksx https://bmnps.com/?bmnps bmnps https://cinopuv.net/?cinopuv cinopuv http://efijnptz.biz/?efijnptz efijnptz http://cehilnp.net/?cehilnp cehilnp http://denprw.ru/?denprw denprw http://adglpwz.net/?adglpwz adglpwz http://dehimosu.biz/?dehimosu dehimosu https://abchprwy.com/?abchprwy abchprwy http://dlnory.info/?dlnory dlnory http://cehnwyz.com/?cehnwyz cehnwyz http://aghlopy.ua/?aghlopy aghlopy http://fhjksuvz.com/?fhjksuvz fhjksuvz http://bijmy.com/?bijmy bijmy https://adhjw.com/?adhjw adhjw https://emnprwyz.info/?emnprwyz emnprwyz http://dgmnw.ru/?dgmnw dgmnw https://cdgimruv.com/?cdgimruv cdgimruv http://bcdjqst.biz/?bcdjqst bcdjqst https://lmquv.ru/?lmquv lmquv http://degiptv.info/?degiptv degiptv http://hlnvy.ru/?hlnvy hlnvy http://gklmvx.info/?gklmvx gklmvx http://behlpstz.com/?behlpstz behlpstz https://efginz.com/?efginz efginz https://deilmnru.biz/?deilmnru deilmnru http://bdehoqux.ru/?bdehoqux bdehoqux http://doqvy.info/?doqvy doqvy http://jlorw.info/?jlorw jlorw https://cfmtvy.net/?cfmtvy cfmtvy http://agkmqvxy.biz/?agkmqvxy agkmqvxy https://ehnrux.ru/?ehnrux ehnrux https://ahlmpyz.biz/?ahlmpyz ahlmpyz https://dhilqxyz.com/?dhilqxyz dhilqxyz http://beosz.biz/?beosz beosz http://jkmtvx.ru/?jkmtvx jkmtvx http://aefiuw.ua/?aefiuw aefiuw http://bgimy.info/?bgimy bgimy https://acdfhimy.info/?acdfhimy acdfhimy http://bdfpv.info/?bdfpv bdfpv https://lnopux.info/?lnopux lnopux https://iknqstux.ua/?iknqstux iknqstux https://binxy.ru/?binxy binxy http://abflsv.net/?abflsv abflsv http://cghkmtvz.info/?cghkmtvz cghkmtvz http://flpqrxz.biz/?flpqrxz flpqrxz https://cfgnvz.ua/?cfgnvz cfgnvz https://jtuvx.com/?jtuvx jtuvx https://fgikl.ru/?fgikl fgikl http://ehikrs.ru/?ehikrs ehikrs http://beghqx.ru/?beghqx beghqx http://dhijlrsw.com/?dhijlrsw dhijlrsw http://elqst.ru/?elqst elqst https://abfklmns.biz/?abfklmns abfklmns https://cilnorw.ua/?cilnorw cilnorw https://anoqu.ua/?anoqu anoqu http://adikquz.com/?adikquz adikquz http://abcpwyz.ua/?abcpwyz abcpwyz https://fiqwxz.ru/?fiqwxz fiqwxz http://bcivw.ru/?bcivw bcivw https://acdfnrz.ua/?acdfnrz acdfnrz https://jmopqtw.info/?jmopqtw jmopqtw https://abdeghtv.biz/?abdeghtv abdeghtv http://aceoprw.net/?aceoprw aceoprw http://forvy.ua/?forvy forvy https://abgjknpv.com/?abgjknpv abgjknpv http://bcdgimr.info/?bcdgimr bcdgimr https://flntvw.biz/?flntvw flntvw http://bcdnqvx.net/?bcdnqvx bcdnqvx http://defjnqvw.com/?defjnqvw defjnqvw http://ekmouvwy.biz/?ekmouvwy ekmouvwy http://cijpqsxz.biz/?cijpqsxz cijpqsxz http://dopstyz.ua/?dopstyz dopstyz http://cinow.com/?cinow cinow http://cdejmrs.info/?cdejmrs cdejmrs http://fgknprwz.info/?fgknprwz fgknprwz https://binpquxy.com/?binpquxy binpquxy https://afhqrtu.biz/?afhqrtu afhqrtu http://cemqtv.ru/?cemqtv cemqtv http://bgnqsz.info/?bgnqsz bgnqsz http://cdfjprsv.ua/?cdfjprsv cdfjprsv https://jmqstz.biz/?jmqstz jmqstz http://adhuw.info/?adhuw adhuw http://deswz.ua/?deswz deswz https://ehkmnw.com/?ehkmnw ehkmnw http://kmsxy.ua/?kmsxy 
kmsxy http://benotvw.info/?benotvw benotvw https://dhinops.biz/?dhinops dhinops https://hklnpqtv.ru/?hklnpqtv hklnpqtv https://gjnqrtwx.info/?gjnqrtwx gjnqrtwx https://dortvw.biz/?dortvw dortvw http://cdghjp.com/?cdghjp cdghjp http://abchnoqx.com/?abchnoqx abchnoqx http://abnsz.biz/?abnsz abnsz https://acdnqwy.info/?acdnqwy acdnqwy http://befilpv.com/?befilpv befilpv http://fijrux.biz/?fijrux fijrux https://aknqrtuz.ru/?aknqrtuz aknqrtuz http://abfoqwz.ru/?abfoqwz abfoqwz https://defikv.info/?defikv defikv http://abcoqtz.com/?abcoqtz abcoqtz https://dfirxz.net/?dfirxz dfirxz http://bdfipqux.net/?bdfipqux bdfipqux http://bcgqx.net/?bcgqx bcgqx https://bdfhl.ua/?bdfhl bdfhl http://ahikmpqt.info/?ahikmpqt ahikmpqt https://aghqrt.com/?aghqrt aghqrt https://afhosxz.ru/?afhosxz afhosxz https://ehsvx.ru/?ehsvx ehsvx http://gknquvw.info/?gknquvw gknquvw https://gilmpqw.ru/?gilmpqw gilmpqw http://fhopy.biz/?fhopy fhopy https://bcmpux.net/?bcmpux bcmpux https://adfmrsuy.info/?adfmrsuy adfmrsuy https://fmtuvx.net/?fmtuvx fmtuvx https://defjmswy.net/?defjmswy defjmswy https://ahijmrsz.info/?ahijmrsz ahijmrsz https://cerwx.biz/?cerwx cerwx https://behqz.biz/?behqz behqz https://behioqyz.ru/?behioqyz behioqyz https://admrsvxz.com/?admrsvxz admrsvxz https://cefhipqw.com/?cefhipqw cefhipqw http://deopsx.ru/?deopsx deopsx http://acfgksz.info/?acfgksz acfgksz http://hjlmqst.info/?hjlmqst hjlmqst http://efgnrsux.net/?efgnrsux efgnrsux http://adflovz.info/?adflovz adflovz http://acopu.ua/?acopu acopu https://chilnqrs.ru/?chilnqrs chilnqrs https://blosy.net/?blosy blosy https://gijnpuxy.com/?gijnpuxy gijnpuxy https://bcemuvwz.ua/?bcemuvwz bcemuvwz http://bdfqwx.net/?bdfqwx bdfqwx http://dfghilr.net/?dfghilr dfghilr http://bdlqstv.com/?bdlqstv bdlqstv http://adhjnrsz.com/?adhjnrsz adhjnrsz http://adiopqx.com/?adiopqx adiopqx http://ehnoqw.info/?ehnoqw ehnoqw https://dgpsx.net/?dgpsx dgpsx http://afgrt.ru/?afgrt afgrt http://bghipuvz.ua/?bghipuvz bghipuvz http://egjmvw.com/?egjmvw egjmvw https://eginoswz.com/?eginoswz eginoswz http://bekuv.info/?bekuv bekuv http://bglno.info/?bglno bglno http://hkoptxz.com/?hkoptxz hkoptxz https://aikqz.biz/?aikqz aikqz https://oqvxy.net/?oqvxy oqvxy http://kruwy.net/?kruwy kruwy http://fhlqty.net/?fhlqty fhlqty http://chpuvyz.ru/?chpuvyz chpuvyz https://aciloqt.ua/?aciloqt aciloqt http://cklsw.info/?cklsw cklsw https://fghkmvy.info/?fghkmvy fghkmvy http://belmpvz.biz/?belmpvz belmpvz http://eimpxz.ru/?eimpxz eimpxz http://cdjstuz.info/?cdjstuz cdjstuz https://adivy.com/?adivy adivy https://alnrty.ru/?alnrty alnrty https://hkmntz.info/?hkmntz hkmntz https://dkpqr.com/?dkpqr dkpqr https://bfntv.ru/?bfntv bfntv https://inuvwyz.ru/?inuvwyz inuvwyz https://jkmrvyz.ua/?jkmrvyz jkmrvyz http://bfipruw.biz/?bfipruw bfipruw http://cjklos.ua/?cjklos cjklos http://afglru.ru/?afglru afglru http://defrt.ua/?defrt defrt https://afghsvy.ua/?afghsvy afghsvy http://cehiou.info/?cehiou cehiou https://mnrvz.com/?mnrvz mnrvz http://fgtuxz.biz/?fgtuxz fgtuxz http://egiqtuy.ua/?egiqtuy egiqtuy https://dilnv.ru/?dilnv dilnv http://abeny.biz/?abeny abeny http://aijvxz.biz/?aijvxz aijvxz https://abdhrsx.net/?abdhrsx abdhrsx http://bfgmrxy.ru/?bfgmrxy bfgmrxy https://fjkow.info/?fjkow fjkow https://fkruvx.biz/?fkruvx fkruvx http://efhjsw.biz/?efhjsw efhjsw http://djkmops.info/?djkmops djkmops https://cequyz.ru/?cequyz cequyz https://hmswz.com/?hmswz hmswz http://lnotu.info/?lnotu lnotu http://bdjqu.info/?bdjqu bdjqu http://adejrxy.ua/?adejrxy adejrxy http://cdgikmps.ru/?cdgikmps cdgikmps 
http://dhmostwy.net/?dhmostwy dhmostwy https://cdhlquz.ua/?cdhlquz cdhlquz https://bilnor.ua/?bilnor bilnor https://fnqxz.com/?fnqxz fnqxz https://bijnuwz.com/?bijnuwz bijnuwz http://abcgikwy.ua/?abcgikwy abcgikwy https://begimrv.ua/?begimrv begimrv https://jlprv.net/?jlprv jlprv http://efhsz.ua/?efhsz efhsz https://dgiqs.ua/?dgiqs dgiqs http://bgiprx.net/?bgiprx bgiprx https://cdklmqv.biz/?cdklmqv cdklmqv https://adgnowz.ua/?adgnowz adgnowz http://cfjnorty.ru/?cfjnorty cfjnorty https://bcfhpqvy.ru/?bcfhpqvy bcfhpqvy http://beiqv.biz/?beiqv beiqv https://bjmnu.com/?bjmnu bjmnu https://fgjoquvz.ru/?fgjoquvz fgjoquvz http://fkpwx.info/?fkpwx fkpwx http://dfghty.com/?dfghty dfghty https://dhlmtvxy.biz/?dhlmtvxy dhlmtvxy https://bejuy.com/?bejuy bejuy http://cdikry.com/?cdikry cdikry https://bfhkrty.info/?bfhkrty bfhkrty https://cekpst.com/?cekpst cekpst https://ceknpx.ru/?ceknpx ceknpx https://gjpwy.com/?gjpwy gjpwy https://ceouwxyz.ru/?ceouwxyz ceouwxyz https://ijkopstu.ua/?ijkopstu ijkopstu https://acegjlmo.net/?acegjlmo acegjlmo http://bflnpv.info/?bflnpv bflnpv http://fijkotz.ua/?fijkotz fijkotz http://ajkptwyz.com/?ajkptwyz ajkptwyz http://adjpvw.net/?adjpvw adjpvw https://dpqrsv.biz/?dpqrsv dpqrsv https://acehklps.ua/?acehklps acehklps http://aghsuvyz.biz/?aghsuvyz aghsuvyz https://akotu.ru/?akotu akotu https://dkqrt.biz/?dkqrt dkqrt http://dlrtuxz.info/?dlrtuxz dlrtuxz http://fstvx.ua/?fstvx fstvx http://ejmsv.ua/?ejmsv ejmsv http://abfmyz.com/?abfmyz abfmyz http://cdiqrsvx.info/?cdiqrsvx cdiqrsvx https://fqsuz.info/?fqsuz fqsuz https://cilsu.info/?cilsu cilsu https://bjklnsxz.net/?bjklnsxz bjklnsxz https://demnpst.biz/?demnpst demnpst https://bghikms.biz/?bghikms bghikms http://achjtu.biz/?achjtu achjtu http://npqrwz.net/?npqrwz npqrwz http://adehilpq.net/?adehilpq adehilpq http://bnortwy.com/?bnortwy bnortwy http://cegpsuz.ru/?cegpsuz cegpsuz http://aksyz.ua/?aksyz aksyz http://egkmosux.biz/?egkmosux egkmosux https://dimoruy.com/?dimoruy dimoruy https://ehopyz.net/?ehopyz ehopyz https://ehijptuy.com/?ehijptuy ehijptuy http://jklmprtv.ua/?jklmprtv jklmprtv http://abmnrw.ru/?abmnrw abmnrw http://abcfqu.ru/?abcfqu abcfqu https://degijyz.ru/?degijyz degijyz https://cikortuy.net/?cikortuy cikortuy http://abgnrw.ua/?abgnrw abgnrw http://bfgknpqt.com/?bfgknpqt bfgknpqt https://afmqtv.ru/?afmqtv afmqtv https://bertuvwz.info/?bertuvwz bertuvwz https://eipuvwxy.com/?eipuvwxy eipuvwxy http://ahjory.info/?ahjory ahjory http://acefvx.com/?acefvx acefvx http://acdlo.com/?acdlo acdlo http://cdjqs.biz/?cdjqs cdjqs http://bdexy.net/?bdexy bdexy http://bktxz.info/?bktxz bktxz http://adegmovy.biz/?adegmovy adegmovy http://fkxyz.ua/?fkxyz fkxyz http://bdikny.biz/?bdikny bdikny https://cistv.biz/?cistv cistv https://behknqtz.biz/?behknqtz behknqtz http://efhjquwx.biz/?efhjquwx efhjquwx https://djmnosz.com/?djmnosz djmnosz http://adino.com/?adino adino http://djlmouv.ua/?djlmouv djlmouv http://dehrvy.biz/?dehrvy dehrvy http://cefijlu.com/?cefijlu cefijlu http://bfgitz.ua/?bfgitz bfgitz http://ilmqsz.ua/?ilmqsz ilmqsz http://aejnrwy.ru/?aejnrwy aejnrwy https://lnuwz.ru/?lnuwz lnuwz http://jlnrv.com/?jlnrv jlnrv https://bjstxz.ru/?bjstxz bjstxz https://fhjnw.com/?fhjnw fhjnw https://ajmnuz.com/?ajmnuz ajmnuz https://ghptxy.net/?ghptxy ghptxy https://aceirsxy.ua/?aceirsxy aceirsxy http://bcemosuw.info/?bcemosuw bcemosuw https://cgklpxy.ua/?cgklpxy cgklpxy https://deghijrs.net/?deghijrs deghijrs http://cgjknvz.ru/?cgjknvz cgjknvz https://bcikmswz.net/?bcikmswz bcikmswz https://cginosuy.com/?cginosuy cginosuy 
http://bestz.ua/?bestz bestz https://dkmpry.net/?dkmpry dkmpry https://bfmpyz.ru/?bfmpyz bfmpyz https://dfglmp.ru/?dfglmp dfglmp https://cfgmnpru.ua/?cfgmnpru cfgmnpru https://cefhlns.ru/?cefhlns cefhlns http://bcelnosz.ua/?bcelnosz bcelnosz https://cehnsy.net/?cehnsy cehnsy https://ipuxyz.info/?ipuxyz ipuxyz https://dfghintv.net/?dfghintv dfghintv https://abdhkuxy.com/?abdhkuxy abdhkuxy https://abrsu.ua/?abrsu abrsu http://fimrstw.ru/?fimrstw fimrstw https://bhkmpvz.ua/?bhkmpvz bhkmpvz http://cefkqrx.info/?cefkqrx cefkqrx http://bdjkstvz.biz/?bdjkstvz bdjkstvz https://abegks.ua/?abegks abegks https://cfhostz.ru/?cfhostz cfhostz http://jnqsv.ua/?jnqsv jnqsv http://abcfjmps.net/?abcfjmps abcfjmps https://jlnptu.com/?jlnptu jlnptu https://bdghklns.ua/?bdghklns bdghklns https://bdikpu.ua/?bdikpu bdikpu http://hklpqs.ru/?hklpqs hklpqs https://bgjmpsy.ua/?bgjmpsy bgjmpsy http://cemnsvz.net/?cemnsvz cemnsvz http://cfhknu.info/?cfhknu cfhknu http://bcjmqsuz.ru/?bcjmqsuz bcjmqsuz http://fghjloqu.com/?fghjloqu fghjloqu https://lmryz.net/?lmryz lmryz http://imvwyz.net/?imvwyz imvwyz http://dhklntw.com/?dhklntw dhklntw http://adijnouy.info/?adijnouy adijnouy https://cdervx.info/?cdervx cdervx https://ajkqrtuw.ru/?ajkqrtuw ajkqrtuw http://diknqtu.biz/?diknqtu diknqtu http://hjnrux.ru/?hjnrux hjnrux https://ilmouxy.info/?ilmouxy ilmouxy http://adfijm.ua/?adfijm adfijm http://efgptx.ru/?efgptx efgptx https://dmprtvwy.com/?dmprtvwy dmprtvwy http://bgksuv.ua/?bgksuv bgksuv https://cfhsv.com/?cfhsv cfhsv http://adgoqr.biz/?adgoqr adgoqr http://efilmx.net/?efilmx efilmx http://eglpqrx.ua/?eglpqrx eglpqrx http://acequxz.com/?acequxz acequxz http://lmsvz.ua/?lmsvz lmsvz http://aefhprtw.com/?aefhprtw aefhprtw https://bglnq.info/?bglnq bglnq http://befhirv.ua/?befhirv befhirv https://achkryz.com/?achkryz achkryz https://bmpsvy.info/?bmpsvy bmpsvy http://ejloqy.net/?ejloqy ejloqy https://biopqz.net/?biopqz biopqz https://bfhklotx.biz/?bfhklotx bfhklotx https://dgjkqwy.ua/?dgjkqwy dgjkqwy https://afnpqrxz.net/?afnpqrxz afnpqrxz https://hjnpqrxz.com/?hjnpqrxz hjnpqrxz https://cimpuyz.net/?cimpuyz cimpuyz http://cruwz.biz/?cruwz cruwz https://bknovz.net/?bknovz bknovz http://dhilrtv.ua/?dhilrtv dhilrtv http://chjpqvyz.info/?chjpqvyz chjpqvyz https://achijmr.com/?achijmr achijmr https://chknosty.ru/?chknosty chknosty https://acgqv.com/?acgqv acgqv http://defklosy.info/?defklosy defklosy http://kmqrw.com/?kmqrw kmqrw https://npqtvz.ru/?npqtvz npqtvz http://cefkrux.ua/?cefkrux cefkrux https://aefhivx.ua/?aefhivx aefhivx https://dijlrtuy.ua/?dijlrtuy dijlrtuy https://aoqtw.com/?aoqtw aoqtw https://cehisuwz.net/?cehisuwz cehisuwz http://bfghijmr.biz/?bfghijmr bfghijmr http://beimquxy.ua/?beimquxy beimquxy http://dmsuw.info/?dmsuw dmsuw http://imnty.net/?imnty imnty https://detwz.net/?detwz detwz http://glnvx.net/?glnvx glnvx https://himnopw.ru/?himnopw himnopw https://aklnw.com/?aklnw aklnw http://bjnrwx.net/?bjnrwx bjnrwx http://defjlqvy.com/?defjlqvy defjlqvy http://dfimopvz.ua/?dfimopvz dfimopvz http://hmoquv.ru/?hmoquv hmoquv http://afkmq.net/?afkmq afkmq https://blnrwyz.info/?blnrwyz blnrwyz http://fklnrtvx.info/?fklnrtvx fklnrtvx https://fghnruz.biz/?fghnruz fghnruz https://bijqsu.ru/?bijqsu bijqsu http://ghoruxyz.ru/?ghoruxyz ghoruxyz http://acejknry.ru/?acejknry acejknry http://cehquvxy.net/?cehquvxy cehquvxy https://bfgkmoqs.com/?bfgkmoqs bfgkmoqs https://bchjqx.net/?bchjqx bchjqx http://bmoqux.biz/?bmoqux bmoqux https://aijklm.com/?aijklm aijklm https://clmptx.info/?clmptx clmptx https://dghmpqtz.net/?dghmpqtz 
dghmpqtz https://bfkrt.net/?bfkrt bfkrt http://egkqrtuw.biz/?egkqrtuw egkqrtuw http://hjlstvx.com/?hjlstvx hjlstvx http://jkoprv.com/?jkoprv jkoprv https://chnpstv.com/?chnpstv chnpstv https://eknptuw.ru/?eknptuw eknptuw https://jkmrst.net/?jkmrst jkmrst http://bfmnqrvz.net/?bfmnqrvz bfmnqrvz http://bcdhnrwy.ua/?bcdhnrwy bcdhnrwy https://bcdfgkty.ru/?bcdfgkty bcdfgkty http://cmpqv.info/?cmpqv cmpqv https://dhilsz.ua/?dhilsz dhilsz https://acekpsy.ru/?acekpsy acekpsy http://gmnpswx.biz/?gmnpswx gmnpswx https://cdinpr.ua/?cdinpr cdinpr http://cdhjnorz.info/?cdhjnorz cdhjnorz https://behjk.com/?behjk behjk https://lmpstv.com/?lmpstv lmpstv https://ilmory.ua/?ilmory ilmory https://abiortz.ua/?abiortz abiortz http://bceotz.biz/?bceotz bceotz https://ghjlqt.info/?ghjlqt ghjlqt https://hmosyz.ua/?hmosyz hmosyz https://behmoqru.net/?behmoqru behmoqru https://chinpx.info/?chinpx chinpx https://fkntuvyz.info/?fkntuvyz fkntuvyz http://cfghimtw.ua/?cfghimtw cfghimtw https://fqstwz.com/?fqstwz fqstwz http://gimvw.biz/?gimvw gimvw https://degixy.ua/?degixy degixy http://fnvwy.com/?fnvwy fnvwy https://fikmstyz.ru/?fikmstyz fikmstyz http://chmopuwy.info/?chmopuwy chmopuwy http://cdfovy.ua/?cdfovy cdfovy http://acdfhjkr.ua/?acdfhjkr acdfhjkr https://bcfktw.biz/?bcfktw bcfktw https://fkmoqux.biz/?fkmoqux fkmoqux http://adghjox.ru/?adghjox adghjox https://dlnortv.info/?dlnortv dlnortv http://ijlpsuw.ru/?ijlpsuw ijlpsuw https://cgijqruw.info/?cgijqruw cgijqruw https://eisxy.ru/?eisxy eisxy http://dhjkmqvw.net/?dhjkmqvw dhjkmqvw https://bfgklnu.info/?bfgklnu bfgklnu http://bcfil.info/?bcfil bcfil https://degitv.com/?degitv degitv https://ghjlmr.net/?ghjlmr ghjlmr https://ceilv.ru/?ceilv ceilv http://gjknrtw.ua/?gjknrtw gjknrtw http://fotvwyz.net/?fotvwyz fotvwyz http://dgklquv.ua/?dgklquv dgklquv http://cghijnv.info/?cghijnv cghijnv https://aelmoy.com/?aelmoy aelmoy http://begilmnx.com/?begilmnx begilmnx http://bcegrtxz.ua/?bcegrtxz bcegrtxz https://bhkqs.com/?bhkqs bhkqs https://cefgpstu.ua/?cefgpstu cefgpstu http://gmntv.com/?gmntv gmntv http://afjnstuy.ua/?afjnstuy afjnstuy https://aegqwy.biz/?aegqwy aegqwy http://ioswz.biz/?ioswz ioswz https://gorvw.info/?gorvw gorvw http://cgnouwy.ua/?cgnouwy cgnouwy http://bchkpvy.info/?bchkpvy bchkpvy http://egijl.ua/?egijl egijl https://gorstuxz.net/?gorstuxz gorstuxz https://cdhjnopu.biz/?cdhjnopu cdhjnopu http://bceijoy.ru/?bceijoy bceijoy https://abfgjqw.ua/?abfgjqw abfgjqw http://bdeltw.net/?bdeltw bdeltw http://jktwz.net/?jktwz jktwz https://bgikloqy.info/?bgikloqy bgikloqy https://eimqrsuw.biz/?eimqrsuw eimqrsuw http://cfgkpu.net/?cfgkpu cfgkpu http://efmqrtv.com/?efmqrtv efmqrtv http://cegikmru.info/?cegikmru cegikmru http://hiprsv.ua/?hiprsv hiprsv https://efiorw.com/?efiorw efiorw https://ehkpy.ru/?ehkpy ehkpy http://acdjps.biz/?acdjps acdjps http://lmnopvxy.net/?lmnopvxy lmnopvxy http://abeptuy.net/?abeptuy abeptuy http://fimnoq.ua/?fimnoq fimnoq http://adefky.com/?adefky adefky https://cdfijlrz.ru/?cdfijlrz cdfijlrz https://dimnv.ru/?dimnv dimnv http://bcehz.com/?bcehz bcehz http://ahkpruvw.ru/?ahkpruvw ahkpruvw https://afnrw.com/?afnrw afnrw https://bclnpqtv.biz/?bclnpqtv bclnpqtv https://hmptuy.ua/?hmptuy hmptuy https://adfkx.com/?adfkx adfkx https://cdfgikp.com/?cdfgikp cdfgikp https://hjovwxz.net/?hjovwxz hjovwxz http://enoprsxz.ru/?enoprsxz enoprsxz https://dmryz.info/?dmryz dmryz http://ijopuwx.info/?ijopuwx ijopuwx https://dghoryz.ua/?dghoryz dghoryz https://gnpqsvy.biz/?gnpqsvy gnpqsvy https://jlmtwxy.ua/?jlmtwxy jlmtwxy http://clmpuvz.com/?clmpuvz 
clmpuvz https://gprtvx.ua/?gprtvx gprtvx http://bklptuxz.net/?bklptuxz bklptuxz https://bfknrvz.com/?bfknrvz bfknrvz http://dgkqrv.ua/?dgkqrv dgkqrv https://adkprstv.net/?adkprstv adkprstv https://afiqx.com/?afiqx afiqx https://bchiknpt.com/?bchiknpt bchiknpt http://hjklprx.biz/?hjklprx hjklprx http://bejoruw.net/?bejoruw bejoruw http://cegmnvz.biz/?cegmnvz cegmnvz http://gmnotyz.ua/?gmnotyz gmnotyz https://dfhikntw.net/?dfhikntw dfhikntw https://dfouv.biz/?dfouv dfouv https://cjlnotz.info/?cjlnotz cjlnotz https://denqtvwz.ua/?denqtvwz denqtvwz http://behlrwy.com/?behlrwy behlrwy http://drsuvxz.biz/?drsuvxz drsuvxz https://esuwyz.biz/?esuwyz esuwyz http://fhjntu.biz/?fhjntu fhjntu https://gqrswyz.biz/?gqrswyz gqrswyz https://mpswxz.biz/?mpswxz mpswxz http://dhjpsuv.net/?dhjpsuv dhjpsuv http://dfopruy.ru/?dfopruy dfopruy https://ehkloqsz.info/?ehkloqsz ehkloqsz https://adgjqru.biz/?adgjqru adgjqru https://ejmtvwz.biz/?ejmtvwz ejmtvwz http://fgoquw.net/?fgoquw fgoquw https://fhikqry.com/?fhikqry fhikqry https://cemqy.com/?cemqy cemqy http://otuxz.com/?otuxz otuxz http://fijlmnoq.info/?fijlmnoq fijlmnoq http://cgiouw.ru/?cgiouw cgiouw https://cdgijlnw.biz/?cdgijlnw cdgijlnw https://gmuvw.biz/?gmuvw gmuvw https://abdefgiu.info/?abdefgiu abdefgiu https://gkmqwy.info/?gkmqwy gkmqwy http://bfgkqstw.com/?bfgkqstw bfgkqstw https://bfgmnovx.net/?bfgmnovx bfgmnovx https://bkmnrsy.com/?bkmnrsy bkmnrsy http://chiuwx.ua/?chiuwx chiuwx http://chlvw.ua/?chlvw chlvw https://acfgilp.ru/?acfgilp acfgilp http://bghosv.com/?bghosv bghosv https://befgkpst.net/?befgkpst befgkpst http://jklmnwz.net/?jklmnwz jklmnwz https://dktxz.com/?dktxz dktxz http://aeknuwyz.ru/?aeknuwyz aeknuwyz https://aqtuv.ua/?aqtuv aqtuv http://bfjorw.info/?bfjorw bfjorw http://cfijuz.biz/?cfijuz cfijuz https://efjlnuv.net/?efjlnuv efjlnuv https://hkruvy.net/?hkruvy hkruvy http://defnswxy.biz/?defnswxy defnswxy https://fgjmqxy.biz/?fgjmqxy fgjmqxy https://deflmtu.ru/?deflmtu deflmtu https://cgmnr.info/?cgmnr cgmnr https://dgilnsvy.ua/?dgilnsvy dgilnsvy https://chjmosxy.biz/?chjmosxy chjmosxy https://abcdhln.net/?abcdhln abcdhln https://elstuyz.biz/?elstuyz elstuyz https://bjoptw.com/?bjoptw bjoptw https://bcdgquv.ru/?bcdgquv bcdgquv http://afiklst.com/?afiklst afiklst https://agnvwz.info/?agnvwz agnvwz http://bcfgkrvx.biz/?bcfgkrvx bcfgkrvx https://bdjmnuz.biz/?bdjmnuz bdjmnuz https://emqrwz.com/?emqrwz emqrwz https://egklqt.biz/?egklqt egklqt http://bimqr.com/?bimqr bimqr https://dhklnsv.net/?dhklnsv dhklnsv https://dvwxz.com/?dvwxz dvwxz https://dfghimu.com/?dfghimu dfghimu https://fostx.ru/?fostx fostx https://fjklnqux.ua/?fjklnqux fjklnqux https://cfhijnqu.info/?cfhijnqu cfhijnqu http://nptvy.ua/?nptvy nptvy http://bclnu.com/?bclnu bclnu http://eflntvyz.ru/?eflntvyz eflntvyz https://abcdpsx.ru/?abcdpsx abcdpsx https://cfipqx.info/?cfipqx cfipqx http://adforst.biz/?adforst adforst https://bfgmpqtv.biz/?bfgmpqtv bfgmpqtv http://cfkmnorv.ua/?cfkmnorv cfkmnorv http://dhjkuv.info/?dhjkuv dhjkuv https://bcejlsyz.info/?bcejlsyz bcejlsyz https://efinrsy.ua/?efinrsy efinrsy https://kmosw.net/?kmosw kmosw http://jmoxy.net/?jmoxy jmoxy http://agjloqy.biz/?agjloqy agjloqy https://dglmnpqy.com/?dglmnpqy dglmnpqy https://dijnqrsw.ua/?dijnqrsw dijnqrsw http://ijmnr.biz/?ijmnr ijmnr https://gjouwxy.ru/?gjouwxy gjouwxy https://klpuz.biz/?klpuz klpuz http://hijoux.net/?hijoux hijoux http://bikmn.net/?bikmn bikmn http://abcovwy.ua/?abcovwy abcovwy https://adfgioqx.ua/?adfgioqx adfgioqx http://beijknpq.info/?beijknpq beijknpq https://equvx.com/?equvx 
equvx https://cgmpr.net/?cgmpr cgmpr http://dfknor.com/?dfknor dfknor https://gopsvyz.ru/?gopsvyz gopsvyz http://chijkqsu.net/?chijkqsu chijkqsu https://bfmtx.biz/?bfmtx bfmtx http://cdfijlor.info/?cdfijlor cdfijlor https://cjkpruyz.biz/?cjkpruyz cjkpruyz https://cemty.net/?cemty cemty https://ceghksuy.info/?ceghksuy ceghksuy http://fhlpt.com/?fhlpt fhlpt http://fkmpqt.biz/?fkmpqt fkmpqt https://jkrvz.net/?jkrvz jkrvz https://ajmnostw.biz/?ajmnostw ajmnostw https://bfgjkmo.info/?bfgjkmo bfgjkmo http://cekrsuz.net/?cekrsuz cekrsuz http://cikopsu.info/?cikopsu cikopsu http://filmtvyz.com/?filmtvyz filmtvyz https://alnoqrt.net/?alnoqrt alnoqrt https://dfqtw.ru/?dfqtw dfqtw https://ceijnu.ru/?ceijnu ceijnu https://belmpr.info/?belmpr belmpr https://bckloruy.info/?bckloruy bckloruy https://diktwy.ua/?diktwy diktwy https://kqswx.ru/?kqswx kqswx https://gijlvz.net/?gijlvz gijlvz https://aehkst.biz/?aehkst aehkst http://cdelmn.info/?cdelmn cdelmn http://jpwxz.ua/?jpwxz jpwxz https://cdjlptxz.ru/?cdjlptxz cdjlptxz https://dejlqry.com/?dejlqry dejlqry https://abhtx.info/?abhtx abhtx https://degjmprv.ua/?degjmprv degjmprv https://dehoqw.net/?dehoqw dehoqw https://cdeovwx.biz/?cdeovwx cdeovwx http://dfgmnqrt.com/?dfgmnqrt dfgmnqrt http://aghimouw.ru/?aghimouw aghimouw https://fhjopqt.com/?fhjopqt fhjopqt https://chimwy.biz/?chimwy chimwy http://ahmpq.com/?ahmpq ahmpq http://ilpqrtuw.com/?ilpqrtuw ilpqrtuw https://gnoqvxz.com/?gnoqvxz gnoqvxz https://hijkmsu.ru/?hijkmsu hijkmsu http://cistxyz.biz/?cistxyz cistxyz https://adklmns.ru/?adklmns adklmns https://hmqrwy.ua/?hmqrwy hmqrwy https://crsuv.ru/?crsuv crsuv http://abejkmoz.ua/?abejkmoz abejkmoz http://abjpqu.ua/?abjpqu abjpqu https://ehjkopuz.info/?ehjkopuz ehjkopuz http://iovwz.ua/?iovwz iovwz https://emntvwx.net/?emntvwx emntvwx http://eghksx.info/?eghksx eghksx https://bmnps.com/?bmnps bmnps https://cinopuv.net/?cinopuv cinopuv http://efijnptz.biz/?efijnptz efijnptz http://cehilnp.net/?cehilnp cehilnp http://denprw.ru/?denprw denprw http://adglpwz.net/?adglpwz adglpwz http://dehimosu.biz/?dehimosu dehimosu https://abchprwy.com/?abchprwy abchprwy http://dlnory.info/?dlnory dlnory http://cehnwyz.com/?cehnwyz cehnwyz http://aghlopy.ua/?aghlopy aghlopy http://fhjksuvz.com/?fhjksuvz fhjksuvz http://bijmy.com/?bijmy bijmy https://adhjw.com/?adhjw adhjw https://emnprwyz.info/?emnprwyz emnprwyz http://dgmnw.ru/?dgmnw dgmnw https://cdgimruv.com/?cdgimruv cdgimruv http://bcdjqst.biz/?bcdjqst bcdjqst https://lmquv.ru/?lmquv lmquv http://degiptv.info/?degiptv degiptv http://hlnvy.ru/?hlnvy hlnvy http://gklmvx.info/?gklmvx gklmvx http://behlpstz.com/?behlpstz behlpstz https://efginz.com/?efginz efginz https://deilmnru.biz/?deilmnru deilmnru http://bdehoqux.ru/?bdehoqux bdehoqux http://doqvy.info/?doqvy doqvy http://jlorw.info/?jlorw jlorw https://cfmtvy.net/?cfmtvy cfmtvy http://agkmqvxy.biz/?agkmqvxy agkmqvxy https://ehnrux.ru/?ehnrux ehnrux https://ahlmpyz.biz/?ahlmpyz ahlmpyz https://dhilqxyz.com/?dhilqxyz dhilqxyz http://beosz.biz/?beosz beosz http://jkmtvx.ru/?jkmtvx jkmtvx http://aefiuw.ua/?aefiuw aefiuw http://bgimy.info/?bgimy bgimy https://acdfhimy.info/?acdfhimy acdfhimy http://bdfpv.info/?bdfpv bdfpv https://lnopux.info/?lnopux lnopux https://iknqstux.ua/?iknqstux iknqstux https://binxy.ru/?binxy binxy http://abflsv.net/?abflsv abflsv http://cghkmtvz.info/?cghkmtvz cghkmtvz http://flpqrxz.biz/?flpqrxz flpqrxz https://cfgnvz.ua/?cfgnvz cfgnvz https://jtuvx.com/?jtuvx jtuvx https://fgikl.ru/?fgikl fgikl http://ehikrs.ru/?ehikrs ehikrs 
http://beghqx.ru/?beghqx beghqx http://dhijlrsw.com/?dhijlrsw dhijlrsw http://elqst.ru/?elqst elqst https://abfklmns.biz/?abfklmns abfklmns https://cilnorw.ua/?cilnorw cilnorw https://anoqu.ua/?anoqu anoqu http://adikquz.com/?adikquz adikquz http://abcpwyz.ua/?abcpwyz abcpwyz https://fiqwxz.ru/?fiqwxz fiqwxz http://bcivw.ru/?bcivw bcivw https://acdfnrz.ua/?acdfnrz acdfnrz https://jmopqtw.info/?jmopqtw jmopqtw https://abdeghtv.biz/?abdeghtv abdeghtv http://aceoprw.net/?aceoprw aceoprw http://forvy.ua/?forvy forvy https://abgjknpv.com/?abgjknpv abgjknpv http://bcdgimr.info/?bcdgimr bcdgimr https://flntvw.biz/?flntvw flntvw http://bcdnqvx.net/?bcdnqvx bcdnqvx http://defjnqvw.com/?defjnqvw defjnqvw http://ekmouvwy.biz/?ekmouvwy ekmouvwy http://cijpqsxz.biz/?cijpqsxz cijpqsxz http://dopstyz.ua/?dopstyz dopstyz http://cinow.com/?cinow cinow http://cdejmrs.info/?cdejmrs cdejmrs http://fgknprwz.info/?fgknprwz fgknprwz https://binpquxy.com/?binpquxy binpquxy https://afhqrtu.biz/?afhqrtu afhqrtu http://cemqtv.ru/?cemqtv cemqtv http://bgnqsz.info/?bgnqsz bgnqsz http://cdfjprsv.ua/?cdfjprsv cdfjprsv https://jmqstz.biz/?jmqstz jmqstz http://adhuw.info/?adhuw adhuw http://deswz.ua/?deswz deswz https://ehkmnw.com/?ehkmnw ehkmnw http://kmsxy.ua/?kmsxy kmsxy http://benotvw.info/?benotvw benotvw https://dhinops.biz/?dhinops dhinops https://hklnpqtv.ru/?hklnpqtv hklnpqtv https://gjnqrtwx.info/?gjnqrtwx gjnqrtwx https://dortvw.biz/?dortvw dortvw http://cdghjp.com/?cdghjp cdghjp http://abchnoqx.com/?abchnoqx abchnoqx http://abnsz.biz/?abnsz abnsz https://acdnqwy.info/?acdnqwy acdnqwy http://befilpv.com/?befilpv befilpv http://fijrux.biz/?fijrux fijrux https://aknqrtuz.ru/?aknqrtuz aknqrtuz http://abfoqwz.ru/?abfoqwz abfoqwz https://defikv.info/?defikv defikv http://abcoqtz.com/?abcoqtz abcoqtz https://dfirxz.net/?dfirxz dfirxz http://bdfipqux.net/?bdfipqux bdfipqux http://bcgqx.net/?bcgqx bcgqx https://bdfhl.ua/?bdfhl bdfhl http://ahikmpqt.info/?ahikmpqt ahikmpqt https://aghqrt.com/?aghqrt aghqrt https://afhosxz.ru/?afhosxz afhosxz https://ehsvx.ru/?ehsvx ehsvx http://gknquvw.info/?gknquvw gknquvw https://gilmpqw.ru/?gilmpqw gilmpqw http://fhopy.biz/?fhopy fhopy https://bcmpux.net/?bcmpux bcmpux https://adfmrsuy.info/?adfmrsuy adfmrsuy https://fmtuvx.net/?fmtuvx fmtuvx https://defjmswy.net/?defjmswy defjmswy https://ahijmrsz.info/?ahijmrsz ahijmrsz https://cerwx.biz/?cerwx cerwx https://behqz.biz/?behqz behqz https://behioqyz.ru/?behioqyz behioqyz https://admrsvxz.com/?admrsvxz admrsvxz https://cefhipqw.com/?cefhipqw cefhipqw http://deopsx.ru/?deopsx deopsx http://acfgksz.info/?acfgksz acfgksz http://hjlmqst.info/?hjlmqst hjlmqst http://efgnrsux.net/?efgnrsux efgnrsux http://adflovz.info/?adflovz adflovz http://acopu.ua/?acopu acopu https://chilnqrs.ru/?chilnqrs chilnqrs https://blosy.net/?blosy blosy https://gijnpuxy.com/?gijnpuxy gijnpuxy https://bcemuvwz.ua/?bcemuvwz bcemuvwz http://bdfqwx.net/?bdfqwx bdfqwx http://dfghilr.net/?dfghilr dfghilr http://bdlqstv.com/?bdlqstv bdlqstv http://adhjnrsz.com/?adhjnrsz adhjnrsz http://adiopqx.com/?adiopqx adiopqx http://ehnoqw.info/?ehnoqw ehnoqw https://dgpsx.net/?dgpsx dgpsx http://afgrt.ru/?afgrt afgrt http://bghipuvz.ua/?bghipuvz bghipuvz http://egjmvw.com/?egjmvw egjmvw https://eginoswz.com/?eginoswz eginoswz http://bekuv.info/?bekuv bekuv http://bglno.info/?bglno bglno http://hkoptxz.com/?hkoptxz hkoptxz https://aikqz.biz/?aikqz aikqz https://oqvxy.net/?oqvxy oqvxy http://kruwy.net/?kruwy kruwy http://fhlqty.net/?fhlqty fhlqty http://chpuvyz.ru/?chpuvyz chpuvyz 
https://aciloqt.ua/?aciloqt aciloqt http://cklsw.info/?cklsw cklsw https://fghkmvy.info/?fghkmvy fghkmvy http://belmpvz.biz/?belmpvz belmpvz http://eimpxz.ru/?eimpxz eimpxz http://cdjstuz.info/?cdjstuz cdjstuz https://adivy.com/?adivy adivy https://alnrty.ru/?alnrty alnrty https://hkmntz.info/?hkmntz hkmntz https://dkpqr.com/?dkpqr dkpqr https://bfntv.ru/?bfntv bfntv https://inuvwyz.ru/?inuvwyz inuvwyz https://jkmrvyz.ua/?jkmrvyz jkmrvyz http://bfipruw.biz/?bfipruw bfipruw http://cjklos.ua/?cjklos cjklos http://afglru.ru/?afglru afglru http://defrt.ua/?defrt defrt https://afghsvy.ua/?afghsvy afghsvy http://cehiou.info/?cehiou cehiou https://mnrvz.com/?mnrvz mnrvz http://fgtuxz.biz/?fgtuxz fgtuxz http://egiqtuy.ua/?egiqtuy egiqtuy https://dilnv.ru/?dilnv dilnv http://abeny.biz/?abeny abeny http://aijvxz.biz/?aijvxz aijvxz https://abdhrsx.net/?abdhrsx abdhrsx http://bfgmrxy.ru/?bfgmrxy bfgmrxy https://fjkow.info/?fjkow fjkow https://fkruvx.biz/?fkruvx fkruvx http://efhjsw.biz/?efhjsw efhjsw http://djkmops.info/?djkmops djkmops https://cequyz.ru/?cequyz cequyz https://hmswz.com/?hmswz hmswz http://lnotu.info/?lnotu lnotu http://bdjqu.info/?bdjqu bdjqu http://adejrxy.ua/?adejrxy adejrxy http://cdgikmps.ru/?cdgikmps cdgikmps http://dhmostwy.net/?dhmostwy dhmostwy https://cdhlquz.ua/?cdhlquz cdhlquz https://bilnor.ua/?bilnor bilnor https://fnqxz.com/?fnqxz fnqxz https://bijnuwz.com/?bijnuwz bijnuwz http://abcgikwy.ua/?abcgikwy abcgikwy https://begimrv.ua/?begimrv begimrv https://jlprv.net/?jlprv jlprv http://efhsz.ua/?efhsz efhsz https://dgiqs.ua/?dgiqs dgiqs http://bgiprx.net/?bgiprx bgiprx https://cdklmqv.biz/?cdklmqv cdklmqv https://adgnowz.ua/?adgnowz adgnowz http://cfjnorty.ru/?cfjnorty cfjnorty https://bcfhpqvy.ru/?bcfhpqvy bcfhpqvy http://beiqv.biz/?beiqv beiqv https://bjmnu.com/?bjmnu bjmnu https://fgjoquvz.ru/?fgjoquvz fgjoquvz http://fkpwx.info/?fkpwx fkpwx http://dfghty.com/?dfghty dfghty https://dhlmtvxy.biz/?dhlmtvxy dhlmtvxy https://bejuy.com/?bejuy bejuy http://cdikry.com/?cdikry cdikry https://bfhkrty.info/?bfhkrty bfhkrty https://cekpst.com/?cekpst cekpst https://ceknpx.ru/?ceknpx ceknpx https://gjpwy.com/?gjpwy gjpwy https://ceouwxyz.ru/?ceouwxyz ceouwxyz https://ijkopstu.ua/?ijkopstu ijkopstu https://acegjlmo.net/?acegjlmo acegjlmo http://bflnpv.info/?bflnpv bflnpv http://fijkotz.ua/?fijkotz fijkotz http://ajkptwyz.com/?ajkptwyz ajkptwyz http://adjpvw.net/?adjpvw adjpvw https://dpqrsv.biz/?dpqrsv dpqrsv https://acehklps.ua/?acehklps acehklps http://aghsuvyz.biz/?aghsuvyz aghsuvyz https://akotu.ru/?akotu akotu https://dkqrt.biz/?dkqrt dkqrt http://dlrtuxz.info/?dlrtuxz dlrtuxz http://fstvx.ua/?fstvx fstvx http://ejmsv.ua/?ejmsv ejmsv http://abfmyz.com/?abfmyz abfmyz http://cdiqrsvx.info/?cdiqrsvx cdiqrsvx https://fqsuz.info/?fqsuz fqsuz https://cilsu.info/?cilsu cilsu https://bjklnsxz.net/?bjklnsxz bjklnsxz https://demnpst.biz/?demnpst demnpst https://bghikms.biz/?bghikms bghikms http://achjtu.biz/?achjtu achjtu http://npqrwz.net/?npqrwz npqrwz http://adehilpq.net/?adehilpq adehilpq http://bnortwy.com/?bnortwy bnortwy http://cegpsuz.ru/?cegpsuz cegpsuz http://aksyz.ua/?aksyz aksyz http://egkmosux.biz/?egkmosux egkmosux https://dimoruy.com/?dimoruy dimoruy https://ehopyz.net/?ehopyz ehopyz https://ehijptuy.com/?ehijptuy ehijptuy http://jklmprtv.ua/?jklmprtv jklmprtv http://abmnrw.ru/?abmnrw abmnrw http://abcfqu.ru/?abcfqu abcfqu https://degijyz.ru/?degijyz degijyz https://cikortuy.net/?cikortuy cikortuy http://abgnrw.ua/?abgnrw abgnrw http://bfgknpqt.com/?bfgknpqt bfgknpqt 
https://afmqtv.ru/?afmqtv afmqtv https://bertuvwz.info/?bertuvwz bertuvwz https://eipuvwxy.com/?eipuvwxy eipuvwxy http://ahjory.info/?ahjory ahjory http://acefvx.com/?acefvx acefvx http://acdlo.com/?acdlo acdlo http://cdjqs.biz/?cdjqs cdjqs http://bdexy.net/?bdexy bdexy http://bktxz.info/?bktxz bktxz http://adegmovy.biz/?adegmovy adegmovy http://fkxyz.ua/?fkxyz fkxyz http://bdikny.biz/?bdikny bdikny https://cistv.biz/?cistv cistv https://behknqtz.biz/?behknqtz behknqtz http://efhjquwx.biz/?efhjquwx efhjquwx https://djmnosz.com/?djmnosz djmnosz http://adino.com/?adino adino http://djlmouv.ua/?djlmouv djlmouv http://dehrvy.biz/?dehrvy dehrvy http://cefijlu.com/?cefijlu cefijlu http://bfgitz.ua/?bfgitz bfgitz http://ilmqsz.ua/?ilmqsz ilmqsz http://aejnrwy.ru/?aejnrwy aejnrwy https://lnuwz.ru/?lnuwz lnuwz http://jlnrv.com/?jlnrv jlnrv https://bjstxz.ru/?bjstxz bjstxz https://fhjnw.com/?fhjnw fhjnw https://ajmnuz.com/?ajmnuz ajmnuz https://ghptxy.net/?ghptxy ghptxy https://aceirsxy.ua/?aceirsxy aceirsxy http://bcemosuw.info/?bcemosuw bcemosuw https://cgklpxy.ua/?cgklpxy cgklpxy https://deghijrs.net/?deghijrs deghijrs http://cgjknvz.ru/?cgjknvz cgjknvz https://bcikmswz.net/?bcikmswz bcikmswz https://cginosuy.com/?cginosuy cginosuy http://bestz.ua/?bestz bestz https://dkmpry.net/?dkmpry dkmpry https://bfmpyz.ru/?bfmpyz bfmpyz https://dfglmp.ru/?dfglmp dfglmp https://cfgmnpru.ua/?cfgmnpru cfgmnpru https://cefhlns.ru/?cefhlns cefhlns http://bcelnosz.ua/?bcelnosz bcelnosz https://cehnsy.net/?cehnsy cehnsy https://ipuxyz.info/?ipuxyz ipuxyz https://dfghintv.net/?dfghintv dfghintv https://abdhkuxy.com/?abdhkuxy abdhkuxy https://abrsu.ua/?abrsu abrsu http://fimrstw.ru/?fimrstw fimrstw https://bhkmpvz.ua/?bhkmpvz bhkmpvz http://cefkqrx.info/?cefkqrx cefkqrx http://bdjkstvz.biz/?bdjkstvz bdjkstvz https://abegks.ua/?abegks abegks https://cfhostz.ru/?cfhostz cfhostz http://jnqsv.ua/?jnqsv jnqsv http://abcfjmps.net/?abcfjmps abcfjmps https://jlnptu.com/?jlnptu jlnptu https://bdghklns.ua/?bdghklns bdghklns https://bdikpu.ua/?bdikpu bdikpu http://hklpqs.ru/?hklpqs hklpqs https://bgjmpsy.ua/?bgjmpsy bgjmpsy http://cemnsvz.net/?cemnsvz cemnsvz http://cfhknu.info/?cfhknu cfhknu http://bcjmqsuz.ru/?bcjmqsuz bcjmqsuz http://fghjloqu.com/?fghjloqu fghjloqu https://lmryz.net/?lmryz lmryz http://imvwyz.net/?imvwyz imvwyz http://dhklntw.com/?dhklntw dhklntw http://adijnouy.info/?adijnouy adijnouy https://cdervx.info/?cdervx cdervx https://ajkqrtuw.ru/?ajkqrtuw ajkqrtuw http://diknqtu.biz/?diknqtu diknqtu http://hjnrux.ru/?hjnrux hjnrux https://ilmouxy.info/?ilmouxy ilmouxy http://adfijm.ua/?adfijm adfijm http://efgptx.ru/?efgptx efgptx https://dmprtvwy.com/?dmprtvwy dmprtvwy http://bgksuv.ua/?bgksuv bgksuv https://cfhsv.com/?cfhsv cfhsv http://adgoqr.biz/?adgoqr adgoqr http://efilmx.net/?efilmx efilmx http://eglpqrx.ua/?eglpqrx eglpqrx http://acequxz.com/?acequxz acequxz http://lmsvz.ua/?lmsvz lmsvz http://aefhprtw.com/?aefhprtw aefhprtw https://bglnq.info/?bglnq bglnq http://befhirv.ua/?befhirv befhirv https://achkryz.com/?achkryz achkryz https://bmpsvy.info/?bmpsvy bmpsvy http://ejloqy.net/?ejloqy ejloqy https://biopqz.net/?biopqz biopqz https://bfhklotx.biz/?bfhklotx bfhklotx https://dgjkqwy.ua/?dgjkqwy dgjkqwy https://afnpqrxz.net/?afnpqrxz afnpqrxz https://hjnpqrxz.com/?hjnpqrxz hjnpqrxz https://cimpuyz.net/?cimpuyz cimpuyz http://cruwz.biz/?cruwz cruwz https://bknovz.net/?bknovz bknovz http://dhilrtv.ua/?dhilrtv dhilrtv http://chjpqvyz.info/?chjpqvyz chjpqvyz https://achijmr.com/?achijmr achijmr 
https://chknosty.ru/?chknosty chknosty https://acgqv.com/?acgqv acgqv http://defklosy.info/?defklosy defklosy http://kmqrw.com/?kmqrw kmqrw https://npqtvz.ru/?npqtvz npqtvz http://cefkrux.ua/?cefkrux cefkrux https://aefhivx.ua/?aefhivx aefhivx https://dijlrtuy.ua/?dijlrtuy dijlrtuy https://aoqtw.com/?aoqtw aoqtw https://cehisuwz.net/?cehisuwz cehisuwz http://bfghijmr.biz/?bfghijmr bfghijmr http://beimquxy.ua/?beimquxy beimquxy http://dmsuw.info/?dmsuw dmsuw http://imnty.net/?imnty imnty https://detwz.net/?detwz detwz http://glnvx.net/?glnvx glnvx https://himnopw.ru/?himnopw himnopw https://aklnw.com/?aklnw aklnw http://bjnrwx.net/?bjnrwx bjnrwx http://defjlqvy.com/?defjlqvy defjlqvy http://dfimopvz.ua/?dfimopvz dfimopvz http://hmoquv.ru/?hmoquv hmoquv http://afkmq.net/?afkmq afkmq https://blnrwyz.info/?blnrwyz blnrwyz http://fklnrtvx.info/?fklnrtvx fklnrtvx https://fghnruz.biz/?fghnruz fghnruz https://bijqsu.ru/?bijqsu bijqsu http://ghoruxyz.ru/?ghoruxyz ghoruxyz http://acejknry.ru/?acejknry acejknry http://cehquvxy.net/?cehquvxy cehquvxy https://bfgkmoqs.com/?bfgkmoqs bfgkmoqs https://bchjqx.net/?bchjqx bchjqx http://bmoqux.biz/?bmoqux bmoqux https://aijklm.com/?aijklm aijklm https://clmptx.info/?clmptx clmptx https://dghmpqtz.net/?dghmpqtz dghmpqtz https://bfkrt.net/?bfkrt bfkrt http://egkqrtuw.biz/?egkqrtuw egkqrtuw http://hjlstvx.com/?hjlstvx hjlstvx http://jkoprv.com/?jkoprv jkoprv https://chnpstv.com/?chnpstv chnpstv https://eknptuw.ru/?eknptuw eknptuw https://jkmrst.net/?jkmrst jkmrst http://bfmnqrvz.net/?bfmnqrvz bfmnqrvz http://bcdhnrwy.ua/?bcdhnrwy bcdhnrwy https://bcdfgkty.ru/?bcdfgkty bcdfgkty http://cmpqv.info/?cmpqv cmpqv https://dhilsz.ua/?dhilsz dhilsz https://acekpsy.ru/?acekpsy acekpsy http://gmnpswx.biz/?gmnpswx gmnpswx https://cdinpr.ua/?cdinpr cdinpr http://cdhjnorz.info/?cdhjnorz cdhjnorz https://behjk.com/?behjk behjk https://lmpstv.com/?lmpstv lmpstv https://ilmory.ua/?ilmory ilmory https://abiortz.ua/?abiortz abiortz http://bceotz.biz/?bceotz bceotz https://ghjlqt.info/?ghjlqt ghjlqt https://hmosyz.ua/?hmosyz hmosyz https://behmoqru.net/?behmoqru behmoqru https://chinpx.info/?chinpx chinpx https://fkntuvyz.info/?fkntuvyz fkntuvyz http://cfghimtw.ua/?cfghimtw cfghimtw https://fqstwz.com/?fqstwz fqstwz http://gimvw.biz/?gimvw gimvw https://degixy.ua/?degixy degixy http://fnvwy.com/?fnvwy fnvwy https://fikmstyz.ru/?fikmstyz fikmstyz http://chmopuwy.info/?chmopuwy chmopuwy http://cdfovy.ua/?cdfovy cdfovy http://acdfhjkr.ua/?acdfhjkr acdfhjkr https://bcfktw.biz/?bcfktw bcfktw https://fkmoqux.biz/?fkmoqux fkmoqux http://adghjox.ru/?adghjox adghjox https://dlnortv.info/?dlnortv dlnortv http://ijlpsuw.ru/?ijlpsuw ijlpsuw https://cgijqruw.info/?cgijqruw cgijqruw https://eisxy.ru/?eisxy eisxy http://dhjkmqvw.net/?dhjkmqvw dhjkmqvw https://bfgklnu.info/?bfgklnu bfgklnu http://bcfil.info/?bcfil bcfil https://degitv.com/?degitv degitv https://ghjlmr.net/?ghjlmr ghjlmr https://ceilv.ru/?ceilv ceilv http://gjknrtw.ua/?gjknrtw gjknrtw http://fotvwyz.net/?fotvwyz fotvwyz http://dgklquv.ua/?dgklquv dgklquv http://cghijnv.info/?cghijnv cghijnv https://aelmoy.com/?aelmoy aelmoy http://begilmnx.com/?begilmnx begilmnx http://bcegrtxz.ua/?bcegrtxz bcegrtxz https://bhkqs.com/?bhkqs bhkqs https://cefgpstu.ua/?cefgpstu cefgpstu http://gmntv.com/?gmntv gmntv http://afjnstuy.ua/?afjnstuy afjnstuy https://aegqwy.biz/?aegqwy aegqwy http://ioswz.biz/?ioswz ioswz https://gorvw.info/?gorvw gorvw http://cgnouwy.ua/?cgnouwy cgnouwy http://bchkpvy.info/?bchkpvy bchkpvy http://egijl.ua/?egijl egijl 
https://gorstuxz.net/?gorstuxz gorstuxz https://cdhjnopu.biz/?cdhjnopu cdhjnopu http://bceijoy.ru/?bceijoy bceijoy https://abfgjqw.ua/?abfgjqw abfgjqw http://bdeltw.net/?bdeltw bdeltw http://jktwz.net/?jktwz jktwz https://bgikloqy.info/?bgikloqy bgikloqy https://eimqrsuw.biz/?eimqrsuw eimqrsuw http://cfgkpu.net/?cfgkpu cfgkpu http://efmqrtv.com/?efmqrtv efmqrtv http://cegikmru.info/?cegikmru cegikmru http://hiprsv.ua/?hiprsv hiprsv https://efiorw.com/?efiorw efiorw https://ehkpy.ru/?ehkpy ehkpy http://acdjps.biz/?acdjps acdjps http://lmnopvxy.net/?lmnopvxy lmnopvxy http://abeptuy.net/?abeptuy abeptuy http://fimnoq.ua/?fimnoq fimnoq http://adefky.com/?adefky adefky https://cdfijlrz.ru/?cdfijlrz cdfijlrz https://dimnv.ru/?dimnv dimnv http://bcehz.com/?bcehz bcehz http://ahkpruvw.ru/?ahkpruvw ahkpruvw https://afnrw.com/?afnrw afnrw https://bclnpqtv.biz/?bclnpqtv bclnpqtv https://hmptuy.ua/?hmptuy hmptuy https://adfkx.com/?adfkx adfkx https://cdfgikp.com/?cdfgikp cdfgikp https://hjovwxz.net/?hjovwxz hjovwxz http://enoprsxz.ru/?enoprsxz enoprsxz https://dmryz.info/?dmryz dmryz http://ijopuwx.info/?ijopuwx ijopuwx https://dghoryz.ua/?dghoryz dghoryz https://gnpqsvy.biz/?gnpqsvy gnpqsvy https://jlmtwxy.ua/?jlmtwxy jlmtwxy http://clmpuvz.com/?clmpuvz clmpuvz https://gprtvx.ua/?gprtvx gprtvx http://bklptuxz.net/?bklptuxz bklptuxz https://bfknrvz.com/?bfknrvz bfknrvz http://dgkqrv.ua/?dgkqrv dgkqrv https://adkprstv.net/?adkprstv adkprstv https://afiqx.com/?afiqx afiqx https://bchiknpt.com/?bchiknpt bchiknpt http://hjklprx.biz/?hjklprx hjklprx http://bejoruw.net/?bejoruw bejoruw http://cegmnvz.biz/?cegmnvz cegmnvz http://gmnotyz.ua/?gmnotyz gmnotyz https://dfhikntw.net/?dfhikntw dfhikntw https://dfouv.biz/?dfouv dfouv https://cjlnotz.info/?cjlnotz cjlnotz https://denqtvwz.ua/?denqtvwz denqtvwz http://behlrwy.com/?behlrwy behlrwy http://drsuvxz.biz/?drsuvxz drsuvxz https://esuwyz.biz/?esuwyz esuwyz http://fhjntu.biz/?fhjntu fhjntu https://gqrswyz.biz/?gqrswyz gqrswyz https://mpswxz.biz/?mpswxz mpswxz http://dhjpsuv.net/?dhjpsuv dhjpsuv http://dfopruy.ru/?dfopruy dfopruy https://ehkloqsz.info/?ehkloqsz ehkloqsz https://adgjqru.biz/?adgjqru adgjqru https://ejmtvwz.biz/?ejmtvwz ejmtvwz http://fgoquw.net/?fgoquw fgoquw https://fhikqry.com/?fhikqry fhikqry https://cemqy.com/?cemqy cemqy http://otuxz.com/?otuxz otuxz http://fijlmnoq.info/?fijlmnoq fijlmnoq http://cgiouw.ru/?cgiouw cgiouw https://cdgijlnw.biz/?cdgijlnw cdgijlnw https://gmuvw.biz/?gmuvw gmuvw https://abdefgiu.info/?abdefgiu abdefgiu https://gkmqwy.info/?gkmqwy gkmqwy http://bfgkqstw.com/?bfgkqstw bfgkqstw https://bfgmnovx.net/?bfgmnovx bfgmnovx https://bkmnrsy.com/?bkmnrsy bkmnrsy http://chiuwx.ua/?chiuwx chiuwx http://chlvw.ua/?chlvw chlvw https://acfgilp.ru/?acfgilp acfgilp http://bghosv.com/?bghosv bghosv https://befgkpst.net/?befgkpst befgkpst http://jklmnwz.net/?jklmnwz jklmnwz https://dktxz.com/?dktxz dktxz http://aeknuwyz.ru/?aeknuwyz aeknuwyz https://aqtuv.ua/?aqtuv aqtuv http://bfjorw.info/?bfjorw bfjorw http://cfijuz.biz/?cfijuz cfijuz https://efjlnuv.net/?efjlnuv efjlnuv https://hkruvy.net/?hkruvy hkruvy http://defnswxy.biz/?defnswxy defnswxy https://fgjmqxy.biz/?fgjmqxy fgjmqxy https://deflmtu.ru/?deflmtu deflmtu https://cgmnr.info/?cgmnr cgmnr https://dgilnsvy.ua/?dgilnsvy dgilnsvy https://chjmosxy.biz/?chjmosxy chjmosxy https://abcdhln.net/?abcdhln abcdhln https://elstuyz.biz/?elstuyz elstuyz https://bjoptw.com/?bjoptw bjoptw https://bcdgquv.ru/?bcdgquv bcdgquv http://afiklst.com/?afiklst afiklst https://agnvwz.info/?agnvwz 
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From mbuerkle at web.de Tue Nov 6 23:37:16 2018
From: mbuerkle at web.de (Marius Buerkle)
Date: Wed, 7 Nov 2018 06:37:16 +0100
Subject: [petsc-users] PetscBarrier from fortran
Message-ID: 

An HTML attachment was scrubbed...
URL: 

From yjwu16 at gmail.com Wed Nov 7 09:15:28 2018
From: yjwu16 at gmail.com (Yingjie Wu)
Date: Wed, 7 Nov 2018 23:15:28 +0800
Subject: [petsc-users] Problems about PCtype bjacobi
Message-ID: 

Dear Petsc developer:
Hi,

Recently I have been solving nonlinear systems of PDEs, and I have encountered some problems with preconditioning that I would like to ask for help with.

1. I set the preconditioning matrix in SNES to MPIAIJ in the program, and then use a matrix-free method to solve my problem. The log information of the program is as follows:

SNES Object: 1 MPI processes type: newtonls maximum iterations=50, maximum function evaluations=100000000 tolerances: relative=1e-08, absolute=1e-50, solution=1e-08 total number of linear solver iterations=177 total number of function evaluations=371 norm schedule ALWAYS SNESLineSearch Object: 1 MPI processes type: bt interpolation: cubic alpha=1.000000e-04 maxstep=1.000000e+08, minlambda=1.000000e-12 tolerances: relative=1.000000e-08, absolute=1.000000e-15, lambda=1.000000e-08 maximum iterations=40 KSP Object: 1 MPI processes type: gmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=0.01, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test PC Object: 1 MPI processes type: bjacobi number of blocks = 1 Local solve is same for all blocks, in the following KSP and PC objects: KSP Object: (sub_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (sub_) 1 MPI processes type: bjacobi number of blocks = 1 Local solve is same for all blocks, in the following KSP and PC objects: KSP Object: (sub_sub_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (sub_sub_) 1 MPI processes type: ilu out-of-place factorization 0 levels of fill tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 1., needed 1. Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=961, cols=961 package used to perform factorization: petsc total: nonzeros=4129, allocated nonzeros=4129 total number of mallocs used during MatSetValues calls =0 not using I-node routines linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=961, cols=961 total: nonzeros=4129, allocated nonzeros=4805 total number of mallocs used during MatSetValues calls =0 not using I-node routines linear system matrix = precond matrix: Mat Object: 1 MPI processes type: mpiaij rows=961, cols=961 total: nonzeros=4129, allocated nonzeros=9610 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines linear system matrix followed by preconditioner matrix: Mat Object: 1 MPI processes type: mffd rows=961, cols=961 Matrix-free approximation: err=1.49012e-08 (relative error in function evaluation) Using wp compute h routine Does not compute normU Mat Object: 1 MPI processes type: mpiaij rows=961, cols=961 total: nonzeros=4129, allocated nonzeros=9610 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines

Although a parallel matrix is used, it runs on a single processor.
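For reference, my Jacobian setup is essentially the following (a simplified sketch, not the complete code; the preallocation numbers are placeholders):

/* J is applied matrix-free (the mffd operator in the -snes_view output above),
   while B is the assembled MPIAIJ matrix used to build the preconditioner. */
Mat J, B;
ierr = MatCreateSNESMF(snes,&J);CHKERRQ(ierr);
ierr = MatCreateAIJ(PETSC_COMM_WORLD,PETSC_DECIDE,PETSC_DECIDE,961,961,5,NULL,5,NULL,&B);CHKERRQ(ierr);
ierr = SNESSetJacobian(snes,J,B,FormJacobian,NULL);CHKERRQ(ierr);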
Because parallel matrices are used, the overall preconditioning scheme should be bjacobi, which then builds a KSP for each block (there is only one block in my program). Therefore, a sub-KSP object appears in the PC information. But in the output above, a sub-sub-KSP is also embedded inside the sub-KSP object. I don't understand where this extra KSP comes from. Please help me understand it.

2. Bjacobi is, in theory, a preconditioning method. Why does it contain a linear equation solver object (KSP)? In my understanding, a preconditioner is a matrix decomposition method, and there is no linear system to be solved, so there should be no sub-KSP here.

3. When I assemble the preconditioning matrix, I use MatSetValues:

idleft  = row - 1;
idright = row + 1;
idup    = row + 20;
iddown  = row - 20;
/* phi1 field */
v[1] = -1.0 / ( dx * dx / (2*1.267) + dx * dx / (2*1.267) ); col[1] = idleft;
v[2] = -1.0 / ( dx * dx / (2*1.267) + dx * dx / (2*1.267) ); col[2] = idright;
v[3] = -1.0 / ( dy * dy / (2*1.267) + dy * dy / (2*1.267) ); col[3] = iddown;
v[4] = -1.0 / ( dy * dy / (2*1.267) + dy * dy / (2*1.267) ); col[4] = idup;
v[0] = v[1] + v[2] + v[3] + v[4] + v[0]; col[0] = row;
ierr = MatSetValues(B,1,&row,5,col,v,INSERT_VALUES);CHKERRQ(ierr);

The error message is:

[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Argument out of range
[0]PETSC ERROR: Column too large: col 1077215232 max 960
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.10.1, Sep, 26, 2018
[0]PETSC ERROR: ./ex217 on a arch-linux2-c-debug named yjwu-XPS-8910 by yjwu Wed Nov 7 09:52:22 2018
[0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-mpich --download-fblaslapack
[0]PETSC ERROR: #1 MatSetValues_SeqAIJ() line 442 in /home/yjwu/petsc-3.10.1/src/mat/impls/aij/seq/aij.c
[0]PETSC ERROR: #2 MatSetValues() line 1349 in /home/yjwu/petsc-3.10.1/src/mat/interface/matrix.c
[0]PETSC ERROR: #3 FormJacobian() line 272 in /home/yjwu/petsc-3.10.1/src/snes/examples/tutorials/ex217.c
[0]PETSC ERROR: #4 SNESComputeJacobian() line 2555 in /home/yjwu/petsc-3.10.1/src/snes/interface/snes.c
[0]PETSC ERROR: #5 SNESSolve_NEWTONLS() line 222 in /home/yjwu/petsc-3.10.1/src/snes/impls/ls/ls.c
[0]PETSC ERROR: #6 SNESSolve() line 4396 in /home/yjwu/petsc-3.10.1/src/snes/interface/snes.c
[0]PETSC ERROR: #7 main() line 108 in /home/yjwu/petsc-3.10.1/src/snes/examples/tutorials/ex217.c

This seems like a very basic question, but I don't know how to solve it.
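Could the problem be that at boundary rows the stencil indices fall outside the range 0..960, and that v[0] is read before it has been set? A guarded version of the assembly would then look roughly like this (a sketch only, assuming the 20-cell row width implied by idup/iddown; ncols, cols and vals are new local variables):

/* Insert only the neighbours that actually exist, and build the diagonal
   entry from scratch instead of reading the uninitialized v[0]. */
PetscInt    cols[5], ncols = 0;
PetscScalar vals[5], diag = 0.0;
PetscScalar vx = -1.0 / ( dx * dx / (2*1.267) + dx * dx / (2*1.267) );
PetscScalar vy = -1.0 / ( dy * dy / (2*1.267) + dy * dy / (2*1.267) );
if (row % 20 != 0)  { vals[ncols] = vx; cols[ncols++] = row - 1;  diag += vx; } /* left  */
if (row % 20 != 19) { vals[ncols] = vx; cols[ncols++] = row + 1;  diag += vx; } /* right */
if (row - 20 >= 0)  { vals[ncols] = vy; cols[ncols++] = row - 20; diag += vy; } /* down  */
if (row + 20 < 961) { vals[ncols] = vy; cols[ncols++] = row + 20; diag += vy; } /* up    */
vals[ncols] = diag; cols[ncols++] = row; /* diagonal entry */
ierr = MatSetValues(B,1,&row,ncols,cols,vals,INSERT_VALUES);CHKERRQ(ierr);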
Thanks for your continuous help,

Yingjie
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From amartin at cimne.upc.edu Wed Nov 7 10:02:15 2018
From: amartin at cimne.upc.edu (=?UTF-8?B?IkFsYmVydG8gRi4gTWFydMOtbiI=?=)
Date: Wed, 07 Nov 2018 17:02:15 +0100
Subject: [petsc-users] PETSc (3.9.0) GAMG weak scaling test issue
Message-ID: <5BE30C87.10204@cimne.upc.edu>

Dear All,

we are performing a weak scaling test of the PETSc (v3.9.0) GAMG preconditioner when applied to the linear system arising from the conforming unfitted FE discretization (using Q1 Lagrangian FEs) of a 3D Poisson problem, where the boundary of the domain (a popcorn flake) is described as a zero level set embedded within a uniform background (Cartesian-like) hexahedral mesh.

Details underlying the FEM formulation can be made available on demand if you believe this might be helpful, but let me just point out that it is designed to address the well-known ill-conditioning issues of unfitted FE discretizations caused by the small cut cell problem.

The weak scaling test is set up as follows. We start from a single-cube background mesh and refine it uniformly several steps, until we have approximately 10**3 (load1), 20**3 (load2), or 40**3 (load3) hexahedra per MPI task when distributing the mesh over 4 MPI tasks. The benchmark is scaled such that each next larger problem is obtained by uniformly refining the mesh of the previous scale and running it on 8x the number of MPI tasks used at the previous scale. As a result, we obtain three weak scaling curves, one for each of the three fixed loads per MPI task above, on the following total numbers of MPI tasks: 4, 32, 262, 2097, 16777. The underlying mesh is not partitioned among MPI tasks using ParMETIS (unstructured multilevel graph partitioning) nor optimally by hand, but following the so-called z-shape space-filling curves provided by an underlying octree-like mesh handler (i.e., the p4est library).

I configured the preconditioned linear solver as follows:

-ksp_type cg
-ksp_monitor
-ksp_rtol 1.0e-6
-ksp_converged_reason
-ksp_max_it 500
-ksp_norm_type unpreconditioned
-ksp_view
-log_view
-pc_type gamg
-pc_gamg_type agg
-mg_levels_esteig_ksp_type cg
-mg_coarse_sub_pc_type cholesky
-mg_coarse_sub_pc_factor_mat_ordering_type nd
-pc_gamg_process_eq_limit 50
-pc_gamg_square_graph 0
-pc_gamg_agg_nsmooths 1

Raw timings (in seconds) of the preconditioner setup and the PCG iterative solution stage, and the number of iterations, are as follows:

**preconditioner set up**
(load1): [0.02542160451, 0.05169247743, 0.09266782179, 0.2426272957, 13.64161944]
(load2): [0.1239175797, 0.1885528499, 0.2719282564, 0.4783878336, 13.37947339]
(load3): [0.6565349903, 0.9435049873, 1.299908397, 1.916243652, 16.02904088]

**PCG stage**
(load1): [0.003287350759, 0.008163803257, 0.03565631993, 0.08343045413, 0.6937994603]
(load2): [0.0205939794, 0.03594723623, 0.07593298424, 0.1212046621, 0.6780373845]
(load3): [0.1310882876, 0.3214917686, 0.5532023879, 0.766881627, 1.485446003]

**number of PCG iterations**
(load1): [5, 8, 11, 13, 13]
(load2): [7, 10, 12, 13, 13]
(load3): [8, 10, 12, 13, 13]

It can be observed that both the number of linear solver iterations and the PCG stage timings scale remarkably well (weakly), but there is a significant time increase when scaling the problem from 2097 to 16777 MPI tasks for the preconditioner setup stage (e.g., 1.916243652 vs 16.02904088 sec. with 40**3 cells per MPI task).

I gathered the combined output of -ksp_view and -log_view (only) for all the points of the load3 weak scaling test (find them attached to this message). Please note that within each run I execute these two stages up to three times, and this influences the absolute timings given in -log_view.

Looking at the output of -log_view, it is very strange to me, e.g., that the stage labelled "Graph" does not scale properly, as it is just a call to MatDuplicate if the block size of the matrix is 1 (our case), and I guess it is a purely local operation that does not require any communication. What am I missing here? The load does not seem to be unbalanced, looking at the "Ratio" column.
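As a side note, the command-line configuration above corresponds, roughly, to the following programmatic setup (a C sketch for reference only; our application is Fortran-based and actually reads these options from the petscrc file listed in the option table at the end):

/* PCG with smoothed-aggregation GAMG, mirroring the options listed above. */
KSP ksp;
PC  pc;
ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
ierr = KSPSetType(ksp,KSPCG);CHKERRQ(ierr);
ierr = KSPSetTolerances(ksp,1.0e-6,PETSC_DEFAULT,PETSC_DEFAULT,500);CHKERRQ(ierr);
ierr = KSPSetNormType(ksp,KSP_NORM_UNPRECONDITIONED);CHKERRQ(ierr);
ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
ierr = PCSetType(pc,PCGAMG);CHKERRQ(ierr);
ierr = PCGAMGSetType(pc,PCGAMGAGG);CHKERRQ(ierr);
ierr = PCGAMGSetNSmooths(pc,1);CHKERRQ(ierr);
ierr = PCGAMGSetProcEqLim(pc,50);CHKERRQ(ierr);
ierr = PCGAMGSetSquareGraph(pc,0);CHKERRQ(ierr);
ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr); /* remaining options from the options database */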
I wonder whether the observed behaviour is as expected, or whether this is a misconfiguration of the solver on our side. I played (quite a lot) with several parameter-value combinations, and the configuration above is the one that led to the fastest execution among those tested (the sweep might be incomplete; I can provide further details if helpful). Any feedback from your experience that helps us find the cause(s) of this issue and a mitigating solution would be of high added value.

Thanks very much in advance!
Best regards,
Alberto.

--
Alberto F. Martín-Huertas
Senior Researcher, PhD. Computational Science
Centre Internacional de Mètodes Numèrics a l'Enginyeria (CIMNE)
Parc Mediterrani de la Tecnologia, UPC
Esteve Terradas 5, Building C3, Office 215,
08860 Castelldefels (Barcelona, Spain)
Tel.: (+34) 9341 34223
e-mail: amartin at cimne.upc.edu
FEMPAR project co-founder
web: http://www.fempar.org

________________
IMPORTANT NOTICE
All personal data contained on this mail will be processed confidentially and registered in a file property of CIMNE in order to manage corporate communications. You may exercise the rights of access, rectification, erasure and object by letter sent to Ed. C1 Campus Norte UPC. Gran Capitán s/n Barcelona.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
-------------- next part --------------
KSP Object: 4 MPI processes type: cg maximum iterations=500, initial guess is zero tolerances: relative=1e-06, absolute=1e-50, divergence=10000. left preconditioning using UNPRECONDITIONED norm type for convergence test PC Object: 4 MPI processes type: gamg type is MULTIPLICATIVE, levels=4 cycles=v Cycles per PCApply=1 Using externally compute Galerkin coarse grid matrices GAMG specific options Threshold for dropping small values in graph on each level = 0. 0. Threshold scaling factor for each level not specified = 1. AGG specific options Symmetric graph false Number of levels to square graph 0 Number smoothing steps 1 Coarse grid solver -- level ------------------------------- KSP Object: (mg_coarse_) 4 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_) 4 MPI processes type: bjacobi number of blocks = 4 Local solve is same for all blocks, in the following KSP and PC objects: KSP Object: (mg_coarse_sub_) 1 MPI processes type: preonly maximum iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_sub_) 1 MPI processes type: cholesky out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: nd factor fill ratio given 5., needed 1.
Factored matrix follows: Mat Object: 1 MPI processes type: seqsbaij rows=6, cols=6 package used to perform factorization: petsc total: nonzeros=21, allocated nonzeros=21 total number of mallocs used during MatSetValues calls =0 block size is 1 linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=6, cols=6 total: nonzeros=36, allocated nonzeros=36 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 2 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: 4 MPI processes type: mpiaij rows=6, cols=6 total: nonzeros=36, allocated nonzeros=36 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 2 nodes, limit used is 5 Down solver (pre-smoother) on level 1 ------------------------------- KSP Object: (mg_levels_1_) 4 MPI processes type: chebyshev eigenvalue estimates used: min = 0.129951, max = 1.42946 eigenvalues estimate via cg min 0.51315, max 1.29951 eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_1_esteig_) 4 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_1_) 4 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 4 MPI processes type: mpiaij rows=201, cols=201 total: nonzeros=24313, allocated nonzeros=24313 total number of mallocs used during MatSetValues calls =0 using nonscalable MatPtAP() implementation not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 2 ------------------------------- KSP Object: (mg_levels_2_) 4 MPI processes type: chebyshev eigenvalue estimates used: min = 0.132036, max = 1.4524 eigenvalues estimate via cg min 0.0839922, max 1.32036 eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_2_esteig_) 4 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_2_) 4 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 4 MPI processes type: mpiaij rows=4621, cols=4621 total: nonzeros=369149, allocated nonzeros=369149 total number of mallocs used during MatSetValues calls =0 using nonscalable MatPtAP() implementation not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 3 ------------------------------- KSP Object: (mg_levels_3_) 4 MPI processes type: chebyshev eigenvalue estimates used: min = 0.167146, max = 1.83861 eigenvalues estimate via cg min 0.0634859, max 1.67146 eigenvalues estimated using cg with translations [0. 0.1; 0. 
1.1] KSP Object: (mg_levels_3_esteig_) 4 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_3_) 4 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 4 MPI processes type: mpiaij rows=63511, cols=63511 total: nonzeros=2301395, allocated nonzeros=38106600 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) linear system matrix = precond matrix: Mat Object: 4 MPI processes type: mpiaij rows=63511, cols=63511 total: nonzeros=2301395, allocated nonzeros=38106600 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines ************************************************************************************************************************ *** WIDEN YOUR WINDOW TO 120 CHARACTERS. Use 'enscript -r -fCourier9' to print this document *** ************************************************************************************************************************ ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- /gpfs/scratch/upc26/upc26229/build_rel_fempar_cell_agg_ompi/FEMPAR/bin/par_test_h_adaptive_poisson_unfitted on a arch-linux2-c-opt named s14r2b46 with 4 processors, by upc26229 Wed Nov 7 01:07:35 2018 Using Petsc Release Version 3.9.0, Apr, 07, 2018 Max Max/Min Avg Total Time (sec): 1.076e+02 1.00000 1.076e+02 Objects: 9.890e+02 1.00304 9.868e+02 Flop: 6.620e+08 1.09228 6.334e+08 2.533e+09 Flop/sec: 6.150e+06 1.09228 5.884e+06 2.353e+07 MPI Messages: 3.141e+03 1.04997 3.054e+03 1.222e+04 MPI Message Lengths: 1.331e+07 1.02147 4.319e+03 5.277e+07 MPI Reductions: 1.427e+03 1.00000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flop and VecAXPY() for complex vectors of length N --> 8N flop Summary of Stages: ----- Time ------ ----- Flop ----- --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total counts %Total Avg %Total counts %Total 0: Main Stage: 1.0765e+02 100.0% 2.5334e+09 100.0% 1.222e+04 100.0% 4.319e+03 100.0% 1.414e+03 99.1% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flop: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent Avg. len: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). 
%T - percent time in this phase %F - percent flop in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flop over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ Event Count Time (sec) Flop --- Global --- --- Stage --- Total Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage BuildTwoSided 9 1.0 1.0124e-03 2.9 0.00e+00 0.0 5.4e+01 8.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 BuildTwoSidedF 75 1.0 1.8647e-0111.1 0.00e+00 0.0 3.2e+02 5.0e+04 0.0e+00 0 0 3 30 0 0 0 3 30 0 0 VecMDot 90 1.0 4.3525e-03 1.9 5.94e+06 1.1 0.0e+00 0.0e+00 9.0e+01 0 1 0 0 6 0 1 0 0 6 5180 VecTDot 237 1.0 1.6390e-02 3.5 3.88e+06 1.1 0.0e+00 0.0e+00 2.4e+02 0 1 0 0 17 0 1 0 0 17 897 VecNorm 225 1.0 7.4959e-03 2.6 3.28e+06 1.1 0.0e+00 0.0e+00 2.2e+02 0 0 0 0 16 0 0 0 0 16 1661 VecScale 99 1.0 2.6664e-04 1.2 5.94e+05 1.1 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 8457 VecCopy 105 1.0 5.5974e-04 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSet 402 1.0 5.1273e-04 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 237 1.0 1.1864e-03 1.1 3.88e+06 1.1 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 12396 VecAYPX 678 1.0 3.3991e-03 1.2 6.00e+06 1.1 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 6695 VecAXPBYCZ 288 1.0 2.0727e-03 1.1 8.64e+06 1.1 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 15825 VecMAXPY 99 1.0 1.9828e-03 1.3 7.02e+06 1.1 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 13441 VecAssemblyBegin 24 1.0 3.5199e-03 1.1 0.00e+00 0.0 6.0e+01 3.6e+03 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAssemblyEnd 24 1.0 1.2138e-04 1.6 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecPointwiseMult 99 1.0 6.6968e-04 1.1 5.94e+05 1.1 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 3367 VecScatterBegin 810 1.0 5.6504e-03 1.0 0.00e+00 0.0 9.3e+03 2.7e+03 0.0e+00 0 0 76 47 0 0 0 76 47 0 0 VecScatterEnd 810 1.0 1.4432e-02 2.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSetRandom 9 1.0 1.5864e-03 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecNormalize 99 1.0 1.6803e-03 1.7 1.78e+06 1.1 0.0e+00 0.0e+00 9.9e+01 0 0 0 0 7 0 0 0 0 7 4026 MatMult 636 1.0 1.9035e-01 1.0 3.12e+08 1.1 7.6e+03 3.0e+03 0.0e+00 0 47 62 44 0 0 47 62 44 0 6275 MatMultAdd 72 1.0 1.0663e-02 1.1 6.23e+06 1.1 6.5e+02 5.0e+02 0.0e+00 0 1 5 1 0 0 1 5 1 0 2248 MatMultTranspose 72 1.0 1.3565e-02 1.3 6.23e+06 1.1 6.5e+02 5.0e+02 0.0e+00 0 1 5 1 0 0 1 5 1 0 1767 MatSolve 24 0.0 4.1796e-05 0.0 1.58e+03 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 38 MatSOR 531 1.0 2.2511e-01 1.1 2.33e+08 1.1 0.0e+00 0.0e+00 0.0e+00 0 35 0 0 0 0 35 0 0 0 3950 MatCholFctrSym 3 1.0 4.2330e-05 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCholFctrNum 3 1.0 4.0101e-05 1.8 1.80e+01 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatConvert 9 1.0 1.6046e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatScale 27 1.0 5.5906e-03 1.1 5.00e+06 1.1 1.1e+02 2.9e+03 0.0e+00 0 1 1 1 0 0 1 1 1 0 3428 MatResidual 72 1.0 2.0435e-02 1.1 3.38e+07 1.1 8.6e+02 2.9e+03 0.0e+00 0 5 7 5 0 0 5 7 5 0 6330 MatAssemblyBegin 168 1.0 2.3268e-01 1.4 0.00e+00 0.0 2.6e+02 6.1e+04 0.0e+00 0 0 2 30 0 0 0 2 30 0 0 
MatAssemblyEnd 168 1.0 1.3291e-01 1.1 0.00e+00 0.0 7.8e+02 8.4e+02 3.6e+02 0 0 6 1 25 0 0 6 1 25 0
MatGetRow 162018 1.1 1.9420e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatGetRowIJ 3 0.0 4.2650e-05 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatCreateSubMat 6 1.0 1.6496e-03 1.0 0.00e+00 0.0 4.8e+01 5.1e+01 9.6e+01 0 0 0 0 7 0 0 0 0 7 0
MatGetOrdering 3 0.0 2.7078e-04 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatCoarsen 9 1.0 7.8692e-03 1.1 0.00e+00 0.0 7.0e+02 2.2e+03 2.7e+01 0 0 6 3 2 0 0 6 3 2 0
MatZeroEntries 9 1.0 1.0505e-03 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatView 21 1.4 3.8399e-03 3.1 0.00e+00 0.0 0.0e+00 0.0e+00 1.5e+01 0 0 0 0 1 0 0 0 0 1 0
MatAXPY 9 1.0 1.6539e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatMatMult 9 1.0 7.2559e-02 1.0 4.22e+06 1.1 6.3e+02 2.1e+03 1.1e+02 0 1 5 2 8 0 1 5 2 8 223
MatMatMultSym 9 1.0 5.8217e-02 1.0 0.00e+00 0.0 5.2e+02 1.9e+03 1.1e+02 0 0 4 2 8 0 0 4 2 8 0
MatMatMultNum 9 1.0 1.4338e-02 1.0 4.22e+06 1.1 1.1e+02 2.9e+03 0.0e+00 0 1 1 1 0 0 1 1 1 0 1128
MatPtAP 9 1.0 3.5600e-01 1.0 5.53e+07 1.1 9.5e+02 1.7e+04 1.4e+02 0 9 8 30 9 0 9 8 30 10 605
MatPtAPSymbolic 9 1.0 2.5158e-01 1.0 0.00e+00 0.0 6.2e+02 1.4e+04 6.3e+01 0 0 5 16 4 0 0 5 16 4 0
MatPtAPNumeric 9 1.0 1.0436e-01 1.0 5.53e+07 1.1 3.3e+02 2.2e+04 7.2e+01 0 9 3 14 5 0 9 3 14 5 2064
MatGetLocalMat 27 1.0 1.0135e-02 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatGetBrAoCol 27 1.0 7.4055e-03 1.2 0.00e+00 0.0 7.6e+02 9.9e+03 0.0e+00 0 0 6 14 0 0 0 6 14 0 0
KSPGMRESOrthog 90 1.0 5.7355e-03 1.4 1.19e+07 1.1 0.0e+00 0.0e+00 9.0e+01 0 2 0 0 6 0 2 0 0 6 7863
KSPSetUp 36 1.0 3.4705e-03 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 2.4e+01 0 0 0 0 2 0 0 0 0 2 0
KSPSolve 3 1.0 4.3135e-01 1.0 5.40e+08 1.1 7.9e+03 2.6e+03 3.6e+02 0 81 64 39 25 0 81 64 39 26 4786
PCGAMGGraph_AGG 9 1.0 2.2263e-01 1.0 4.22e+06 1.1 3.2e+02 1.9e+03 1.1e+02 0 1 3 1 8 0 1 3 1 8 73
PCGAMGCoarse_AGG 9 1.0 1.1966e-02 1.0 0.00e+00 0.0 7.0e+02 2.2e+03 2.7e+01 0 0 6 3 2 0 0 6 3 2 0
PCGAMGProl_AGG 9 1.0 4.8133e-02 1.0 0.00e+00 0.0 3.8e+02 1.7e+03 1.4e+02 0 0 3 1 10 0 0 3 1 10 0
PCGAMGPOpt_AGG 9 1.0 1.4798e-01 1.0 6.22e+07 1.1 1.7e+03 2.6e+03 3.7e+02 0 9 14 8 26 0 9 14 8 26 1605
GAMG: createProl 9 1.0 4.3246e-01 1.0 6.64e+07 1.1 3.1e+03 2.3e+03 6.5e+02 0 10 26 14 45 0 10 26 14 46 586
  Graph 18 1.0 2.2127e-01 1.0 4.22e+06 1.1 3.2e+02 1.9e+03 1.1e+02 0 1 3 1 8 0 1 3 1 8 73
  MIS/Agg 9 1.0 7.9907e-03 1.1 0.00e+00 0.0 7.0e+02 2.2e+03 2.7e+01 0 0 6 3 2 0 0 6 3 2 0
  SA: col data 9 1.0 5.8624e-03 1.1 0.00e+00 0.0 2.2e+02 2.7e+03 3.6e+01 0 0 2 1 3 0 0 2 1 3 0
  SA: frmProl0 9 1.0 4.0594e-02 1.0 0.00e+00 0.0 1.7e+02 4.9e+02 7.2e+01 0 0 1 0 5 0 0 1 0 5 0
  SA: smooth 9 1.0 9.3119e-02 1.0 5.00e+06 1.1 6.3e+02 2.1e+03 1.3e+02 0 1 5 2 9 0 1 5 2 9 206
GAMG: partLevel 9 1.0 3.5860e-01 1.0 5.53e+07 1.1 1.0e+03 1.6e+04 2.9e+02 0 9 8 30 20 0 9 8 30 20 601
  repartition 3 1.0 1.7322e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.8e+01 0 0 0 0 1 0 0 0 0 1 0
  Invert-Sort 3 1.0 1.5528e-04 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 1.2e+01 0 0 0 0 1 0 0 0 0 1 0
  Move A 3 1.0 1.1300e-03 1.0 0.00e+00 0.0 3.0e+01 6.6e+01 5.1e+01 0 0 0 0 4 0 0 0 0 4 0
  Move P 3 1.0 7.7750e-04 1.0 0.00e+00 0.0 1.8e+01 2.6e+01 5.1e+01 0 0 0 0 4 0 0 0 0 4 0
PCSetUp 6 1.0 7.9483e-01 1.0 1.22e+08 1.1 4.1e+03 5.7e+03 9.9e+02 1 19 34 44 69 1 19 34 44 70 590
PCSetUpOnBlocks 24 1.0 5.2736e-04 1.1 1.80e+01 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
PCApply 24 1.0 4.0983e-01 1.0 5.07e+08 1.1 7.6e+03 2.5e+03 2.9e+02 0 76 62 36 20 0 76 62 36 20 4727
SFSetGraph 9 1.0 2.1600e-06 1.7 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
SFSetUp 9 1.0 1.6055e-03 1.7 0.00e+00 0.0 1.6e+02 1.8e+03 0.0e+00 0 0 1 1 0 0 0 1 1 0 0
SFBcastBegin 45 1.0 5.1013e-04 1.1 0.00e+00 0.0 5.4e+02 2.4e+03 0.0e+00 0 0 4 2 0 0 0 4 2 0 0
SFBcastEnd 45 1.0 7.3911e-04 2.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
------------------------------------------------------------------------------------------------------------------------

Memory usage is given in bytes:

Object Type          Creations   Destructions     Memory  Descendants' Mem.
Reports information only for process 0.

--- Event Stage 0: Main Stage

              Vector   429            429     16885896     0.
              Matrix   252            252    628801440     0.
      Matrix Coarsen     9              9         6156     0.
           Index Set   147            147       299856     0.
         Vec Scatter    57             57        79848     0.
       Krylov Solver    36             36       314928     0.
      Preconditioner    27             27        29544     0.
              Viewer     5              4         3584     0.
         PetscRandom    18             18        12492     0.
   Star Forest Graph     9              9         8496     0.
========================================================================================================================
Average time to get PetscTime(): 4.1601e-08
Average time for MPI_Barrier(): 1.76301e-06
Average time for zero size MPI_Send(): 1.59626e-06
#PETSc Option Table entries:
--prefix run_a0b0c0d0e0f0g0h0i0_n4_l3
-aggrmeth alla_serial
-beta 10.0
-betaest .true.
-check .false.
-datadt data_distribution_fully_assembled
-dm 3
-dom -1.0
-in_space .true.
-ksp_converged_reason
-ksp_max_it 500
-ksp_monitor
-ksp_norm_type unpreconditioned
-ksp_rtol 1.0e-6
-ksp_type cg
-ksp_view
-l 1
-levelset popcorn
-levelsettol 1.0e-6
-log_view
-lsdom 0.0
-maxl 6
-mg_coarse_sub_pc_factor_mat_ordering_type nd
-mg_coarse_sub_pc_type cholesky
-mg_levels_esteig_ksp_type cg
-no_signal_handler
-nruns 3
-order 1
-pc_gamg_agg_nsmooths 1
-pc_gamg_process_eq_limit 50
-pc_gamg_square_graph 0
-pc_gamg_type agg
-pc_type gamg
-petscrc /gpfs/scratch/upc26/upc26229/par_cell_aggr_poisson/paper/weak_scal_ompi/2nd-w-scal/petscrc-0
-tt 1
-uagg .true.
-wratio 10
-wsolution .false.
#End of PETSc Option Table entries
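The option table above is exactly what KSPSetFromOptions() consumes at run time; nothing about CG or GAMG needs to be hard-coded in the driver. A minimal sketch of the solver side in C, assuming an already assembled matrix A and vectors b, x (the function name and arguments are illustrative placeholders, not taken from the run above):

  #include <petscksp.h>

  /* Sketch: solve A x = b, deferring all solver choices to the option
     table (-ksp_type cg -pc_type gamg -ksp_rtol 1.0e-6 ...). */
  static PetscErrorCode SolveFromOptions(Mat A, Vec b, Vec x)
  {
    KSP            ksp;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr); /* reads the -ksp_ and -pc_ flags */
    ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
    ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }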
Compiled without FORTRAN kernels
Compiled with full precision matrices (default)
sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 8
Configure options: --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 -with-blaslapack-dir=/apps/INTEL/2017.4/mkl --with-debugging=0 --with-x=0 --with-shared-libraries=1 --with-mpi=1 --with-64-bit-indices
-----------------------------------------
Libraries compiled on 2018-06-04 18:55:32 on login1
Machine characteristics: Linux-4.4.103-92.56-default-x86_64-with-SuSE-12-x86_64
Using PETSc directory: /gpfs/scratch/upc26/upc26229/petsc_cell_agg_openmpi/release/petsc-3.9.0
Using PETSc arch: arch-linux2-c-opt
-----------------------------------------
Using C compiler: mpicc -fPIC -wd1572 -g -O3
Using Fortran compiler: mpif90 -fPIC -g -O3
-----------------------------------------
Using include paths: -I/gpfs/scratch/upc26/upc26229/petsc_cell_agg_openmpi/release/petsc-3.9.0/include -I/gpfs/scratch/upc26/upc26229/petsc_cell_agg_openmpi/release/petsc-3.9.0/arch-linux2-c-opt/include
-----------------------------------------
Using C linker: mpicc
Using Fortran linker: mpif90
Using libraries: -Wl,-rpath,/gpfs/scratch/upc26/upc26229/petsc_cell_agg_openmpi/release/petsc-3.9.0/arch-linux2-c-opt/lib -L/gpfs/scratch/upc26/upc26229/petsc_cell_agg_openmpi/release/petsc-3.9.0/arch-linux2-c-opt/lib -lpetsc -Wl,-rpath,/apps/INTEL/2017.4/mkl/lib/intel64 -L/apps/INTEL/2017.4/mkl/lib/intel64 -Wl,-rpath,/usr/mpi/intel/openmpi-1.10.4-hfi/lib64 -L/usr/mpi/intel/openmpi-1.10.4-hfi/lib64 -Wl,-rpath,/gpfs/apps/MN4/INTEL/2018.0.128/compilers_and_libraries_2018.0.128/linux/compiler/lib/intel64_lin -L/gpfs/apps/MN4/INTEL/2018.0.128/compilers_and_libraries_2018.0.128/linux/compiler/lib/intel64_lin -Wl,-rpath,/usr/lib64/gcc/x86_64-suse-linux/4.8 -L/usr/lib64/gcc/x86_64-suse-linux/4.8 -Wl,-rpath,/usr/x86_64-suse-linux/lib -L/usr/x86_64-suse-linux/lib -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lstdc++ -ldl -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lifport -lifcoremt_pic -limf -lsvml -lm -lipgo -lirc -lpthread -lgcc_s -lirc_s -lstdc++ -ldl
-----------------------------------------
-------------- next part --------------
Linear solve converged due to CONVERGED_RTOL iterations 9
KSP Object: 32 MPI processes
  type: cg
  maximum iterations=500, initial guess is zero
  tolerances: relative=1e-06, absolute=1e-50, divergence=10000.
  left preconditioning
  using UNPRECONDITIONED norm type for convergence test
PC Object: 32 MPI processes
  type: gamg
    type is MULTIPLICATIVE, levels=4 cycles=v
      Cycles per PCApply=1
      Using externally compute Galerkin coarse grid matrices
      GAMG specific options
        Threshold for dropping small values in graph on each level = 0. 0.
        Threshold scaling factor for each level not specified = 1.
      AGG specific options
        Symmetric graph false
        Number of levels to square graph 0
        Number smoothing steps 1
  Coarse grid solver -- level -------------------------------
    KSP Object: (mg_coarse_) 32 MPI processes
      type: preonly
      maximum iterations=10000, initial guess is zero
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (mg_coarse_) 32 MPI processes
      type: bjacobi
        number of blocks = 32
        Local solve is same for all blocks, in the following KSP and PC objects:
      KSP Object: (mg_coarse_sub_) 1 MPI processes
        type: preonly
        maximum iterations=1, initial guess is zero
        tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
        left preconditioning
        using NONE norm type for convergence test
      PC Object: (mg_coarse_sub_) 1 MPI processes
        type: cholesky
          out-of-place factorization
          tolerance for zero pivot 2.22045e-14
          matrix ordering: nd
          factor fill ratio given 5., needed 1.
            Factored matrix follows:
              Mat Object: 1 MPI processes
                type: seqsbaij
                rows=35, cols=35
                package used to perform factorization: petsc
                total: nonzeros=630, allocated nonzeros=630
                total number of mallocs used during MatSetValues calls =0
                block size is 1
        linear system matrix = precond matrix:
        Mat Object: 1 MPI processes
          type: seqaij
          rows=35, cols=35
          total: nonzeros=1225, allocated nonzeros=1225
          total number of mallocs used during MatSetValues calls =0
          using I-node routines: found 7 nodes, limit used is 5
      linear system matrix = precond matrix:
      Mat Object: 32 MPI processes
        type: mpiaij
        rows=35, cols=35
        total: nonzeros=1225, allocated nonzeros=1225
        total number of mallocs used during MatSetValues calls =0
        using I-node (on process 0) routines: found 7 nodes, limit used is 5
  Down solver (pre-smoother) on level 1 -------------------------------
    KSP Object: (mg_levels_1_) 32 MPI processes
      type: chebyshev
        eigenvalue estimates used: min = 0.140301, max = 1.54331
        eigenvalues estimate via cg min 0.150843, max 1.40301
        eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1]
        KSP Object: (mg_levels_1_esteig_) 32 MPI processes
          type: cg
          maximum iterations=10, initial guess is zero
          tolerances: relative=1e-12, absolute=1e-50, divergence=10000.
          left preconditioning
          using PRECONDITIONED norm type for convergence test
        estimating eigenvalues using noisy right hand side
      maximum iterations=2, nonzero initial guess
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (mg_levels_1_) 32 MPI processes
      type: sor
        type = local_symmetric, iterations = 1, local iterations = 1, omega = 1.
      linear system matrix = precond matrix:
      Mat Object: 32 MPI processes
        type: mpiaij
        rows=1654, cols=1654
        total: nonzeros=302008, allocated nonzeros=302008
        total number of mallocs used during MatSetValues calls =0
        using nonscalable MatPtAP() implementation
        using I-node (on process 0) routines: found 15 nodes, limit used is 5
  Up solver (post-smoother) same as down solver (pre-smoother)
  Down solver (pre-smoother) on level 2 -------------------------------
    KSP Object: (mg_levels_2_) 32 MPI processes
      type: chebyshev
        eigenvalue estimates used: min = 0.135428, max = 1.48971
        eigenvalues estimate via cg min 0.0330649, max 1.35428
        eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1]
        KSP Object: (mg_levels_2_esteig_) 32 MPI processes
          type: cg
          maximum iterations=10, initial guess is zero
          tolerances: relative=1e-12, absolute=1e-50, divergence=10000.
          left preconditioning
          using PRECONDITIONED norm type for convergence test
        estimating eigenvalues using noisy right hand side
      maximum iterations=2, nonzero initial guess
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (mg_levels_2_) 32 MPI processes
      type: sor
        type = local_symmetric, iterations = 1, local iterations = 1, omega = 1.
      linear system matrix = precond matrix:
      Mat Object: 32 MPI processes
        type: mpiaij
        rows=38899, cols=38899
        total: nonzeros=3088735, allocated nonzeros=3088735
        total number of mallocs used during MatSetValues calls =0
        using nonscalable MatPtAP() implementation
        not using I-node (on process 0) routines
  Up solver (post-smoother) same as down solver (pre-smoother)
  Down solver (pre-smoother) on level 3 -------------------------------
    KSP Object: (mg_levels_3_) 32 MPI processes
      type: chebyshev
        eigenvalue estimates used: min = 0.196606, max = 2.16267
        eigenvalues estimate via cg min 0.0475838, max 1.96606
        eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1]
        KSP Object: (mg_levels_3_esteig_) 32 MPI processes
          type: cg
          maximum iterations=10, initial guess is zero
          tolerances: relative=1e-12, absolute=1e-50, divergence=10000.
          left preconditioning
          using PRECONDITIONED norm type for convergence test
        estimating eigenvalues using noisy right hand side
      maximum iterations=2, nonzero initial guess
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (mg_levels_3_) 32 MPI processes
      type: sor
        type = local_symmetric, iterations = 1, local iterations = 1, omega = 1.
      linear system matrix = precond matrix:
      Mat Object: 32 MPI processes
        type: mpiaij
        rows=508459, cols=508459
        total: nonzeros=16204885, allocated nonzeros=305075400
        total number of mallocs used during MatSetValues calls =0
        not using I-node (on process 0) routines
  Up solver (post-smoother) same as down solver (pre-smoother)
  linear system matrix = precond matrix:
  Mat Object: 32 MPI processes
    type: mpiaij
    rows=508459, cols=508459
    total: nonzeros=16204885, allocated nonzeros=305075400
    total number of mallocs used during MatSetValues calls =0
    not using I-node (on process 0) routines
---------------------------------------------- PETSc Performance Summary: ----------------------------------------------
/gpfs/scratch/upc26/upc26229/build_rel_fempar_cell_agg_ompi/FEMPAR/bin/par_test_h_adaptive_poisson_unfitted on a arch-linux2-c-opt named s15r1b25 with 32 processors, by upc26229 Wed Nov 7 01:09:23 2018
Using Petsc Release Version 3.9.0, Apr, 07, 2018

                         Max       Max/Min        Avg      Total
Time (sec):           1.621e+02    1.00000     1.621e+02
Objects:              9.890e+02    1.00304     9.861e+02
Flop:                 7.802e+08    2.25680     6.170e+08  1.974e+10
Flop/sec:             4.812e+06    2.25680     3.806e+06  1.218e+08
MPI Messages:         2.457e+04    2.04836     1.917e+04  6.134e+05
MPI Message Lengths:  3.844e+07    2.16684     1.506e+03  9.236e+08
MPI Reductions:       1.469e+03    1.00000

Summary of Stages:  ----- Time ------  ----- Flop -----  --- Messages ---  -- Message Lengths --  -- Reductions --
                      Avg    %Total      Avg    %Total    counts  %Total     Avg       %Total     counts  %Total
 0:  Main Stage: 1.6212e+02 100.0%  1.9744e+10 100.0%  6.134e+05 100.0%  1.506e+03      100.0%  1.456e+03  99.1%
------------------------------------------------------------------------------------------------------------------------
Event                Count      Time (sec)     Flop                             --- Global ---  --- Stage ---   Total
                   Max Ratio  Max     Ratio   Max  Ratio  Mess   Avg len Reduct %T %F %M %L %R  %T %F %M %L %R Mflop/s
------------------------------------------------------------------------------------------------------------------------

--- Event Stage 0: Main Stage

BuildTwoSided 9 1.0 7.2859e-03 6.2 0.00e+00 0.0 2.5e+03 8.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
BuildTwoSidedF 75 1.0 7.6260e+00 63.2 0.00e+00 0.0 1.2e+04 2.1e+04 0.0e+00 2 0 2 27 0 2 0 2 27 0 0
VecMDot 90 1.0 3.3601e-02 7.0 7.73e+06 3.0 0.0e+00 0.0e+00 9.0e+01 0 1 0 0 6 0 1 0 0 6 5391
VecTDot 243 1.0 1.1469e-01 17.2 5.28e+06 3.0 0.0e+00 0.0e+00 2.4e+02 0 1 0 0 17 0 1 0 0 17 1082
VecNorm 228 1.0 6.2426e-02 8.2 4.39e+06 3.0 0.0e+00 0.0e+00 2.3e+02 0 1 0 0 16 0 1 0 0 16 1650
VecScale 99 1.0 1.0887e-03 6.2 7.73e+05 3.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 16641
VecCopy 114 1.0 1.3520e-03 2.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
VecSet 438 1.0 1.0085e-03 2.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
VecAXPY 243 1.0 6.7886e-03 3.4 5.28e+06 3.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 18279
VecAYPX 753 1.0 9.3539e-03 3.1 8.63e+06 3.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 21626
VecAXPBYCZ 324 1.0 5.0166e-03 2.0 1.26e+07 3.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 2 0 0 0 59097
VecMAXPY 99 1.0 4.4716e-03 4.9 9.14e+06 3.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 47884
VecAssemblyBegin 24 1.0 1.1001e-02 2.2 0.00e+00 0.0 1.7e+03 2.0e+03 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
VecAssemblyEnd 24 1.0 3.4029e-04 5.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
VecPointwiseMult 99 1.0 1.1331e-03 2.3 7.73e+05 3.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 15989
VecScatterBegin 885 1.0 3.6779e-02 2.2 0.00e+00 0.0 4.6e+05 9.8e+02 0.0e+00 0 0 75 49 0 0 0 75 49 0 0
VecScatterEnd 885 1.0 3.2675e-01 12.8 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
VecSetRandom 9 1.0 2.0615e-03 3.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
VecNormalize 99 1.0 8.0360e-03 3.0 2.32e+06 3.0 0.0e+00 0.0e+00 9.9e+01 0 0 0 0 7 0 0 0 0 7 6764
MatMult 693 1.0 4.7642e-01 1.3 3.73e+08 2.2 3.9e+05 1.1e+03 0.0e+00 0 48 64 46 0 0 48 64 46 0 19814
MatMultAdd 81 1.0 2.5011e-02 1.8 9.13e+06 2.9 2.7e+04 2.4e+02 0.0e+00 0 1 4 1 0 0 1 4 1 0 8668
MatMultTranspose 81 1.0 4.0727e-02 2.3 9.13e+06 2.9 2.7e+04 2.4e+02 0.0e+00 0 1 4 1 0 0 1 4 1 0 5323
MatSolve 27 0.0 2.1556e-04 0.0 6.52e+04 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 302
MatSOR 585 1.0 4.9140e-01 1.9 2.65e+08 2.2 0.0e+00 0.0e+00 0.0e+00 0 34 0 0 0 0 34 0 0 0 13575
MatCholFctrSym 3 1.0 6.5050e-03 313.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatCholFctrNum 3 1.0 3.2232e-03 996.2 1.05e+02 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatConvert 9 1.0 2.4005e-02 2.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatScale 27 1.0 1.1253e-02 2.1 5.57e+06 2.3 5.1e+03 1.0e+03 0.0e+00 0 1 1 1 0 0 1 1 1 0 12589
MatResidual 81 1.0 7.8485e-02 2.2 4.17e+07 2.2 4.6e+04 1.0e+03 0.0e+00 0 5 7 5 0 0 5 7 5 0 13482
MatAssemblyBegin 168 1.0 7.7038e+00 36.5 0.00e+00 0.0 9.9e+03 2.5e+04 0.0e+00 2 0 2 26 0 2 0 2 26 0 0
MatAssemblyEnd 168 1.0 2.8208e-01 1.5 0.00e+00 0.0 3.7e+04 3.1e+02 3.6e+02 0 0 6 1 25 0 0 6 1 25 0
MatGetRow 210816 3.0 2.5672e-02 2.7 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatGetRowIJ 3 0.0 2.7154e-03 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatCreateSubMat 6 1.0 1.1316e-02 1.0 0.00e+00 0.0 5.9e+02 1.4e+02 9.6e+01 0 0 0 0 7 0 0 0 0 7 0
MatGetOrdering 3 0.0 7.1425e-03 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatCoarsen 9 1.0 2.3828e-02 1.2 0.00e+00 0.0 5.4e+04 7.5e+02 6.0e+01 0 0 9 4 4 0 0 9 4 4 0
MatZeroEntries 9 1.0 1.7683e-03 2.8 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatView 21 1.4 5.7901e-03 4.3 0.00e+00 0.0 0.0e+00 0.0e+00 1.5e+01 0 0 0 0 1 0 0 0 0 1 0
MatAXPY 9 1.0 2.3933e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatMatMult 9 1.0 1.0349e-01 1.0 4.64e+06 2.2 3.0e+04 7.4e+02 1.1e+02 0 1 5 2 7 0 1 5 2 7 1136
MatMatMultSym 9 1.0 8.4956e-02 1.0 0.00e+00 0.0 2.4e+04 6.8e+02 1.1e+02 0 0 4 2 7 0 0 4 2 7 0
MatMatMultNum 9 1.0 1.9249e-02 1.1 4.64e+06 2.2 5.1e+03 1.0e+03 0.0e+00 0 1 1 1 0 0 1 1 1 0 6108
MatPtAP 9 1.0 5.3590e-01 1.0 6.57e+07 2.3 4.7e+04 5.9e+03 1.4e+02 0 8 8 30 9 0 8 8 30 9 3097
MatPtAPSymbolic 9 1.0 3.5159e-01 1.0 0.00e+00 0.0 2.9e+04 5.2e+03 6.3e+01 0 0 5 16 4 0 0 5 16 4 0
MatPtAPNumeric 9 1.0 1.8422e-01 1.0 6.57e+07 2.3 1.8e+04 6.9e+03 7.2e+01 0 8 3 14 5 0 8 3 14 5 9008
MatGetLocalMat 27 1.0 1.3379e-02 3.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatGetBrAoCol 27 1.0 2.5772e-02 1.3 0.00e+00 0.0 3.6e+04 3.6e+03 0.0e+00 0 0 6 14 0 0 0 6 14 0 0
KSPGMRESOrthog 90 1.0 3.4506e-02 4.2 1.55e+07 3.0 0.0e+00 0.0e+00 9.0e+01 0 2 0 0 6 0 2 0 0 6 10501
KSPSetUp 36 1.0 7.0966e-03 2.6 0.00e+00 0.0 0.0e+00 0.0e+00 2.4e+01 0 0 0 0 2 0 0 0 0 2 0
KSPSolve 3 1.0 9.5439e-01 1.0 6.41e+08 2.2 3.9e+05 9.8e+02 3.7e+02 1 82 64 42 25 1 82 64 42 26 16969
PCGAMGGraph_AGG 9 1.0 2.9486e-01 1.0 4.64e+06 2.2 1.5e+04 6.9e+02 1.1e+02 0 1 2 1 7 0 1 2 1 7 399
PCGAMGCoarse_AGG 9 1.0 2.6490e-02 1.1 0.00e+00 0.0 5.4e+04 7.5e+02 6.0e+01 0 0 9 4 4 0 0 9 4 4 0
PCGAMGProl_AGG 9 1.0 8.0748e-02 1.0 0.00e+00 0.0 1.5e+04 8.1e+02 1.4e+02 0 0 2 1 10 0 0 2 1 10 0
PCGAMGPOpt_AGG 9 1.0 2.4086e-01 1.0 6.97e+07 2.3 8.1e+04 9.3e+02 3.7e+02 0 9 13 8 25 0 9 13 8 25 7357
GAMG: createProl 9 1.0 6.4623e-01 1.0 7.44e+07 2.3 1.7e+05 8.4e+02 6.8e+02 0 10 27 15 46 0 10 27 15 47 2924
  Graph 18 1.0 2.8964e-01 1.0 4.64e+06 2.2 1.5e+04 6.9e+02 1.1e+02 0 1 2 1 7 0 1 2 1 7 406
  MIS/Agg 9 1.0 2.3962e-02 1.2 0.00e+00 0.0 5.4e+04 7.5e+02 6.0e+01 0 0 9 4 4 0 0 9 4 4 0
  SA: col data 9 1.0 9.0052e-03 1.1 0.00e+00 0.0 1.0e+04 1.0e+03 3.6e+01 0 0 2 1 2 0 0 2 1 2 0
  SA: frmProl0 9 1.0 6.9267e-02 1.0 0.00e+00 0.0 4.7e+03 3.3e+02 7.2e+01 0 0 1 0 5 0 0 1 0 5 0
  SA: smooth 9 1.0 1.3371e-01 1.0 5.57e+06 2.3 3.0e+04 7.4e+02 1.3e+02 0 1 5 2 9 0 1 5 2 9 1059
GAMG: partLevel 9 1.0 5.5452e-01 1.0 6.57e+07 2.3 4.8e+04 5.8e+03 2.9e+02 0 8 8 30 20 0 8 8 30 20 2993
  repartition 3 1.0 1.7596e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.8e+01 0 0 0 0 1 0 0 0 0 1 0
  Invert-Sort 3 1.0 6.5844e-04 2.2 0.00e+00 0.0 0.0e+00 0.0e+00 1.2e+01 0 0 0 0 1 0 0 0 0 1 0
  Move A 3 1.0 6.4088e-03 1.1 0.00e+00 0.0 4.0e+02 1.6e+02 5.1e+01 0 0 0 0 3 0 0 0 0 4 0
  Move P 3 1.0 5.6291e-03 1.1 0.00e+00 0.0 1.9e+02 9.1e+01 5.1e+01 0 0 0 0 3 0 0 0 0 4 0
PCSetUp 6 1.0 1.2264e+00 1.0 1.40e+08 2.3 2.1e+05 2.0e+03 1.0e+03 1 18 35 45 70 1 18 35 45 70 2894
PCSetUpOnBlocks 27 1.0 7.7373e-03 1.6 1.05e+02 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
PCApply 27 1.0 9.0805e-01 1.1 6.00e+08 2.2 3.8e+05 9.3e+02 2.9e+02 1 77 62 38 20 1 77 62 38 20 16705
SFSetGraph 9 1.0 2.6450e-06 2.7 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
SFSetUp 9 1.0 8.5383e-03 2.5 0.00e+00 0.0 7.6e+03 6.9e+02 0.0e+00 0 0 1 1 0 0 0 1 1 0 0
SFBcastBegin 78 1.0 4.1889e-03 2.7 0.00e+00 0.0 4.7e+04 7.6e+02 0.0e+00 0 0 8 4 0 0 0 8 4 0 0
SFBcastEnd 78 1.0 5.7714e-03 3.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
------------------------------------------------------------------------------------------------------------------------

Memory usage is given in bytes:

Object Type          Creations   Destructions     Memory  Descendants' Mem.
Reports information only for process 0.

--- Event Stage 0: Main Stage

              Vector   429            429     10070544     0.
              Matrix   252            252    362421120     0.
      Matrix Coarsen     9              9         6156     0.
           Index Set   147            147       365304     0.
         Vec Scatter    57             57        79800     0.
       Krylov Solver    36             36       314928     0.
      Preconditioner    27             27        29544     0.
              Viewer     5              4         3584     0.
         PetscRandom    18             18        12492     0.
   Star Forest Graph     9              9         8496     0.
========================================================================================================================
Average time to get PetscTime(): 4.1537e-08
Average time for MPI_Barrier(): 3.98215e-06
Average time for zero size MPI_Send(): 1.49463e-06
#PETSc Option Table entries:
--prefix run_a0b0c0d0e0f0g0h0i0_n5_l3
-aggrmeth alla_serial
-beta 10.0
-betaest .true.
-check .false.
-datadt data_distribution_fully_assembled
-dm 3
-dom -1.0
-in_space .true.
-ksp_converged_reason
-ksp_max_it 500
-ksp_monitor
-ksp_norm_type unpreconditioned
-ksp_rtol 1.0e-6
-ksp_type cg
-ksp_view
-l 1
-levelset popcorn
-levelsettol 1.0e-6
-log_view
-lsdom 0.0
-maxl 7
-mg_coarse_sub_pc_factor_mat_ordering_type nd
-mg_coarse_sub_pc_type cholesky
-mg_levels_esteig_ksp_type cg
-no_signal_handler
-nruns 3
-order 1
-pc_gamg_agg_nsmooths 1
-pc_gamg_process_eq_limit 50
-pc_gamg_square_graph 0
-pc_gamg_type agg
-pc_type gamg
-petscrc /gpfs/scratch/upc26/upc26229/par_cell_aggr_poisson/paper/weak_scal_ompi/2nd-w-scal/petscrc-0
-tt 1
-uagg .true.
-wratio 10
-wsolution .false.
#End of PETSc Option Table entries
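For completeness: the -pc_gamg_* entries in this table can also be set from code before KSPSetUp(), which is convenient when sweeping parameters programmatically. A hedged sketch (assuming a KSP already attached to the operators; the option names map to the API roughly as commented):

  #include <petscksp.h>

  /* Sketch: programmatic equivalent of the GAMG flags in the option table. */
  static PetscErrorCode ConfigureGAMG(KSP ksp)
  {
    PC             pc;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
    ierr = PCSetType(pc,PCGAMG);CHKERRQ(ierr);        /* -pc_type gamg */
    ierr = PCGAMGSetType(pc,PCGAMGAGG);CHKERRQ(ierr); /* -pc_gamg_type agg */
    ierr = PCGAMGSetNSmooths(pc,1);CHKERRQ(ierr);     /* -pc_gamg_agg_nsmooths 1 */
    ierr = PCGAMGSetSquareGraph(pc,0);CHKERRQ(ierr);  /* -pc_gamg_square_graph 0 */
    ierr = PCGAMGSetProcEqLim(pc,50);CHKERRQ(ierr);   /* -pc_gamg_process_eq_limit 50 */
    PetscFunctionReturn(0);
  }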
Compiled and linked as in the first run above.
Ending run at Wed Nov 7 01:09:23 CET 2018
Ending script at Wed Nov 7 01:09:23 CET 2018
-------------- next part --------------
KSP Object: 262 MPI processes
  type: cg
  maximum iterations=500, initial guess is zero
  tolerances: relative=1e-06, absolute=1e-50, divergence=10000.
  left preconditioning
  using UNPRECONDITIONED norm type for convergence test
PC Object: 262 MPI processes
  type: gamg
    type is MULTIPLICATIVE, levels=5 cycles=v
      Cycles per PCApply=1
      Using externally compute Galerkin coarse grid matrices
      GAMG specific options
        Threshold for dropping small values in graph on each level = 0. 0. 0.
        Threshold scaling factor for each level not specified = 1.
      AGG specific options
        Symmetric graph false
        Number of levels to square graph 0
        Number smoothing steps 1
  Coarse grid solver -- level -------------------------------
    KSP Object: (mg_coarse_) 262 MPI processes
      type: preonly
      maximum iterations=10000, initial guess is zero
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (mg_coarse_) 262 MPI processes
      type: bjacobi
        number of blocks = 262
        Local solve is same for all blocks, in the following KSP and PC objects:
      KSP Object: (mg_coarse_sub_) 1 MPI processes
        type: preonly
        maximum iterations=1, initial guess is zero
        tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
        left preconditioning
        using NONE norm type for convergence test
      PC Object: (mg_coarse_sub_) 1 MPI processes
        type: cholesky
          out-of-place factorization
          tolerance for zero pivot 2.22045e-14
          matrix ordering: nd
          factor fill ratio given 5., needed 1.
            Factored matrix follows:
              Mat Object: 1 MPI processes
                type: seqsbaij
                rows=4, cols=4
                package used to perform factorization: petsc
                total: nonzeros=10, allocated nonzeros=10
                total number of mallocs used during MatSetValues calls =0
                block size is 1
        linear system matrix = precond matrix:
        Mat Object: 1 MPI processes
          type: seqaij
          rows=4, cols=4
          total: nonzeros=16, allocated nonzeros=16
          total number of mallocs used during MatSetValues calls =0
          using I-node routines: found 1 nodes, limit used is 5
      linear system matrix = precond matrix:
      Mat Object: 262 MPI processes
        type: mpiaij
        rows=4, cols=4
        total: nonzeros=16, allocated nonzeros=16
        total number of mallocs used during MatSetValues calls =0
        using I-node (on process 0) routines: found 1 nodes, limit used is 5
  Down solver (pre-smoother) on level 1 -------------------------------
    KSP Object: (mg_levels_1_) 262 MPI processes
      type: chebyshev
        eigenvalue estimates used: min = 0.129006, max = 1.41907
        eigenvalues estimate via cg min 0.482341, max 1.29006
        eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1]
        KSP Object: (mg_levels_1_esteig_) 262 MPI processes
          type: cg
          maximum iterations=10, initial guess is zero
          tolerances: relative=1e-12, absolute=1e-50, divergence=10000.
          left preconditioning
          using PRECONDITIONED norm type for convergence test
        estimating eigenvalues using noisy right hand side
      maximum iterations=2, nonzero initial guess
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (mg_levels_1_) 262 MPI processes
      type: sor
        type = local_symmetric, iterations = 1, local iterations = 1, omega = 1.
      linear system matrix = precond matrix:
      Mat Object: 262 MPI processes
        type: mpiaij
        rows=284, cols=284
        total: nonzeros=47942, allocated nonzeros=47942
        total number of mallocs used during MatSetValues calls =0
        not using I-node (on process 0) routines
  Up solver (post-smoother) same as down solver (pre-smoother)
  Down solver (pre-smoother) on level 2 -------------------------------
    KSP Object: (mg_levels_2_) 262 MPI processes
      type: chebyshev
        eigenvalue estimates used: min = 0.160435, max = 1.76479
        eigenvalues estimate via cg min 0.0880722, max 1.60435
        eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1]
        KSP Object: (mg_levels_2_esteig_) 262 MPI processes
          type: cg
          maximum iterations=10, initial guess is zero
          tolerances: relative=1e-12, absolute=1e-50, divergence=10000.
          left preconditioning
          using PRECONDITIONED norm type for convergence test
        estimating eigenvalues using noisy right hand side
      maximum iterations=2, nonzero initial guess
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (mg_levels_2_) 262 MPI processes
      type: sor
        type = local_symmetric, iterations = 1, local iterations = 1, omega = 1.
      linear system matrix = precond matrix:
      Mat Object: 262 MPI processes
        type: mpiaij
        rows=13842, cols=13842
        total: nonzeros=2801068, allocated nonzeros=2801068
        total number of mallocs used during MatSetValues calls =0
        using nonscalable MatPtAP() implementation
        not using I-node (on process 0) routines
  Up solver (post-smoother) same as down solver (pre-smoother)
  Down solver (pre-smoother) on level 3 -------------------------------
    KSP Object: (mg_levels_3_) 262 MPI processes
      type: chebyshev
        eigenvalue estimates used: min = 0.135811, max = 1.49392
        eigenvalues estimate via cg min 0.036202, max 1.35811
        eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1]
        KSP Object: (mg_levels_3_esteig_) 262 MPI processes
          type: cg
          maximum iterations=10, initial guess is zero
          tolerances: relative=1e-12, absolute=1e-50, divergence=10000.
          left preconditioning
          using PRECONDITIONED norm type for convergence test
        estimating eigenvalues using noisy right hand side
      maximum iterations=2, nonzero initial guess
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (mg_levels_3_) 262 MPI processes
      type: sor
        type = local_symmetric, iterations = 1, local iterations = 1, omega = 1.
      linear system matrix = precond matrix:
      Mat Object: 262 MPI processes
        type: mpiaij
        rows=319856, cols=319856
        total: nonzeros=25116236, allocated nonzeros=25116236
        total number of mallocs used during MatSetValues calls =0
        using scalable MatPtAP() implementation
        not using I-node (on process 0) routines
  Up solver (post-smoother) same as down solver (pre-smoother)
  Down solver (pre-smoother) on level 4 -------------------------------
    KSP Object: (mg_levels_4_) 262 MPI processes
      type: chebyshev
        eigenvalue estimates used: min = 0.298538, max = 3.28392
        eigenvalues estimate via cg min 0.0506704, max 2.98538
        eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1]
        KSP Object: (mg_levels_4_esteig_) 262 MPI processes
          type: cg
          maximum iterations=10, initial guess is zero
          tolerances: relative=1e-12, absolute=1e-50, divergence=10000.
          left preconditioning
          using PRECONDITIONED norm type for convergence test
        estimating eigenvalues using noisy right hand side
      maximum iterations=2, nonzero initial guess
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (mg_levels_4_) 262 MPI processes
      type: sor
        type = local_symmetric, iterations = 1, local iterations = 1, omega = 1.
      linear system matrix = precond matrix:
      Mat Object: 262 MPI processes
        type: mpiaij
        rows=4068981, cols=4068981
        total: nonzeros=120055495, allocated nonzeros=2441388600
        total number of mallocs used during MatSetValues calls =0
        not using I-node (on process 0) routines
  Up solver (post-smoother) same as down solver (pre-smoother)
  linear system matrix = precond matrix:
  Mat Object: 262 MPI processes
    type: mpiaij
    rows=4068981, cols=4068981
    total: nonzeros=120055495, allocated nonzeros=2441388600
    total number of mallocs used during MatSetValues calls =0
    not using I-node (on process 0) routines
---------------------------------------------- PETSc Performance Summary: ----------------------------------------------
/gpfs/scratch/upc26/upc26229/build_rel_fempar_cell_agg_ompi/FEMPAR/bin/par_test_h_adaptive_poisson_unfitted on a arch-linux2-c-opt named s07r1b55 with 262 processors, by upc26229 Wed Nov 7 01:14:20 2018
Using Petsc Release Version 3.9.0, Apr, 07, 2018

                         Max         Max/Min          Avg      Total
Time (sec):           2.359e+02      1.00000       2.359e+02
Objects:              1.355e+03      1.00222       1.352e+03
Flop:                 8.832e+08      0.00000       6.569e+08  1.721e+11
Flop/sec:             3.743e+06      0.00000       2.784e+06  7.295e+08
MPI Messages:         5.577e+04   5069.63636       3.334e+04  8.736e+06
MPI Message Lengths:  4.973e+07 1130198.90909      1.019e+03  8.904e+09
MPI Reductions:       2.072e+03      1.00000

Summary of Stages:  ----- Time ------  ----- Flop -----  --- Messages ---  -- Message Lengths --  -- Reductions --
                      Avg    %Total      Avg    %Total    counts  %Total     Avg       %Total     counts  %Total
 0:  Main Stage: 2.3593e+02 100.0%  1.7210e+11 100.0%  8.736e+06 100.0%  1.019e+03      100.0%  2.059e+03  99.4%
------------------------------------------------------------------------------------------------------------------------
Event                Count      Time (sec)     Flop                             --- Global ---  --- Stage ---   Total
                   Max Ratio  Max     Ratio   Max  Ratio  Mess   Avg len Reduct %T %F %M %L %R  %T %F %M %L %R Mflop/s
------------------------------------------------------------------------------------------------------------------------

--- Event Stage 0: Main Stage

BuildTwoSided 12 1.0 1.6995e-02 7.1 0.00e+00 0.0 3.0e+04 8.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
BuildTwoSidedF 102 1.0 1.0505e+01 33.9 0.00e+00 0.0 1.3e+05 1.6e+04 0.0e+00 3 0 1 24 0 3 0 1 24 0 0
VecMDot 120 1.0 1.0321e-01 9.5 7.43e+06 0.0 0.0e+00 0.0e+00 1.2e+02 0 1 0 0 6 0 1 0 0 6 14077
VecTDot 318 1.0 1.3620e+00 61.0 5.57e+06 0.0 0.0e+00 0.0e+00 3.2e+02 0 1 0 0 15 0 1 0 0 15 802
VecNorm 300 1.0 2.2448e-01 10.2 4.46e+06 0.0 0.0e+00 0.0e+00 3.0e+02 0 1 0 0 14 0 1 0 0 15 3894
VecScale 132 1.0 4.2758e-03 29.9 7.43e+05 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 33981
VecCopy 174 1.0 1.9950e-03 55.6 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
VecSet 651 1.0 1.4187e-03 14.7 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
VecAXPY 318 1.0 2.0724e-02 148.6 5.57e+06 0.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 52686
VecAYPX 1194 1.0 1.5941e-02 106.7 9.89e+06 0.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 121379
VecAXPBYCZ 528 1.0 9.4921e-03 156.6 1.49e+07 0.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 2 0 0 0 306144
VecMAXPY 132 1.0 7.9654e-03 56.3 8.78e+06 0.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 215576
VecAssemblyBegin 33 1.0 1.9882e-02 3.0 0.00e+00 0.0 1.7e+04 1.7e+03 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
VecAssemblyEnd 33 1.0 4.4571e-03 149.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
VecPointwiseMult 132 1.0 1.8402e-03 62.2 7.43e+05 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 78959
VecScatterBegin 1371 1.0 6.3832e-02 215.6 0.00e+00 0.0 6.4e+06 7.3e+02 0.0e+00 0 0 73 52 0 0 0 73 52 0 0
VecScatterEnd 1371 1.0 1.2430e+00 6808.9 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
VecSetRandom 12 1.0 2.0955e-03 958.7 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
VecNormalize 132 1.0 2.2506e-02 4.3 2.23e+06 0.0 0.0e+00 0.0e+00 1.3e+02 0 0 0 0 6 0 0 0 0 6 19368
MatMult 1065 1.0 1.1689e+00 1737.8 4.26e+08 0.0 5.4e+06 8.1e+02 0.0e+00 0 48 62 49 0 0 48 62 49 0 71034
MatMultAdd 132 1.0 8.6626e-02 1090.7 1.08e+07 0.0 3.7e+05 1.9e+02 0.0e+00 0 1 4 1 0 0 1 4 1 0 24436
MatMultTranspose 132 1.0 5.6369e-01 3353.5 1.08e+07 0.0 3.7e+05 1.9e+02 0.0e+00 0 1 4 1 0 0 1 4 1 0 3755
MatSolve 33 0.0 5.7641e-05 0.0 9.24e+02 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 16
MatSOR 924 1.0 7.8933e-01 5774.2 3.08e+08 0.0 0.0e+00 0.0e+00 0.0e+00 0 34 0 0 0 0 34 0 0 0 74391
MatCholFctrSym 3 1.0 1.0960e-02 545.9 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatCholFctrNum 3 1.0 6.4946e-03 2309.9 1.20e+01 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatConvert 12 1.0 3.4631e-02 30.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatScale 36 1.0 2.6028e-02 184.6 5.48e+06 0.0 6.1e+04 7.7e+02 0.0e+00 0 1 1 1 0 0 1 1 1 0 41516
MatResidual 132 1.0 1.8136e-01 1415.7 5.00e+07 0.0 6.7e+05 7.7e+02 0.0e+00 0 6 8 6 0 0 6 8 6 0 53866
MatAssemblyBegin 231 1.0 1.0487e+01 18.5 0.00e+00 0.0 1.1e+05 1.9e+04 0.0e+00 3 0 1 23 0 3 0 1 23 0 0
MatAssemblyEnd 231 1.0 4.6673e-01 1.6 0.00e+00 0.0 5.3e+05 2.0e+02 5.0e+02 0 0 6 1 24 0 0 6 1 24 0
MatGetRow 202635 0.0 2.5265e-02 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatGetRowIJ 3 0.0 1.2548e-02 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatCreateSubMat 12 1.0 5.2781e-02 1.0 0.00e+00 0.0 5.4e+03 4.8e+02 1.9e+02 0 0 0 0 9 0 0 0 0 9 0
MatGetOrdering 3 0.0 1.7776e-02 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatCoarsen 12 1.0 4.3461e-02 1.3 0.00e+00 0.0 1.1e+06 4.1e+02 1.2e+02 0 0 13 5 6 0 0 13 5 6 0
MatZeroEntries 12 1.0 1.7174e-03 325.9 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatView 24 1.3 6.3856e-02 3.2 0.00e+00 0.0 0.0e+00 0.0e+00 1.8e+01 0 0 0 0 1 0 0 0 0 1 0
MatAXPY 12 1.0 2.7936e-02 1.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatMatMult 12 1.0 3.3930e-01 1.1 4.54e+06 0.0 3.5e+05 5.5e+02 1.5e+02 0 1 4 2 7 0 1 4 2 7 2618
MatMatMultSym 12 1.0 2.5782e-01 1.0 0.00e+00 0.0 2.9e+05 5.1e+02 1.4e+02 0 0 3 2 7 0 0 3 2 7 0
MatMatMultNum 12 1.0 5.1971e-02 1.1 4.54e+06 0.0 6.1e+04 7.7e+02 0.0e+00 0 1 1 1 0 0 1 1 1 0 17089
MatPtAP 12 1.0 1.0058e+00 1.0 6.57e+07 0.0 6.6e+05 3.9e+03 1.8e+02 0 7 8 29 9 0 7 8 29 9 12715
MatPtAPSymbolic 12 1.0 5.2416e-01 1.0 0.00e+00 0.0 3.5e+05 4.0e+03 8.4e+01 0 0 4 16 4 0 0 4 16 4 0
MatPtAPNumeric 12 1.0 4.7255e-01 1.0 6.57e+07 0.0 3.2e+05 3.8e+03 9.6e+01 0 7 4 13 5 0 7 4 13 5 27062
MatGetLocalMat 36 1.0 1.3884e-02 37.7 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatGetBrAoCol 36 1.0 5.1478e-02 93.5 0.00e+00 0.0 4.3e+05 2.7e+03 0.0e+00 0 0 5 13 0 0 0 5 13 0 0
KSPGMRESOrthog 120 1.0 1.0809e-01 7.9 1.49e+07 0.0 0.0e+00 0.0e+00 1.2e+02 0 2 0 0 6 0 2 0 0 6 26883
KSPSetUp 45 1.0 1.8355e-02 6.9 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+01 0 0 0 0 1 0 0 0 0 1 0
KSPSolve 3 1.0 1.5817e+00 1.0 7.46e+08 0.0 5.6e+06 7.3e+02 4.9e+02 1 84 64 45 23 1 84 64 45 24 91555
PCGAMGGraph_AGG 12 1.0 3.4556e-01 1.0 4.54e+06 0.0 1.8e+05 5.2e+02 1.4e+02 0 1 2 1 7 0 1 2 1 7 2570
PCGAMGCoarse_AGG 12 1.0 4.4724e-02 1.2 0.00e+00 0.0 1.1e+06 4.1e+02 1.2e+02 0 0 13 5 6 0 0 13 5 6 0
PCGAMGProl_AGG 12 1.0 3.5267e-01 1.0 0.00e+00 0.0 1.7e+05 6.4e+02 1.9e+02 0 0 2 1 9 0 0 2 1 9 0
PCGAMGPOpt_AGG 12 1.0 5.1901e-01 1.0 6.89e+07 0.0 9.6e+05 6.9e+02 5.0e+02 0 8 11 7 24 0 8 11 7 24 26218
GAMG: createProl 12 1.0 1.2634e+00 1.0 7.34e+07 0.0 2.4e+06 5.4e+02 9.5e+02 1 8 28 15 46 1 8 28 15 46 11473
  Graph 24 1.0 3.3748e-01 1.0 4.54e+06 0.0 1.8e+05 5.2e+02 1.4e+02 0 1 2 1 7 0 1 2 1 7 2632
  MIS/Agg 12 1.0 4.3569e-02 1.3 0.00e+00 0.0 1.1e+06 4.1e+02 1.2e+02 0 0 13 5 6 0 0 13 5 6 0
  SA: col data 12 1.0 1.2331e-02 1.1 0.00e+00 0.0 1.2e+05 7.7e+02 4.8e+01 0 0 1 1 2 0 0 1 1 2 0
  SA: frmProl0 12 1.0 3.3684e-01 1.0 0.00e+00 0.0 4.7e+04 2.9e+02 9.6e+01 0 0 1 0 5 0 0 1 0 5 0
  SA: smooth 12 1.0 3.7167e-01 1.1 5.48e+06 0.0 3.5e+05 5.5e+02 1.7e+02 0 1 4 2 8 0 1 4 2 8 2907
GAMG: partLevel 12 1.0 1.0910e+00 1.0 6.57e+07 0.0 6.7e+05 3.9e+03 4.9e+02 0 7 8 29 24 0 7 8 29 24 11721
  repartition 6 1.0 8.1465e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.6e+01 0 0 0 0 2 0 0 0 0 2 0
  Invert-Sort 6 1.0 3.8278e-03 1.5 0.00e+00 0.0 0.0e+00 0.0e+00 2.4e+01 0 0 0 0 1 0 0 0 0 1 0
  Move A 6 1.0 4.3095e-02 1.0 0.00e+00 0.0 2.9e+03 8.1e+02 1.0e+02 0 0 0 0 5 0 0 0 0 5 0
  Move P 6 1.0 1.6401e-02 1.1 0.00e+00 0.0 2.5e+03 9.0e+01 1.0e+02 0 0 0 0 5 0 0 0 0 5 0
PCSetUp 6 1.0 2.4101e+00 1.0 1.38e+08 0.0 3.1e+06 1.3e+03 1.5e+03 1 16 36 44 73 1 16 36 44 73 11321
PCSetUpOnBlocks 33 1.0 1.9296e-02 5.9 1.20e+01 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
PCApply 33 1.0 1.4915e+00 5.2 6.98e+08 0.0 5.4e+06 6.9e+02 3.8e+02 1 79 62 42 19 1 79 62 42 19 90795
SFSetGraph 12 1.0 3.8510e-06 3.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
SFSetUp 12 1.0 1.7290e-02 3.4 0.00e+00 0.0 9.1e+04 5.2e+02 0.0e+00 0 0 1 1 0 0 0 1 1 0 0
SFBcastBegin 144 1.0 8.3527e-03 54.1 0.00e+00 0.0 1.0e+06 4.0e+02 0.0e+00 0 0 12 5 0 0 0 12 5 0 0
SFBcastEnd 144 1.0 1.6454e-02 500.8 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
------------------------------------------------------------------------------------------------------------------------

Memory usage is given in bytes:

Object Type          Creations   Destructions     Memory  Descendants' Mem.
Reports information only for process 0.

--- Event Stage 0: Main Stage

              Vector   573            573      1139304     0.
              Matrix   348            348      6730200     0.
      Matrix Coarsen    12             12         8208     0.
           Index Set   222            222       228240     0.
         Vec Scatter    81             81       113448     0.
       Krylov Solver    45             45       415944     0.
      Preconditioner    33             33        35448     0.
              Viewer     5              4         3584     0.
         PetscRandom    24             24        16656     0.
   Star Forest Graph    12             12        11328     0.
========================================================================================================================
Average time to get PetscTime(): 4.40516e-08
Average time for MPI_Barrier(): 1.38046e-05
Average time for zero size MPI_Send(): 1.46144e-06
#PETSc Option Table entries:
--prefix run_a0b0c0d0e0f0g0h0i0_n6_l3
-aggrmeth alla_serial
-beta 10.0
-betaest .true.
-check .false.
-datadt data_distribution_fully_assembled
-dm 3
-dom -1.0
-in_space .true.
-ksp_converged_reason
-ksp_max_it 500
-ksp_monitor
-ksp_norm_type unpreconditioned
-ksp_rtol 1.0e-6
-ksp_type cg
-ksp_view
-l 1
-levelset popcorn
-levelsettol 1.0e-6
-log_view
-lsdom 0.0
-maxl 8
-mg_coarse_sub_pc_factor_mat_ordering_type nd
-mg_coarse_sub_pc_type cholesky
-mg_levels_esteig_ksp_type cg
-no_signal_handler
-nruns 3
-order 1
-pc_gamg_agg_nsmooths 1
-pc_gamg_process_eq_limit 50
-pc_gamg_square_graph 0
-pc_gamg_type agg
-pc_type gamg
-petscrc /gpfs/scratch/upc26/upc26229/par_cell_aggr_poisson/paper/weak_scal_ompi/2nd-w-scal/petscrc-0
-tt 1
-uagg .true.
-wratio 10
-wsolution .false.
#End of PETSc Option Table entries
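One detail the ksp_view output above makes visible: on the finest level the operator reports 120055495 used nonzeros against 2441388600 allocated, i.e. the preallocation overestimates by roughly a factor of 20 (the same pattern appears in the other runs). Exact per-row preallocation would reclaim that memory. A minimal sketch, assuming the application can count its row entries; d_nnz/o_nnz are hypothetical arrays, not taken from this code:

  #include <petscmat.h>

  /* Sketch: exact preallocation of an MPIAIJ matrix from per-row counts
     for the diagonal (d_nnz) and off-diagonal (o_nnz) blocks. Computing
     the counts is application-specific and not shown here. */
  static PetscErrorCode CreatePreallocated(MPI_Comm comm, PetscInt nlocal, PetscInt N,
                                           const PetscInt d_nnz[], const PetscInt o_nnz[],
                                           Mat *A)
  {
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = MatCreate(comm,A);CHKERRQ(ierr);
    ierr = MatSetSizes(*A,nlocal,nlocal,N,N);CHKERRQ(ierr);
    ierr = MatSetType(*A,MATMPIAIJ);CHKERRQ(ierr);
    ierr = MatMPIAIJSetPreallocation(*A,0,d_nnz,0,o_nnz);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }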
Compiled and linked as in the first run above.
Ending run at Wed Nov 7 01:14:21 CET 2018
Ending script at Wed Nov 7 01:14:21 CET 2018
-------------- next part --------------
KSP Object: 2097 MPI processes
  type: cg
  maximum iterations=500, initial guess is zero
  tolerances: relative=1e-06, absolute=1e-50, divergence=10000.
  left preconditioning
  using UNPRECONDITIONED norm type for convergence test
PC Object: 2097 MPI processes
  type: gamg
    type is MULTIPLICATIVE, levels=5 cycles=v
      Cycles per PCApply=1
      Using externally compute Galerkin coarse grid matrices
      GAMG specific options
        Threshold for dropping small values in graph on each level = 0. 0. 0.
        Threshold scaling factor for each level not specified = 1.
      AGG specific options
        Symmetric graph false
        Number of levels to square graph 0
        Number smoothing steps 1
  Coarse grid solver -- level -------------------------------
    KSP Object: (mg_coarse_) 2097 MPI processes
      type: preonly
      maximum iterations=10000, initial guess is zero
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (mg_coarse_) 2097 MPI processes
      type: bjacobi
        number of blocks = 2097
        Local solve is same for all blocks, in the following KSP and PC objects:
      KSP Object: (mg_coarse_sub_) 1 MPI processes
        type: preonly
        maximum iterations=1, initial guess is zero
        tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
        left preconditioning
        using NONE norm type for convergence test
      PC Object: (mg_coarse_sub_) 1 MPI processes
        type: cholesky
          out-of-place factorization
          tolerance for zero pivot 2.22045e-14
          matrix ordering: nd
          factor fill ratio given 5., needed 1.
            Factored matrix follows:
              Mat Object: 1 MPI processes
                type: seqsbaij
                rows=36, cols=36
                package used to perform factorization: petsc
                total: nonzeros=666, allocated nonzeros=666
                total number of mallocs used during MatSetValues calls =0
                block size is 1
        linear system matrix = precond matrix:
        Mat Object: 1 MPI processes
          type: seqaij
          rows=36, cols=36
          total: nonzeros=1296, allocated nonzeros=1296
          total number of mallocs used during MatSetValues calls =0
          using I-node routines: found 8 nodes, limit used is 5
      linear system matrix = precond matrix:
      Mat Object: 2097 MPI processes
        type: mpiaij
        rows=36, cols=36
        total: nonzeros=1296, allocated nonzeros=1296
        total number of mallocs used during MatSetValues calls =0
        using I-node (on process 0) routines: found 8 nodes, limit used is 5
  Down solver (pre-smoother) on level 1 -------------------------------
    KSP Object: (mg_levels_1_) 2097 MPI processes
      type: chebyshev
        eigenvalue estimates used: min = 0.167617, max = 1.84379
        eigenvalues estimate via cg min 0.106378, max 1.67617
        eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1]
        KSP Object: (mg_levels_1_esteig_) 2097 MPI processes
          type: cg
          maximum iterations=10, initial guess is zero
          tolerances: relative=1e-12, absolute=1e-50, divergence=10000.
          left preconditioning
          using PRECONDITIONED norm type for convergence test
        estimating eigenvalues using noisy right hand side
      maximum iterations=2, nonzero initial guess
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (mg_levels_1_) 2097 MPI processes
      type: sor
        type = local_symmetric, iterations = 1, local iterations = 1, omega = 1.
      linear system matrix = precond matrix:
      Mat Object: 2097 MPI processes
        type: mpiaij
        rows=2304, cols=2304
        total: nonzeros=598220, allocated nonzeros=598220
        total number of mallocs used during MatSetValues calls =0
        not using I-node (on process 0) routines
  Up solver (post-smoother) same as down solver (pre-smoother)
  Down solver (pre-smoother) on level 2 -------------------------------
    KSP Object: (mg_levels_2_) 2097 MPI processes
      type: chebyshev
        eigenvalue estimates used: min = 0.14326, max = 1.57586
        eigenvalues estimate via cg min 0.0397147, max 1.4326
        eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1]
        KSP Object: (mg_levels_2_esteig_) 2097 MPI processes
          type: cg
          maximum iterations=10, initial guess is zero
          tolerances: relative=1e-12, absolute=1e-50, divergence=10000.
          left preconditioning
          using PRECONDITIONED norm type for convergence test
        estimating eigenvalues using noisy right hand side
      maximum iterations=2, nonzero initial guess
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (mg_levels_2_) 2097 MPI processes
      type: sor
        type = local_symmetric, iterations = 1, local iterations = 1, omega = 1.
      linear system matrix = precond matrix:
      Mat Object: 2097 MPI processes
        type: mpiaij
        rows=112580, cols=112580
        total: nonzeros=23420598, allocated nonzeros=23420598
        total number of mallocs used during MatSetValues calls =0
        using scalable MatPtAP() implementation
        not using I-node (on process 0) routines
  Up solver (post-smoother) same as down solver (pre-smoother)
  Down solver (pre-smoother) on level 3 -------------------------------
    KSP Object: (mg_levels_3_) 2097 MPI processes
      type: chebyshev
        eigenvalue estimates used: min = 0.136096, max = 1.49705
        eigenvalues estimate via cg min 0.0338371, max 1.36096
        eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1]
        KSP Object: (mg_levels_3_esteig_) 2097 MPI processes
          type: cg
          maximum iterations=10, initial guess is zero
          tolerances: relative=1e-12, absolute=1e-50, divergence=10000.
          left preconditioning
          using PRECONDITIONED norm type for convergence test
        estimating eigenvalues using noisy right hand side
      maximum iterations=2, nonzero initial guess
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (mg_levels_3_) 2097 MPI processes
      type: sor
        type = local_symmetric, iterations = 1, local iterations = 1, omega = 1.
      linear system matrix = precond matrix:
      Mat Object: 2097 MPI processes
        type: mpiaij
        rows=2597459, cols=2597459
        total: nonzeros=202116477, allocated nonzeros=202116477
        total number of mallocs used during MatSetValues calls =0
        using scalable MatPtAP() implementation
        not using I-node (on process 0) routines
  Up solver (post-smoother) same as down solver (pre-smoother)
  Down solver (pre-smoother) on level 4 -------------------------------
    KSP Object: (mg_levels_4_) 2097 MPI processes
      type: chebyshev
        eigenvalue estimates used: min = 0.335857, max = 3.69443
        eigenvalues estimate via cg min 0.0542715, max 3.35857
        eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1]
        KSP Object: (mg_levels_4_esteig_) 2097 MPI processes
          type: cg
          maximum iterations=10, initial guess is zero
          tolerances: relative=1e-12, absolute=1e-50, divergence=10000.
          left preconditioning
          using PRECONDITIONED norm type for convergence test
        estimating eigenvalues using noisy right hand side
      maximum iterations=2, nonzero initial guess
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (mg_levels_4_) 2097 MPI processes
      type: sor
        type = local_symmetric, iterations = 1, local iterations = 1, omega = 1.
      linear system matrix = precond matrix:
      Mat Object: 2097 MPI processes
        type: mpiaij
        rows=32552439, cols=32552439
        total: nonzeros=920267663, allocated nonzeros=19531463400
        total number of mallocs used during MatSetValues calls =0
        not using I-node (on process 0) routines
  Up solver (post-smoother) same as down solver (pre-smoother)
  linear system matrix = precond matrix:
  Mat Object: 2097 MPI processes
    type: mpiaij
    rows=32552439, cols=32552439
    total: nonzeros=920267663, allocated nonzeros=19531463400
    total number of mallocs used during MatSetValues calls =0
    not using I-node (on process 0) routines
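Grid and operator complexity can be read straight off the hierarchy above: the levels coarsen 32552439 -> 2597459 -> 112580 -> 2304 -> 36 rows. Summing the nonzeros over all five levels and dividing by the fine-level count gives the operator complexity; as a worked equation:

  C_{op} = \frac{\sum_{\ell=0}^{4} \mathrm{nnz}(A_\ell)}{\mathrm{nnz}(A_0)}
         = \frac{920267663 + 202116477 + 23420598 + 598220 + 1296}{920267663} \approx 1.25

a modest value, consistent with the single smoothing step (-pc_gamg_agg_nsmooths 1) configured above.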
---------------------------------------------- PETSc Performance Summary: ----------------------------------------------
/gpfs/scratch/upc26/upc26229/build_rel_fempar_cell_agg_ompi/FEMPAR/bin/par_test_h_adaptive_poisson_unfitted on a arch-linux2-c-opt named s07r2b02 with 2097 processors, by upc26229 Wed Nov 7 01:15:12 2018
Using Petsc Release Version 3.9.0, Apr, 07, 2018

                         Max         Max/Min          Avg      Total
Time (sec):           2.458e+02      1.00000       2.458e+02
Objects:              1.355e+03      1.00222       1.352e+03
Flop:                 9.818e+08      0.00000       6.789e+08  1.424e+12
Flop/sec:             3.994e+06      0.00000       2.762e+06  5.791e+09
MPI Messages:         1.103e+05  10027.81818       4.452e+04  9.336e+07
MPI Message Lengths:  5.692e+07 1293601.40909      8.203e+02  7.658e+10
MPI Reductions:       2.216e+03      1.00000

Summary of Stages:  ----- Time ------  ----- Flop -----  --- Messages ---  -- Message Lengths --  -- Reductions --
                      Avg    %Total      Avg    %Total    counts  %Total     Avg       %Total     counts  %Total
 0:  Main Stage: 2.4583e+02 100.0%  1.4237e+12 100.0%  9.336e+07 100.0%  8.203e+02      100.0%  2.203e+03  99.4%
%T - percent time in this phase %F - percent flop in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flop over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ Event Count Time (sec) Flop --- Global --- --- Stage --- Total Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage BuildTwoSided 12 1.0 4.6126e-02 2.1 0.00e+00 0.0 2.9e+05 8.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 BuildTwoSidedF 102 1.0 1.1057e+0119.1 0.00e+00 0.0 1.1e+06 1.5e+04 0.0e+00 3 0 1 22 0 3 0 1 22 0 0 VecMDot 120 1.0 1.3730e-01 4.0 7.37e+06 0.0 0.0e+00 0.0e+00 1.2e+02 0 1 0 0 5 0 1 0 0 5 84752 VecTDot 324 1.0 1.8522e+0019.8 5.77e+06 0.0 0.0e+00 0.0e+00 3.2e+02 0 1 0 0 15 0 1 0 0 15 4929 VecNorm 303 1.0 3.3777e-01 3.6 4.55e+06 0.0 0.0e+00 0.0e+00 3.0e+02 0 1 0 0 14 0 1 0 0 14 21298 VecScale 132 1.0 6.1834e-0351.4 7.37e+05 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 188203 VecCopy 186 1.0 2.5507e-03101.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSet 696 1.0 4.9682e-0364.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 324 1.0 1.8679e-02143.7 5.77e+06 0.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 488834 VecAYPX 1293 1.0 1.8702e-02166.9 1.06e+07 0.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 895518 VecAXPBYCZ 576 1.0 1.2857e-02233.9 1.61e+07 0.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 2 0 0 0 1974899 VecMAXPY 132 1.0 8.0905e-03119.3 8.71e+06 0.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 1699935 VecAssemblyBegin 33 1.0 2.7789e-02 2.2 0.00e+00 0.0 1.3e+05 1.8e+03 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAssemblyEnd 33 1.0 5.7797e-03196.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecPointwiseMult 132 1.0 3.0804e-03165.9 7.37e+05 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 377790 VecScatterBegin 1470 1.0 9.9138e-02609.8 0.00e+00 0.0 6.5e+07 6.3e+02 0.0e+00 0 0 69 53 0 0 0 69 53 0 0 VecScatterEnd 1470 1.0 1.6594e+0012314.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSetRandom 12 1.0 2.2327e-031311.8 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecNormalize 132 1.0 5.1942e-02 1.7 2.21e+06 0.0 0.0e+00 0.0e+00 1.3e+02 0 0 0 0 6 0 0 0 0 6 67213 MatMult 1140 1.0 1.5379e+002703.7 4.83e+08 0.0 5.4e+07 7.0e+02 0.0e+00 0 48 58 50 0 0 48 58 50 0 447478 MatMultAdd 144 1.0 3.0789e-014466.1 1.18e+07 0.0 4.2e+06 1.5e+02 0.0e+00 0 1 5 1 0 0 1 5 1 0 59866 MatMultTranspose 144 1.0 7.5095e-015720.6 1.18e+07 0.0 4.2e+06 1.5e+02 0.0e+00 0 1 5 1 0 0 1 5 1 0 24545 MatSolve 36 0.0 1.6145e-04 0.0 9.20e+04 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 570 MatSOR 996 1.0 9.8081e-018037.0 3.40e+08 0.0 0.0e+00 0.0e+00 0.0e+00 0 34 0 0 0 0 34 0 0 0 497713 MatCholFctrSym 3 1.0 1.0764e-02556.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCholFctrNum 3 1.0 7.9658e-032816.4 1.08e+02 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatConvert 12 1.0 3.8779e-0251.6 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatScale 36 1.0 3.0256e-02564.4 5.74e+06 0.0 5.7e+05 6.7e+02 0.0e+00 0 1 1 1 0 0 1 1 1 0 278106 MatResidual 144 1.0 2.4749e-012126.4 5.77e+07 0.0 6.9e+06 6.7e+02 0.0e+00 0 6 7 6 0 0 6 7 6 0 333518 MatAssemblyBegin 231 1.0 
1.1035e+0115.1 0.00e+00 0.0 9.7e+05 1.7e+04 0.0e+00 3 0 1 22 0 3 0 1 22 0 0 MatAssemblyEnd 231 1.0 5.6362e-01 1.5 0.00e+00 0.0 5.5e+06 1.6e+02 5.0e+02 0 0 6 1 23 0 0 6 1 23 0 MatGetRow 201060 0.0 2.9343e-02 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetRowIJ 3 0.0 4.4128e-05 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCreateSubMat 12 1.0 1.0294e-01 1.1 0.00e+00 0.0 2.6e+05 1.3e+02 1.9e+02 0 0 0 0 9 0 0 0 0 9 0 MatGetOrdering 3 0.0 1.7719e-04 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCoarsen 12 1.0 1.3539e-01 1.2 0.00e+00 0.0 1.7e+07 3.0e+02 2.5e+02 0 0 18 7 11 0 0 18 7 11 0 MatZeroEntries 12 1.0 1.8496e-03427.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatView 24 1.3 6.3806e-02 5.8 0.00e+00 0.0 0.0e+00 0.0e+00 1.8e+01 0 0 0 0 1 0 0 0 0 1 0 MatAXPY 12 1.0 3.7202e-02 1.6 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatMatMult 12 1.0 7.9738e-01 1.5 4.80e+06 0.0 3.3e+06 4.9e+02 1.5e+02 0 0 3 2 7 0 0 3 2 7 8626 MatMatMultSym 12 1.0 4.7848e-01 1.0 0.00e+00 0.0 2.7e+06 4.5e+02 1.4e+02 0 0 3 2 6 0 0 3 2 7 0 MatMatMultNum 12 1.0 7.1849e-02 1.1 4.80e+06 0.0 5.7e+05 6.7e+02 0.0e+00 0 0 1 1 0 0 0 1 1 0 95734 MatPtAP 12 1.0 1.3269e+00 1.0 6.99e+07 0.0 6.5e+06 3.3e+03 1.9e+02 1 7 7 28 8 1 7 7 28 8 75292 MatPtAPSymbolic 12 1.0 7.0335e-01 1.0 0.00e+00 0.0 3.2e+06 3.6e+03 8.4e+01 0 0 3 15 4 0 0 3 15 4 0 MatPtAPNumeric 12 1.0 6.0478e-01 1.0 6.99e+07 0.0 3.3e+06 3.0e+03 9.6e+01 0 7 4 13 4 0 7 4 13 4 165193 MatGetLocalMat 36 1.0 1.9234e-0255.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetBrAoCol 36 1.0 1.0496e-01201.1 0.00e+00 0.0 4.0e+06 2.4e+03 0.0e+00 0 0 4 13 0 0 0 4 13 0 0 KSPGMRESOrthog 120 1.0 1.4184e-01 3.7 1.47e+07 0.0 0.0e+00 0.0e+00 1.2e+02 0 2 0 0 5 0 2 0 0 5 164085 KSPSetUp 45 1.0 7.5013e-02 5.8 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+01 0 0 0 0 1 0 0 0 0 1 0 KSPSolve 3 1.0 2.1762e+00 1.0 8.36e+08 0.0 5.7e+07 6.3e+02 5.0e+02 1 85 61 47 22 1 85 61 47 22 556264 PCGAMGGraph_AGG 12 1.0 4.0398e-01 1.0 4.80e+06 0.0 1.7e+06 4.5e+02 1.4e+02 0 0 2 1 6 0 0 2 1 7 17026 PCGAMGCoarse_AGG 12 1.0 1.3612e-01 1.1 0.00e+00 0.0 1.7e+07 3.0e+02 2.5e+02 0 0 18 7 11 0 0 18 7 11 0 PCGAMGProl_AGG 12 1.0 4.2615e-01 1.0 0.00e+00 0.0 1.5e+06 5.7e+02 1.9e+02 0 0 2 1 9 0 0 2 1 9 0 PCGAMGPOpt_AGG 12 1.0 1.0610e+00 1.0 7.12e+07 0.0 9.0e+06 6.1e+02 5.0e+02 0 7 10 7 22 0 7 10 7 23 100280 GAMG: createProl 12 1.0 2.0248e+00 1.0 7.60e+07 0.0 2.9e+07 4.2e+02 1.1e+03 1 8 31 16 49 1 8 31 16 49 55945 Graph 24 1.0 3.9569e-01 1.0 4.80e+06 0.0 1.7e+06 4.5e+02 1.4e+02 0 0 2 1 6 0 0 2 1 7 17383 MIS/Agg 12 1.0 1.3554e-01 1.2 0.00e+00 0.0 1.7e+07 3.0e+02 2.5e+02 0 0 18 7 11 0 0 18 7 11 0 SA: col data 12 1.0 2.5961e-02 1.1 0.00e+00 0.0 1.1e+06 6.7e+02 4.8e+01 0 0 1 1 2 0 0 1 1 2 0 SA: frmProl0 12 1.0 3.9272e-01 1.0 0.00e+00 0.0 3.9e+05 2.7e+02 9.6e+01 0 0 0 0 4 0 0 0 0 4 0 SA: smooth 12 1.0 8.4174e-01 1.4 5.74e+06 0.0 3.3e+06 4.9e+02 1.7e+02 0 1 3 2 8 0 1 3 2 8 9996 GAMG: partLevel 12 1.0 1.5243e+00 1.0 6.99e+07 0.0 6.8e+06 3.2e+03 4.9e+02 1 7 7 28 22 1 7 7 28 22 65544 repartition 6 1.0 4.4372e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.6e+01 0 0 0 0 2 0 0 0 0 2 0 Invert-Sort 6 1.0 2.8916e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 2.4e+01 0 0 0 0 1 0 0 0 0 1 0 Move A 6 1.0 6.3890e-02 1.1 0.00e+00 0.0 1.2e+05 2.7e+02 1.0e+02 0 0 0 0 5 0 0 0 0 5 0 Move P 6 1.0 4.9374e-02 1.2 0.00e+00 0.0 1.5e+05 1.6e+01 1.0e+02 0 0 0 0 5 0 0 0 0 5 0 PCSetUp 6 1.0 3.6222e+00 1.0 1.46e+08 0.0 3.6e+07 9.5e+02 1.6e+03 1 15 38 44 74 1 15 38 44 75 58854 
PCSetUpOnBlocks 36 1.0 2.1539e-02117.9 1.08e+02 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 PCApply 36 1.0 1.9656e+00 5.3 7.80e+08 0.0 5.5e+07 5.9e+02 3.8e+02 1 79 59 43 17 1 79 59 43 17 575586 SFSetGraph 12 1.0 5.8748e-06 5.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 SFSetUp 12 1.0 5.2335e-02 2.0 0.00e+00 0.0 8.6e+05 4.5e+02 0.0e+00 0 0 1 1 0 0 0 1 1 0 0 SFBcastBegin 273 1.0 2.4409e-02186.6 0.00e+00 0.0 1.6e+07 2.9e+02 0.0e+00 0 0 17 6 0 0 0 17 6 0 0 SFBcastEnd 273 1.0 4.6278e-02920.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 ------------------------------------------------------------------------------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory Descendants' Mem. Reports information only for process 0. --- Event Stage 0: Main Stage Vector 573 573 1019832 0. Matrix 348 348 1351536 0. Matrix Coarsen 12 12 8208 0. Index Set 222 222 418080 0. Vec Scatter 81 81 113400 0. Krylov Solver 45 45 415944 0. Preconditioner 33 33 35448 0. Viewer 5 4 3584 0. PetscRandom 24 24 16656 0. Star Forest Graph 12 12 11328 0. ======================================================================================================================== Average time to get PetscTime(): 4.2282e-08 Average time for MPI_Barrier(): 1.84676e-05 Average time for zero size MPI_Send(): 1.59141e-06 #PETSc Option Table entries: --prefix run_a0b0c0d0e0f0g0h0i0_n7_l3 -aggrmeth alla_serial -beta 10.0 -betaest .true. -check .false. -datadt data_distribution_fully_assembled -dm 3 -dom -1.0 -in_space .true. -ksp_converged_reason -ksp_max_it 500 -ksp_monitor -ksp_norm_type unpreconditioned -ksp_rtol 1.0e-6 -ksp_type cg -ksp_view -l 1 -levelset popcorn -levelsettol 1.0e-6 -log_view -lsdom 0.0 -maxl 9 -mg_coarse_sub_pc_factor_mat_ordering_type nd -mg_coarse_sub_pc_type cholesky -mg_levels_esteig_ksp_type cg -no_signal_handler -nruns 3 -order 1 -pc_gamg_agg_nsmooths 1 -pc_gamg_process_eq_limit 50 -pc_gamg_square_graph 0 -pc_gamg_type agg -pc_type gamg -petscrc /gpfs/scratch/upc26/upc26229/par_cell_aggr_poisson/paper/weak_scal_ompi/2nd-w-scal/petscrc-0 -tt 1 -uagg .true. -wratio 10 -wsolution .false. 
#End of PETSc Option Table entries Compiled without FORTRAN kernels Compiled with full precision matrices (default) sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 8 Configure options: --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 -with-blaslapack-dir=/apps/INTEL/2017.4/mkl --with-debugging=0 --with-x=0 --with-shared-libraries=1 --with-mpi=1 --with-64-bit-indices ----------------------------------------- Libraries compiled on 2018-06-04 18:55:32 on login1 Machine characteristics: Linux-4.4.103-92.56-default-x86_64-with-SuSE-12-x86_64 Using PETSc directory: /gpfs/scratch/upc26/upc26229/petsc_cell_agg_openmpi/release/petsc-3.9.0 Using PETSc arch: arch-linux2-c-opt ----------------------------------------- Using C compiler: mpicc -fPIC -wd1572 -g -O3 Using Fortran compiler: mpif90 -fPIC -g -O3 ----------------------------------------- Using include paths: -I/gpfs/scratch/upc26/upc26229/petsc_cell_agg_openmpi/release/petsc-3.9.0/include -I/gpfs/scratch/upc26/upc26229/petsc_cell_agg_openmpi/release/petsc-3.9.0/arch-linux2-c-opt/include ----------------------------------------- Using C linker: mpicc Using Fortran linker: mpif90 Using libraries: -Wl,-rpath,/gpfs/scratch/upc26/upc26229/petsc_cell_agg_openmpi/release/petsc-3.9.0/arch-linux2-c-opt/lib -L/gpfs/scratch/upc26/upc26229/petsc_cell_agg_openmpi/release/petsc-3.9.0/arch-linux2-c-opt/lib -lpetsc -Wl,-rpath,/apps/INTEL/2017.4/mkl/lib/intel64 -L/apps/INTEL/2017.4/mkl/lib/intel64 -Wl,-rpath,/usr/mpi/intel/openmpi-1.10.4-hfi/lib64 -L/usr/mpi/intel/openmpi-1.10.4-hfi/lib64 -Wl,-rpath,/gpfs/apps/MN4/INTEL/2018.0.128/compilers_and_libraries_2018.0.128/linux/compiler/lib/intel64_lin -L/gpfs/apps/MN4/INTEL/2018.0.128/compilers_and_libraries_2018.0.128/linux/compiler/lib/intel64_lin -Wl,-rpath,/usr/lib64/gcc/x86_64-suse-linux/4.8 -L/usr/lib64/gcc/x86_64-suse-linux/4.8 -Wl,-rpath,/usr/x86_64-suse-linux/lib -L/usr/x86_64-suse-linux/lib -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lstdc++ -ldl -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lifport -lifcoremt_pic -limf -lsvml -lm -lipgo -lirc -lpthread -lgcc_s -lirc_s -lstdc++ -ldl ----------------------------------------- Ending run at Wed Nov 7 01:15:13 CET 2018 Ending script at Wed Nov 7 01:15:13 CET 2018 -------------- next part -------------- Linear solve converged due to CONVERGED_RTOL iterations 12 KSP Object: 16777 MPI processes type: cg maximum iterations=500, initial guess is zero tolerances: relative=1e-06, absolute=1e-50, divergence=10000. left preconditioning using UNPRECONDITIONED norm type for convergence test PC Object: 16777 MPI processes type: gamg type is MULTIPLICATIVE, levels=6 cycles=v Cycles per PCApply=1 Using externally compute Galerkin coarse grid matrices GAMG specific options Threshold for dropping small values in graph on each level = 0. 0. 0. 0. Threshold scaling factor for each level not specified = 1. AGG specific options Symmetric graph false Number of levels to square graph 0 Number smoothing steps 1 Coarse grid solver -- level ------------------------------- KSP Object: (mg_coarse_) 16777 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_) 16777 MPI processes type: bjacobi number of blocks = 16777 Local solve is same for all blocks, in the following KSP and PC objects: KSP Object: (mg_coarse_sub_) 1 MPI processes type: preonly maximum iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_sub_) 1 MPI processes type: cholesky out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: nd factor fill ratio given 5., needed 1. Factored matrix follows: Mat Object: 1 MPI processes type: seqsbaij rows=4, cols=4 package used to perform factorization: petsc total: nonzeros=10, allocated nonzeros=10 total number of mallocs used during MatSetValues calls =0 block size is 1 linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=4, cols=4 total: nonzeros=16, allocated nonzeros=16 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 1 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: 16777 MPI processes type: mpiaij rows=4, cols=4 total: nonzeros=16, allocated nonzeros=16 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 1 nodes, limit used is 5 Down solver (pre-smoother) on level 1 ------------------------------- KSP Object: (mg_levels_1_) 16777 MPI processes type: chebyshev eigenvalue estimates used: min = 0.0999843, max = 1.09983 eigenvalues estimate via cg min 0.575611, max 0.999843 eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_1_esteig_) 16777 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_1_) 16777 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 16777 MPI processes type: mpiaij rows=269, cols=269 total: nonzeros=46217, allocated nonzeros=46217 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 2 ------------------------------- KSP Object: (mg_levels_2_) 16777 MPI processes type: chebyshev eigenvalue estimates used: min = 0.18053, max = 1.98584 eigenvalues estimate via cg min 0.0637775, max 1.8053 eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_2_esteig_) 16777 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_2_) 16777 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. 
linear system matrix = precond matrix: Mat Object: 16777 MPI processes type: mpiaij rows=18451, cols=18451 total: nonzeros=5470355, allocated nonzeros=5470355 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 3 ------------------------------- KSP Object: (mg_levels_3_) 16777 MPI processes type: chebyshev eigenvalue estimates used: min = 0.156694, max = 1.72364 eigenvalues estimate via cg min 0.0434381, max 1.56694 eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_3_esteig_) 16777 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_3_) 16777 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 16777 MPI processes type: mpiaij rows=908791, cols=908791 total: nonzeros=191134331, allocated nonzeros=191134331 total number of mallocs used during MatSetValues calls =0 using scalable MatPtAP() implementation not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 4 ------------------------------- KSP Object: (mg_levels_4_) 16777 MPI processes type: chebyshev eigenvalue estimates used: min = 0.13616, max = 1.49776 eigenvalues estimate via cg min 0.0335059, max 1.3616 eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_4_esteig_) 16777 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_4_) 16777 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 16777 MPI processes type: mpiaij rows=20910556, cols=20910556 total: nonzeros=1618051660, allocated nonzeros=1618051660 total number of mallocs used during MatSetValues calls =0 using scalable MatPtAP() implementation not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 5 ------------------------------- KSP Object: (mg_levels_5_) 16777 MPI processes type: chebyshev eigenvalue estimates used: min = 0.336389, max = 3.70028 eigenvalues estimate via cg min 0.0534122, max 3.36389 eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_5_esteig_) 16777 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. 
left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_5_) 16777 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 16777 MPI processes type: mpiaij rows=260421387, cols=260421387 total: nonzeros=7197955643, allocated nonzeros=156252832200 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) linear system matrix = precond matrix: Mat Object: 16777 MPI processes type: mpiaij rows=260421387, cols=260421387 total: nonzeros=7197955643, allocated nonzeros=156252832200 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines ************************************************************************************************************************ *** WIDEN YOUR WINDOW TO 120 CHARACTERS. Use 'enscript -r -fCourier9' to print this document *** ************************************************************************************************************************ ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- /gpfs/scratch/upc26/upc26229/build_rel_fempar_cell_agg_ompi/FEMPAR/bin/par_test_h_adaptive_poisson_unfitted on a arch-linux2-c-opt named s02r2b25 with 16777 processors, by upc26229 Wed Nov 7 01:31:35 2018 Using Petsc Release Version 3.9.0, Apr, 07, 2018 Max Max/Min Avg Total Time (sec): 3.137e+02 1.00000 3.137e+02 Objects: 1.745e+03 1.00172 1.742e+03 Flop: 9.833e+08 0.00000 6.678e+08 1.120e+13 Flop/sec: 3.134e+06 0.00000 2.129e+06 3.571e+10 MPI Messages: 2.180e+05 19813.90909 5.157e+04 8.652e+08 MPI Message Lengths: 6.226e+07 1414906.81818 7.366e+02 6.374e+11 MPI Reductions: 3.011e+03 1.00000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flop and VecAXPY() for complex vectors of length N --> 8N flop Summary of Stages: ----- Time ------ ----- Flop ----- --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total counts %Total Avg %Total counts %Total 0: Main Stage: 3.1371e+02 100.0% 1.1204e+13 100.0% 8.652e+08 100.0% 7.366e+02 100.0% 2.998e+03 99.6% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flop: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent Avg. len: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). 
%T - percent time in this phase %F - percent flop in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flop over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ Event Count Time (sec) Flop --- Global --- --- Stage --- Total Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage BuildTwoSided 15 1.0 1.2635e-01 2.4 0.00e+00 0.0 2.3e+06 8.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 BuildTwoSidedF 123 1.0 1.4200e+01 7.9 0.00e+00 0.0 9.0e+06 1.5e+04 0.0e+00 4 0 1 21 0 4 0 1 21 0 0 VecMDot 150 1.0 4.6170e-01 1.7 7.48e+06 0.0 0.0e+00 0.0e+00 1.5e+02 0 1 0 0 5 0 1 0 0 5 201728 VecTDot 384 1.0 3.1098e+00 4.0 5.80e+06 0.0 0.0e+00 0.0e+00 3.8e+02 0 1 0 0 13 0 1 0 0 13 23494 VecNorm 369 1.0 1.1910e+00 1.5 4.59e+06 0.0 0.0e+00 0.0e+00 3.7e+02 0 1 0 0 12 0 1 0 0 12 48340 VecScale 165 1.0 1.2617e-02212.0 7.49e+05 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 738275 VecCopy 231 1.0 1.0534e-02327.9 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSet 864 1.0 1.9701e-02189.7 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 387 1.0 4.0720e-02309.4 5.80e+06 0.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 1794326 VecAYPX 1608 1.0 2.3452e-02168.1 1.07e+07 0.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 5715622 VecAXPBYCZ 720 1.0 1.8540e-02276.4 1.63e+07 0.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 2 0 0 0 10961725 VecMAXPY 165 1.0 1.5714e-02398.8 8.85e+06 0.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 7005508 VecAssemblyBegin 42 1.0 4.3601e-01 1.5 0.00e+00 0.0 1.0e+06 1.8e+03 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAssemblyEnd 42 1.0 5.7755e-03134.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecPointwiseMult 165 1.0 2.8011e-03138.9 7.49e+05 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 3325353 VecScatterBegin 1842 1.0 1.0872e-01510.7 0.00e+00 0.0 5.1e+08 6.4e+02 0.0e+00 0 0 59 51 0 0 0 59 51 0 0 VecScatterEnd 1842 1.0 2.2453e+0013155.9 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSetRandom 15 1.0 3.1115e-031343.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecNormalize 165 1.0 5.2679e-01 1.3 2.25e+06 0.0 0.0e+00 0.0e+00 1.6e+02 0 0 0 0 5 0 0 0 0 6 53045 MatMult 1416 1.0 1.9601e+002744.6 4.82e+08 0.0 4.3e+08 7.2e+02 0.0e+00 0 48 50 48 0 0 48 50 48 0 2757975 MatMultAdd 180 1.0 7.8791e-019183.2 1.33e+07 0.0 3.3e+07 1.6e+02 0.0e+00 0 1 4 1 0 0 1 4 1 0 186808 MatMultTranspose 180 1.0 1.0899e+006594.6 1.33e+07 0.0 3.3e+07 1.6e+02 0.0e+00 0 1 4 1 0 0 1 4 1 0 135049 MatSolve 36 0.0 4.2692e-05 0.0 1.01e+03 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 24 MatSOR 1245 1.0 1.0514e+007119.8 3.46e+08 0.0 0.0e+00 0.0e+00 0.0e+00 0 34 0 0 0 0 34 0 0 0 3644239 MatCholFctrSym 3 1.0 1.4223e-02755.6 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCholFctrNum 3 1.0 8.4996e-033106.3 1.20e+01 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatConvert 15 1.0 4.5698e-0228.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatScale 45 1.0 1.0334e-011406.4 5.72e+06 0.0 4.5e+06 6.9e+02 0.0e+00 0 1 1 0 0 0 1 1 0 0 641987 MatResidual 180 1.0 3.4803e-012420.0 5.74e+07 0.0 5.4e+07 6.9e+02 0.0e+00 0 6 6 6 0 0 6 6 6 0 1864527 MatAssemblyBegin 306 
1.0 1.3789e+01 5.9 0.00e+00 0.0 8.0e+06 1.7e+04 0.0e+00 4 0 1 21 0 4 0 1 21 0 0 MatAssemblyEnd 306 1.0 1.3870e+01 1.0 0.00e+00 0.0 4.6e+07 1.5e+02 6.5e+02 4 0 5 1 22 4 0 5 1 22 0 MatGetRow 204147 0.0 3.1561e-02 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetRowIJ 3 0.0 1.3432e-02 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCreateSubMat 18 1.0 7.7521e+00 1.0 0.00e+00 0.0 1.4e+06 2.2e+02 2.8e+02 2 0 0 0 9 2 0 0 0 9 0 MatGetOrdering 3 0.0 1.4079e-02 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCoarsen 15 1.0 1.0908e+00 1.1 0.00e+00 0.0 2.6e+08 2.4e+02 5.3e+02 0 0 30 10 18 0 0 30 10 18 0 MatZeroEntries 15 1.0 2.3229e-03397.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatView 27 1.3 3.1142e-01 5.9 0.00e+00 0.0 0.0e+00 0.0e+00 2.1e+01 0 0 0 0 1 0 0 0 0 1 0 MatAXPY 15 1.0 2.4644e-01 2.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatMatMult 15 1.0 6.6292e+00 1.0 4.79e+06 0.0 2.6e+07 4.9e+02 1.9e+02 2 0 3 2 6 2 0 3 2 6 8157 MatMatMultSym 15 1.0 6.2354e+00 1.0 0.00e+00 0.0 2.1e+07 4.5e+02 1.8e+02 2 0 2 2 6 2 0 2 2 6 0 MatMatMultNum 15 1.0 1.1974e-01 1.3 4.79e+06 0.0 4.5e+06 6.9e+02 0.0e+00 0 0 1 0 0 0 0 1 0 0 451599 MatPtAP 15 1.0 7.7822e+00 1.0 7.71e+07 0.0 5.5e+07 3.2e+03 2.3e+02 2 7 6 27 8 2 7 6 27 8 101415 MatPtAPSymbolic 15 1.0 4.5700e+00 1.0 0.00e+00 0.0 2.5e+07 3.7e+03 1.0e+02 1 0 3 15 3 1 0 3 15 4 0 MatPtAPNumeric 15 1.0 3.3127e+00 1.0 7.71e+07 0.0 2.9e+07 2.8e+03 1.2e+02 1 7 3 13 4 1 7 3 13 4 238246 MatGetLocalMat 45 1.0 1.9456e-0244.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetBrAoCol 45 1.0 1.0016e-01150.0 0.00e+00 0.0 3.2e+07 2.5e+03 0.0e+00 0 0 4 12 0 0 0 4 12 0 0 KSPGMRESOrthog 150 1.0 4.6669e-01 1.7 1.50e+07 0.0 0.0e+00 0.0e+00 1.5e+02 0 2 0 0 5 0 2 0 0 5 399162 KSPSetUp 54 1.0 1.4842e+00 1.4 0.00e+00 0.0 0.0e+00 0.0e+00 3.6e+01 0 0 0 0 1 0 0 0 0 1 0 KSPSolve 3 1.0 3.9999e+00 1.1 8.39e+08 0.0 4.5e+08 6.4e+02 5.9e+02 1 85 52 45 20 1 85 52 45 20 2380078 PCGAMGGraph_AGG 15 1.0 6.2019e+00 1.0 4.79e+06 0.0 1.4e+07 4.6e+02 1.8e+02 2 0 2 1 6 2 0 2 1 6 8719 PCGAMGCoarse_AGG 15 1.0 1.0930e+00 1.1 0.00e+00 0.0 2.6e+08 2.4e+02 5.3e+02 0 0 30 10 18 0 0 30 10 18 0 PCGAMGProl_AGG 15 1.0 7.1075e+00 1.0 0.00e+00 0.0 1.2e+07 5.8e+02 2.4e+02 2 0 1 1 8 2 0 1 1 8 0 PCGAMGPOpt_AGG 15 1.0 1.1478e+01 1.0 7.11e+07 0.0 7.1e+07 6.2e+02 6.2e+02 4 8 8 7 21 4 8 8 7 21 73252 GAMG: createProl 15 1.0 2.5825e+01 1.0 7.59e+07 0.0 3.5e+08 3.4e+02 1.6e+03 8 8 41 19 52 8 8 41 19 53 34652 Graph 30 1.0 6.1862e+00 1.0 4.79e+06 0.0 1.4e+07 4.6e+02 1.8e+02 2 0 2 1 6 2 0 2 1 6 8741 MIS/Agg 15 1.0 1.0910e+00 1.1 0.00e+00 0.0 2.6e+08 2.4e+02 5.3e+02 0 0 30 10 18 0 0 30 10 18 0 SA: col data 15 1.0 2.3531e+00 1.0 0.00e+00 0.0 9.0e+06 6.9e+02 6.0e+01 1 0 1 1 2 1 0 1 1 2 0 SA: frmProl0 15 1.0 2.8294e+00 1.0 0.00e+00 0.0 3.2e+06 2.7e+02 1.2e+02 1 0 0 0 4 1 0 0 0 4 0 SA: smooth 15 1.0 7.7540e+00 1.0 5.72e+06 0.0 2.6e+07 4.9e+02 2.2e+02 2 1 3 2 7 2 1 3 2 7 8556 GAMG: partLevel 15 1.0 1.9884e+01 1.0 7.71e+07 0.0 5.6e+07 3.1e+03 6.8e+02 6 7 6 28 23 6 7 6 28 23 39691 repartition 9 1.0 1.2793e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 5.4e+01 0 0 0 0 2 0 0 0 0 2 0 Invert-Sort 9 1.0 1.6471e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.6e+01 1 0 0 0 1 1 0 0 0 1 0 Move A 9 1.0 3.9110e+00 1.0 0.00e+00 0.0 5.1e+05 5.6e+02 1.5e+02 1 0 0 0 5 1 0 0 0 5 0 Move P 9 1.0 3.9752e+00 1.0 0.00e+00 0.0 8.8e+05 2.1e+01 1.5e+02 1 0 0 0 5 1 0 0 0 5 0 PCSetUp 6 1.0 4.7888e+01 1.0 1.51e+08 0.0 4.1e+08 7.2e+02 2.3e+03 15 15 47 46 78 15 15 47 46 78 35168 
PCSetUpOnBlocks 36 1.0 2.2114e-0248.6 1.20e+01 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 PCApply 36 1.0 3.2792e+00 2.8 7.84e+08 0.0 4.4e+08 6.1e+02 4.8e+02 1 79 50 41 16 1 79 50 41 16 2713702 SFSetGraph 15 1.0 1.1425e-05 9.7 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 SFSetUp 15 1.0 1.4317e-01 1.9 0.00e+00 0.0 6.8e+06 4.6e+02 0.0e+00 0 0 1 0 0 0 0 1 0 0 0 SFBcastBegin 564 1.0 3.8685e-02183.9 0.00e+00 0.0 2.5e+08 2.4e+02 0.0e+00 0 0 29 9 0 0 0 29 9 0 0 SFBcastEnd 564 1.0 3.0283e-013088.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 ------------------------------------------------------------------------------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory Descendants' Mem. Reports information only for process 0. --- Event Stage 0: Main Stage Vector 735 735 1523448 0. Matrix 444 444 11007960 0. Matrix Coarsen 15 15 10260 0. Index Set 303 303 2069856 0. Vec Scatter 105 105 147192 0. Krylov Solver 54 54 516960 0. Preconditioner 39 39 41352 0. Viewer 5 4 3584 0. PetscRandom 30 30 20820 0. Star Forest Graph 15 15 14160 0. ======================================================================================================================== Average time to get PetscTime(): 4.25614e-08 Average time for MPI_Barrier(): 0.00116431 Average time for zero size MPI_Send(): 1.79813e-06 #PETSc Option Table entries: --prefix run_a0b0c0d0e0f0g0h0i0_n8_l3 -aggrmeth alla_serial -beta 10.0 -betaest .true. -check .false. -datadt data_distribution_fully_assembled -dm 3 -dom -1.0 -in_space .true. -ksp_converged_reason -ksp_max_it 500 -ksp_monitor -ksp_norm_type unpreconditioned -ksp_rtol 1.0e-6 -ksp_type cg -ksp_view -l 1 -levelset popcorn -levelsettol 1.0e-6 -log_view -lsdom 0.0 -maxl 10 -mg_coarse_sub_pc_factor_mat_ordering_type nd -mg_coarse_sub_pc_type cholesky -mg_levels_esteig_ksp_type cg -no_signal_handler -nruns 3 -order 1 -pc_gamg_agg_nsmooths 1 -pc_gamg_process_eq_limit 50 -pc_gamg_square_graph 0 -pc_gamg_type agg -pc_type gamg -petscrc /gpfs/scratch/upc26/upc26229/par_cell_aggr_poisson/paper/weak_scal_ompi/2nd-w-scal/petscrc-0 -tt 1 -uagg .true. -wratio 10 -wsolution .false. 
#End of PETSc Option Table entries Compiled without FORTRAN kernels Compiled with full precision matrices (default) sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 8 Configure options: --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 -with-blaslapack-dir=/apps/INTEL/2017.4/mkl --with-debugging=0 --with-x=0 --with-shared-libraries=1 --with-mpi=1 --with-64-bit-indices ----------------------------------------- Libraries compiled on 2018-06-04 18:55:32 on login1 Machine characteristics: Linux-4.4.103-92.56-default-x86_64-with-SuSE-12-x86_64 Using PETSc directory: /gpfs/scratch/upc26/upc26229/petsc_cell_agg_openmpi/release/petsc-3.9.0 Using PETSc arch: arch-linux2-c-opt ----------------------------------------- Using C compiler: mpicc -fPIC -wd1572 -g -O3 Using Fortran compiler: mpif90 -fPIC -g -O3 ----------------------------------------- Using include paths: -I/gpfs/scratch/upc26/upc26229/petsc_cell_agg_openmpi/release/petsc-3.9.0/include -I/gpfs/scratch/upc26/upc26229/petsc_cell_agg_openmpi/release/petsc-3.9.0/arch-linux2-c-opt/include ----------------------------------------- Using C linker: mpicc Using Fortran linker: mpif90 Using libraries: -Wl,-rpath,/gpfs/scratch/upc26/upc26229/petsc_cell_agg_openmpi/release/petsc-3.9.0/arch-linux2-c-opt/lib -L/gpfs/scratch/upc26/upc26229/petsc_cell_agg_openmpi/release/petsc-3.9.0/arch-linux2-c-opt/lib -lpetsc -Wl,-rpath,/apps/INTEL/2017.4/mkl/lib/intel64 -L/apps/INTEL/2017.4/mkl/lib/intel64 -Wl,-rpath,/usr/mpi/intel/openmpi-1.10.4-hfi/lib64 -L/usr/mpi/intel/openmpi-1.10.4-hfi/lib64 -Wl,-rpath,/gpfs/apps/MN4/INTEL/2018.0.128/compilers_and_libraries_2018.0.128/linux/compiler/lib/intel64_lin -L/gpfs/apps/MN4/INTEL/2018.0.128/compilers_and_libraries_2018.0.128/linux/compiler/lib/intel64_lin -Wl,-rpath,/usr/lib64/gcc/x86_64-suse-linux/4.8 -L/usr/lib64/gcc/x86_64-suse-linux/4.8 -Wl,-rpath,/usr/x86_64-suse-linux/lib -L/usr/x86_64-suse-linux/lib -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lstdc++ -ldl -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lifport -lifcoremt_pic -limf -lsvml -lm -lipgo -lirc -lpthread -lgcc_s -lirc_s -lstdc++ -ldl ----------------------------------------- Ending run at Wed Nov 7 01:31:37 CET 2018 Ending script at Wed Nov 7 01:31:37 CET 2018 From mfadams at lbl.gov Wed Nov 7 10:01:54 2018 From: mfadams at lbl.gov (Mark Adams) Date: Wed, 7 Nov 2018 11:01:54 -0500 Subject: [petsc-users] Problems about PCtype bjacobi In-Reply-To: References: Message-ID: On Wed, Nov 7, 2018 at 10:16 AM Yingjie Wu via petsc-users < petsc-users at mcs.anl.gov> wrote: > Dear Petsc developer: > Hi, > Recently, I'm solving the problems of nonlinear systems of PDEs, I > encountered some problems about precondition and wanted to seek help. > > 1.I set the precondition matrix in SNES as MPIAIJ in the program, and then > use Matrix Free method to solve my problem. 
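(The setup described here, a matrix-free Jacobian action paired with an assembled MPIAIJ preconditioning matrix, is typically wired up as in the sketch below. This is only an illustration; snes, N, FormJacobian, and the user context are assumed names, not taken from the poster's code.)

    /* Sketch: matrix-free operator J, assembled preconditioning matrix B. */
    Mat J, B;
    ierr = MatCreateSNESMF(snes,&J);CHKERRQ(ierr);        /* J applies the Jacobian action by differencing F */
    ierr = MatCreate(PETSC_COMM_WORLD,&B);CHKERRQ(ierr);
    ierr = MatSetSizes(B,PETSC_DECIDE,PETSC_DECIDE,N,N);CHKERRQ(ierr);
    ierr = MatSetType(B,MATMPIAIJ);CHKERRQ(ierr);
    ierr = MatMPIAIJSetPreallocation(B,5,NULL,2,NULL);CHKERRQ(ierr); /* 5-point stencil guess */
    /* FormJacobian assembles only B; the matrix-free J needs no assembly */
    ierr = SNESSetJacobian(snes,J,B,FormJacobian,&user);CHKERRQ(ierr);

With this pairing, the KSP applies the operator matrix-free but builds the preconditioner from the assembled matrix B.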
The log information of the > program is as follows: > > SNES Object: 1 MPI processes > type: newtonls > maximum iterations=50, maximum function evaluations=100000000 > tolerances: relative=1e-08, absolute=1e-50, solution=1e-08 > total number of linear solver iterations=177 > total number of function evaluations=371 > norm schedule ALWAYS > SNESLineSearch Object: 1 MPI processes > type: bt > interpolation: cubic > alpha=1.000000e-04 > maxstep=1.000000e+08, minlambda=1.000000e-12 > tolerances: relative=1.000000e-08, absolute=1.000000e-15, > lambda=1.000000e-08 > maximum iterations=40 > KSP Object: 1 MPI processes > type: gmres > restart=30, using Classical (unmodified) Gram-Schmidt > Orthogonalization with no iterative refinement > happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=0.01, absolute=1e-50, divergence=10000. > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: 1 MPI processes > type: bjacobi > number of blocks = 1 > Local solve is same for all blocks, in the following KSP and PC > objects: > KSP Object: (sub_) 1 MPI processes > type: preonly > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > left preconditioning > using NONE norm type for convergence test > PC Object: (sub_) 1 MPI processes > type: bjacobi > number of blocks = 1 > Local solve is same for all blocks, in the following KSP and PC > objects: > KSP Object: (sub_sub_) 1 MPI processes > type: preonly > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > left preconditioning > using NONE norm type for convergence test > PC Object: (sub_sub_) 1 MPI processes > type: ilu > out-of-place factorization > 0 levels of fill > tolerance for zero pivot 2.22045e-14 > matrix ordering: natural > factor fill ratio given 1., needed 1. > Factored matrix follows: > Mat Object: 1 MPI processes > type: seqaij > rows=961, cols=961 > package used to perform factorization: petsc > total: nonzeros=4129, allocated nonzeros=4129 > total number of mallocs used during MatSetValues calls > =0 > not using I-node routines > linear system matrix = precond matrix: > Mat Object: 1 MPI processes > type: seqaij > rows=961, cols=961 > total: nonzeros=4129, allocated nonzeros=4805 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > linear system matrix = precond matrix: > Mat Object: 1 MPI processes > type: mpiaij > rows=961, cols=961 > total: nonzeros=4129, allocated nonzeros=9610 > total number of mallocs used during MatSetValues calls =0 > not using I-node (on process 0) routines > linear system matrix followed by preconditioner matrix: > Mat Object: 1 MPI processes > type: mffd > rows=961, cols=961 > Matrix-free approximation: > err=1.49012e-08 (relative error in function evaluation) > Using wp compute h routine > Does not compute normU > Mat Object: 1 MPI processes > type: mpiaij > rows=961, cols=961 > total: nonzeros=4129, allocated nonzeros=9610 > total number of mallocs used during MatSetValues calls =0 > not using I-node (on process 0) routines > > Although parallel matrix is used, it runs on a single processor. Because > of the use of parallel matrices, the overall precondition scheme should be > bjacobi, and then build a KSP for each block (there is only one block in my > program). Therefore, there will be a sub KSP object in the PC information. 
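(For reference, a single level of block Jacobi with an ILU sub-solver, the usual configuration, corresponds to options like the sketch below; these are illustrative and not taken from the reported run. With these options there is one sub KSP/PC pair per block and no deeper nesting.)

    -pc_type bjacobi
    -sub_pc_type ilu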
> But in the above information, a subsubksp is also embedded in the sub KSP
> object. I don't understand the reason for this KSP. Please help me answer.

It looks like you are setting the sub PC type to bjacobi, and it has a sub (sub) PC.

> 2. Bjacobi is a preconditioning method in theory. Why is there a KSP
> object that solves a subsystem of linear equations? In my understanding,
> a preconditioner is a matrix decomposition method, and there are no
> linear systems to solve. Therefore, there should be no subKSP here.
>
> 3. When I assemble the preconditioning matrix, I use MatSetValues.
>
> idleft = row - 1 ;
> idright = row + 1 ;
> idup = row + 20;
> iddown = row - 20;
>
> /*phi1 field*/
> v[1] = - 1.0 / ( dx * dx / (2* 1.267) + dx * dx / (2* 1.267 ) ) ; col[1] = idleft;
> v[2] = - 1.0 / ( dx * dx / (2* 1.267) + dx * dx / (2* 1.267 ) ) ; col[2] = idright;
> v[3] = - 1.0 / ( dy * dy / (2* 1.267) + dy * dy / (2* 1.267 ) ) ; col[3] = iddown;
> v[4] = - 1.0 / ( dy * dy / (2* 1.267) + dy * dy / (2* 1.267 ) ) ; col[4] = idup;
> v[0] = v[1] + v[2] + v[3] + v[4] + v[0] ; col[0] = row;
>
> ierr = MatSetValues(B,1,&row,5,col,v,INSERT_VALUES);CHKERRQ(ierr);

There must be junk in "col" somehow. I don't see anything wrong with this code. Print out the values of "col" before the MatSetValues call.

> The error message is:
>
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Argument out of range
> [0]PETSC ERROR: Column too large: col 1077215232 max 960
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.10.1, Sep, 26, 2018
> [0]PETSC ERROR: ./ex217 on a arch-linux2-c-debug named yjwu-XPS-8910 by yjwu Wed Nov 7 09:52:22 2018
> [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-mpich --download-fblaslapack
> [0]PETSC ERROR: #1 MatSetValues_SeqAIJ() line 442 in /home/yjwu/petsc-3.10.1/src/mat/impls/aij/seq/aij.c
> [0]PETSC ERROR: #2 MatSetValues() line 1349 in /home/yjwu/petsc-3.10.1/src/mat/interface/matrix.c
> [0]PETSC ERROR: #3 FormJacobian() line 272 in /home/yjwu/petsc-3.10.1/src/snes/examples/tutorials/ex217.c
> [0]PETSC ERROR: #4 SNESComputeJacobian() line 2555 in /home/yjwu/petsc-3.10.1/src/snes/interface/snes.c
> [0]PETSC ERROR: #5 SNESSolve_NEWTONLS() line 222 in /home/yjwu/petsc-3.10.1/src/snes/impls/ls/ls.c
> [0]PETSC ERROR: #6 SNESSolve() line 4396 in /home/yjwu/petsc-3.10.1/src/snes/interface/snes.c
> [0]PETSC ERROR: #7 main() line 108 in /home/yjwu/petsc-3.10.1/src/snes/examples/tutorials/ex217.c
>
> This seems to be a very obvious problem, but I don't know how to solve it.
>
> Thanks for your continuous help,
> Yingjie

From jed at jedbrown.org Wed Nov 7 10:07:57 2018
From: jed at jedbrown.org (Jed Brown)
Date: Wed, 07 Nov 2018 09:07:57 -0700
Subject: [petsc-users] Vec, Mat and binaryfiles.
In-Reply-To: References: <87r2fzpj0m.fsf@jedbrown.org>
Message-ID: <87muqkna5e.fsf@jedbrown.org>

Please always use "reply-all" so that your messages go to the list. This
is standard mailing list etiquette. It is important to preserve threading
for people who find this discussion later and so that we do not waste our
time re-answering the same questions that have already been answered in
private side-conversations. You'll likely get an answer faster that way
too.
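(Returning briefly to the "Column too large" failure in the block Jacobi thread above: a defensive version of that assembly loop inserts only the stencil neighbours that actually exist, so that no stale entry of col is ever passed to MatSetValues. The sketch below assumes a structured nx-by-ny grid; nx, ny, dx, and the coefficient are placeholders, and this illustrates the guarding pattern rather than reconstructing the poster's discretization. Note also that the original line v[0] = v[1]+v[2]+v[3]+v[4]+v[0] reads v[0] before the shown fragment assigns it, so v[0] must be initialized earlier in the routine.)

    /* Sketch: bounds-checked 5-point-stencil assembly; nx, ny, coeff are placeholders. */
    PetscInt    ncols = 0, k, col[5];
    PetscScalar v[5], diag = 0.0;
    PetscScalar coeff = -1.0/(dx*dx/1.267);                              /* placeholder off-diagonal value */
    if (row % nx > 0)    { col[ncols] = row - 1;  v[ncols++] = coeff; }  /* left  neighbour exists */
    if (row % nx < nx-1) { col[ncols] = row + 1;  v[ncols++] = coeff; }  /* right neighbour exists */
    if (row >= nx)       { col[ncols] = row - nx; v[ncols++] = coeff; }  /* lower neighbour exists */
    if (row < nx*(ny-1)) { col[ncols] = row + nx; v[ncols++] = coeff; }  /* upper neighbour exists */
    for (k = 0; k < ncols; k++) diag -= v[k];                            /* diagonal from the row sum */
    col[ncols] = row; v[ncols++] = diag;
    ierr = MatSetValues(B,1,&row,ncols,col,v,INSERT_VALUES);CHKERRQ(ierr);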
Sal Am writes: > Thank you Jed for the quick response! > >> >> Yes, of course the formats would have to match. I would recommend >> writing the files in an existing format such as PETSc's binary format. >> > > Unfortunately I do not think I can change the source code to output those > two files in PETSc format and it would probably take a very long time > converting everything into PETSc (it is not even my code). File-based workflows are very often bottlenecks. You can also use any convenient software (e.g., Python or Matlab/Octave) to convert your custom binary formats to PETSc binary format (see PetscBinaryIO provided with PETSc), at which point you'll be able to read in parallel. If you don't care about scalability or only need to read once, then you can write code of the type you propose. > Do you have any other suggestions on how to read in those two complex > binary files? Also why would it be more difficult to parallelise as > I thought getting the two files in PETSc vector format would allow me to > use the rest of PETSc library the usual way? > > Kind regards, > S > > > On Mon, Nov 5, 2018 at 4:49 PM Jed Brown wrote: > >> Sal Am via petsc-users writes: >> >> > Hi, >> > >> > I am trying to solve a Ax=b complex system. the vector b and "matrix" A >> are >> > both binary and NOT created by PETSc. So I keep getting error messages >> that >> > they are not correct format when I read the files with >> PetscViewBinaryOpen, >> > after some digging it seems that one cannot just read a binary file that >> > was created by another software. >> >> Yes, of course the formats would have to match. I would recommend >> writing the files in an existing format such as PETSc's binary format. >> While the method you describe can be made to work, it will be more work >> to make it parallel. >> >> > How would I go on to solve this problem? >> > >> > More info and trials: >> > >> > "matrix" A consists of two files, one that contains row column index >> > numbers and one that contains the non-zero values. So what I would have >> to >> > do is multiply the last term in a+b with PETSC_i to get a real + >> imaginary >> > vector A. >> > >> > vector b is in binary, so what I have done so far (not sure if it works) >> is: >> > >> > std::ifstream input("Vector_b.bin", std::ios::binary ); >> > while (input.read(reinterpret_cast(&v), sizeof(float))) >> > ierr = VecSetValues(u,1,&iglobal,&v,INSERT_VALUES);CHKERRQ(ierr); >> > >> > where v is a PetscScalar. >> > >> > Once I am able to read both matrices I think I can figure out the solvers >> > to solve the system. >> > >> > All the best, >> > S >> From mfadams at lbl.gov Wed Nov 7 10:59:23 2018 From: mfadams at lbl.gov (Mark Adams) Date: Wed, 7 Nov 2018 11:59:23 -0500 Subject: [petsc-users] [petsc-maint] Correct use of PCFactorSetMatOrderingType In-Reply-To: References: Message-ID: please respond to petsc-users. You are doing 5 solves here in 14 seconds. You seem to be saying that the two pressure solves are taking all of this time. I don't know why the two solves are different. You seem to be saying that OpenFOAM solves the problem in 10 seconds and PETSc solves it in 14 seconds. Is that correct? Hypre seems to be running fine. On Wed, Nov 7, 2018 at 11:24 AM Edoardo alinovi wrote: > Thanks a lot Mark for your kind replay. The solver is mine and I use > PETSc for the solution of momentum and pressure. 
The first is solved very > fast by a standard bcgs + bjacobi, but the pressure is the source of all > evils and, unfortunately, I am pretty sure that almost all the time within > the time-step is needed by KSP to solve the pressure (see log attached). I > have verified this also putting a couple of mpi_wtime around the kspsolve > call. The pressure is solved 2 times (1 prediction + 1 correction), the > prediction takes around 11s , the correction around 4s (here I am avoiding > to recompute the preconditioner), all the rest of the code (flux assembling > + mometum solution + others) around 1s. Openfoam does the same procedure > with the same tolerance in 10s using its gamg version (50 it to converge). > The number of iteration required to solve the pressure with hypre are 12. > Gamg performs similarly to hypre in terms of speed, but with 50 iterations > to converge. Am I missing something in the setup in your opinion? > > thanks a lot, > > Edo > > ------ > > Edoardo Alinovi, Ph.D. > > DICCA, Scuola Politecnica > Universita' di Genova > 1, via Montallegro > 16145 Genova, Italy > > email: edoardo.alinovi at dicca.unige.it > Tel: +39 010 353 2540 > > > > > Il giorno mer 7 nov 2018 alle ore 16:50 Mark Adams ha > scritto: > >> You can try -pc_type gamg, but hypre is a pretty good solver for the >> Laplacian. If hypre is just a little faster than LU on a 3D problem (that >> takes 10 seconds to solve) then AMG is not doing well. I would expect that >> AMG is taking a lot of iterations (eg, >> 10). You can check that with >> -ksp_monitor. >> >> The PISO algorithm is a multistage algorithm with a pressure correction >> in it. It also has a solve for the velocity, from what I can tell. Are you >> building PISO yourself and using PETSc just for the pressure correction? >> Are you sure the time is spent in this solver? You can use -log_view to see >> performance numbers and look for KSPSolve to see how much time is spent in >> the PETSc solver. >> >> Mark >> >> >> On Wed, Nov 7, 2018 at 10:26 AM Zhang, Hong via petsc-maint < >> petsc-maint at mcs.anl.gov> wrote: >> >>> Edoardo: >>> Forwarding your request to petsc-maint where you can get fast and expert >>> advise. I do not have suggestion for your application, but someone in our >>> team likely will make suggestion. >>> Hong >>> >>> Hello Hong, >>>> >>>> Well, using -sub_pc_type lu it super slow. I am desperately triying >>>> to enhance performaces of my code (CFD, finite volume, PISO alghoritm), in >>>> particular I have a strong bottleneck in the solution of pressure >>>> correction equation which takes almost the 90% of computational time. Using >>>> multigrid as preconditoner (hypre with default options) is slighlty >>>> better, but comparing the results against the multigrid used in openFOAM, >>>> my code is losing 10s/iteration which a huge amount of time. Now, since >>>> that all the time is employed by KSPSolve, I feel a bit powerless. Do you >>>> have any helpful advice? >>>> >>>> Thank you very much! >>>> ------ >>>> >>>> Edoardo Alinovi, Ph.D. >>>> >>>> DICCA, Scuola Politecnica >>>> Universita' di Genova >>>> 1, via Montallegro >>>> 16145 Genova, Italy >>>> >>>> email: edoardo.alinovi at dicca.unige.it >>>> Tel: +39 010 353 2540 >>>> >>>> >>>> >>>> >>>> Il giorno mar 6 nov 2018 alle ore 17:15 Zhang, Hong >>>> ha scritto: >>>> >>>>> Edoardo: >>>>> Interesting. I thought it would not affect performance much. What >>>>> happens if you use -sub_pc_type lu'? 
>>>>> Hong >>>>> >>>>> Dear Hong and Matt, >>>>>> >>>>>> thank you for your kind replay. I have just tested your suggestions >>>>>> and applied " -sub_pc_type ilu -sub_pc_factor_mat_ordering_type nd/rcm" >>>>>> and, in both cases, I have found a deterioration of performances >>>>>> with respect to doing nothing (thus just putting default PCBJACOBI). Is it >>>>>> normal? However, I guess this is very problem dependent. >>>>>> ------ >>>>>> >>>>>> Edoardo Alinovi, Ph.D. >>>>>> >>>>>> DICCA, Scuola Politecnica >>>>>> Universita' di Genova >>>>>> 1, via Montallegro >>>>>> 16145 Genova, Italy >>>>>> >>>>>> email: edoardo.alinovi at dicca.unige.it >>>>>> Tel: +39 010 353 2540 >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> Il giorno mar 6 nov 2018 alle ore 16:04 Zhang, Hong < >>>>>> hzhang at mcs.anl.gov> ha scritto: >>>>>> >>>>>>> Edoardo: >>>>>>> You can test runtime option '-sub_pc_factor_mat_ordering_type' and >>>>>>> use '-log_view' to get performance on different orderings, >>>>>>> e.g.,petsc/src/ksp/ksp/examples/tutorials/ex2.c: >>>>>>> mpiexec -n 2 ./ex2 -ksp_view -sub_pc_type ilu >>>>>>> -sub_pc_factor_mat_ordering_type nd >>>>>>> >>>>>>> I do not think the ordering inside block for ilu would affect >>>>>>> performance much. Let us know what you will get. >>>>>>> Hong >>>>>>> >>>>>>> Dear users, >>>>>>>> >>>>>>>> I have a question about the correct use of the option >>>>>>>> "PCFactorSetMatOrderingType" in PETSc. >>>>>>>> >>>>>>>> I am solving a problem with 2.5M of unknowns distributed along 16 >>>>>>>> processors and I am using the block jacobi preconditioner and MPIAIJ >>>>>>>> matrix format. I cannot figure out if the above option can be useful or not >>>>>>>> in decreasing the computational time. Any suggestion or tips? >>>>>>>> >>>>>>>> Thank you very much for the kind help >>>>>>>> >>>>>>>> ------ >>>>>>>> >>>>>>>> Edoardo Alinovi, Ph.D. >>>>>>>> >>>>>>>> DICCA, Scuola Politecnica >>>>>>>> Universita' di Genova >>>>>>>> 1, via Montallegro >>>>>>>> 16145 Genova, Italy >>>>>>>> >>>>>>>> email: edoardo.alinovi at dicca.unige.it >>>>>>>> Tel: +39 010 353 2540 >>>>>>>> >>>>>>>> >>>>>>>> -------------- next part -------------- An HTML attachment was scrubbed... URL: From mfadams at lbl.gov Wed Nov 7 12:46:30 2018 From: mfadams at lbl.gov (Mark Adams) Date: Wed, 7 Nov 2018 13:46:30 -0500 Subject: [petsc-users] PETSc (3.9.0) GAMG weak scaling test issue In-Reply-To: <5BE30C87.10204@cimne.upc.edu> References: <5BE30C87.10204@cimne.upc.edu> Message-ID: First I would add -gamg_est_ksp_type cg You seem to be converging well so I assume you are setting the null space for GAMG. Note, you should test hypre also. You probably want a bigger "-pc_gamg_process_eq_limit 50". 200 at least but you test your machine with a range on the largest problem. This is a parameter for reducing the number of active processors (on coarse grids). I would only worry about "load3". This has 16K equations per process, which is where you start noticing "strong scaling" problems, depending on the machine. An important parameter is "-pc_gamg_square_graph 0". I would probably start with infinity (eg, 10). Now, I'm not sure about your domain, problem sizes, and thus the weak scaling design. You seem to be scaling on the background mesh, but that may not be a good proxy for complexity. You can look at the number of flops and scale it appropriately by the number of solver iterations to get a relative size of the problem. I would recommend scaling the number of processors with this. 
For instance, here is the MatMult line for the 4 proc and the 16K proc run:

------------------------------------------------------------------------------------------------------------------------
Event                Count      Time (sec)     Flop                              --- Global ---  --- Stage ---   Total
                   Max Ratio  Max     Ratio   Max  Ratio  Mess   Avg len Reduct  %T %F %M %L %R  %T %F %M %L %R Mflop/s
------------------------------------------------------------------------------------------------------------------------
MatMult              636 1.0 1.9035e-01 1.0 3.12e+08 1.1 7.6e+03 3.0e+03 0.0e+00  0 47 62 44  0   0 47 62 44  0    6275   [2 procs]
MatMult             1416 1.0 1.9601e+002744.6 4.82e+08 0.0 4.3e+08 7.2e+02 0.0e+00  0 48 50 48  0   0 48 50 48  0 2757975  [16K procs]

Now, you have empty processors. See the massive load imbalance on time and the zero on Flops. The "Ratio" is max/min, and clearly min=0, so PETSc reports a ratio of 0 (it is really infinity).

Also, weak scaling on a thin body (I don't know your domain) is a little funny, because as the problem scales up the mesh becomes more 3D and this causes the cost per equation to go up. That is why I prefer to use the number of non-zeros as the processor scaling function, but the number of equations is easier ...

The PC setup times are large (I see 48 seconds at 16K, but you report 16). -pc_gamg_square_graph 10 should help that.

The max number of flops per processor in MatMult goes up by 50%, the max time goes up by 10x, and the number of iterations goes up by 13/8. If I put all of this together, I get that 75% of the time at 16K is in communication. I think that and the absolute time can be improved some by optimizing parameters as I've suggested.

Mark

On Wed, Nov 7, 2018 at 11:03 AM "Alberto F. Martín" via petsc-users <
petsc-users at mcs.anl.gov> wrote:

> Dear All,
>
> we are performing a weak scaling test of the PETSc (v3.9.0) GAMG
> preconditioner when applied to the linear system arising from the
> conforming unfitted FE discretization (using Q1 Lagrangian FEs) of a 3D
> Poisson problem, where the boundary of the domain (a popcorn flake) is
> described as a zero level set embedded within a uniform background
> (Cartesian-like) hexahedral mesh. Details underlying the FEM formulation
> can be made available on demand if you believe that this might be
> helpful, but let me just point out that it is designed such that it
> addresses the well-known ill-conditioning issues of unfitted FE
> discretizations due to the small cut cell problem.
>
> The weak scaling test is set up as follows. We start from a single cube
> background mesh and refine it uniformly several steps, until we have
> approximately either 10**3 (load1), 20**3 (load2), or 40**3 (load3)
> hexahedra per MPI task when distributing it over 4 MPI tasks. The
> benchmark is scaled such that the next larger problem to be tested is
> obtained by uniformly refining the mesh from the previous scale and
> running it on 8x the number of MPI tasks that we used at the previous
> scale. As a result, we obtain three weak scaling curves, one for each of
> the three fixed loads per MPI task above, on the following total numbers
> of MPI tasks: 4, 32, 262, 2097, 16777. The underlying mesh is not
> partitioned among MPI tasks using ParMETIS (unstructured multilevel
> graph partitioning) nor optimally by hand, but following the so-called
> z-shape space-filling curves provided by an underlying octree-like mesh
> handler (i.e., the p4est library).
>
> I configured the preconditioned linear solver as follows:
>
>   -ksp_type cg
>   -ksp_monitor
>   -ksp_rtol 1.0e-6
>   -ksp_converged_reason
>   -ksp_max_it 500
>   -ksp_norm_type unpreconditioned
>   -ksp_view
>   -log_view
>
>   -pc_type gamg
>   -pc_gamg_type agg
>   -mg_levels_esteig_ksp_type cg
>   -mg_coarse_sub_pc_type cholesky
>   -mg_coarse_sub_pc_factor_mat_ordering_type nd
>   -pc_gamg_process_eq_limit 50
>   -pc_gamg_square_graph 0
>   -pc_gamg_agg_nsmooths 1
>
> Raw timings (in seconds) of the preconditioner set up and PCG iterative
> solution stages, and the number of iterations, are as follows:
>
> **preconditioner set up**
> (load1): [0.02542160451, 0.05169247743, 0.09266782179, 0.2426272957, 13.64161944]
> (load2): [0.1239175797 , 0.1885528499 , 0.2719282564 , 0.4783878336, 13.37947339]
> (load3): [0.6565349903 , 0.9435049873 , 1.299908397  , 1.916243652 , 16.02904088]
>
> **PCG stage**
> (load1): [0.003287350759, 0.008163803257, 0.03565631993, 0.08343045413, 0.6937994603]
> (load2): [0.0205939794  , 0.03594723623 , 0.07593298424, 0.1212046621 , 0.6780373845]
> (load3): [0.1310882876  , 0.3214917686  , 0.5532023879 , 0.766881627  , 1.485446003]
>
> **number of PCG iterations**
> (load1): [5, 8, 11, 13, 13]
> (load2): [7, 10, 12, 13, 13]
> (load3): [8, 10, 12, 13, 13]
>
> It can be observed that both the number of linear solver iterations and
> the PCG stage timings scale remarkably well (weakly), but there is a
> significant time increase in the preconditioner setup stage when scaling
> the problem from 2097 to 16777 MPI tasks (e.g., 1.916243652 vs
> 16.02904088 sec. with 40**3 cells per MPI task). I gathered the combined
> output of -ksp_view and -log_view (only) for all the points involving
> the load3 weak scaling test (find them attached to this message). Please
> note that within each run, I execute these two stages up to three times,
> and this influences the absolute timings given in -log_view.
>
> Looking at the output of -log_view, it is very strange to me, e.g., that
> the stage labelled as "Graph" does not scale properly, as it is just a
> call to MatDuplicate if the block size of the matrix is 1 (our case),
> and I guess that it is just a local operation that does not require any
> communication. What am I missing here? The load does not seem to be
> unbalanced, looking at the "Ratio" column.
>
> I wonder whether the observed behaviour is as expected, or whether this
> is a misconfiguration of the solver on our side. I played (quite a lot)
> with several parameter-value combinations, and the configuration above
> is the one that led to the fastest execution (from the ones tested,
> which might be incomplete; I can also provide further feedback if
> helpful). Any feedback that we can get from your experience in order to
> find the cause(s) of this issue and a mitigating solution will be of
> high added value.
>
> Thanks very much in advance!
> Best regards,
> Alberto.
>
> --
> Alberto F. Martín-Huertas
> Senior Researcher, PhD. Computational Science
> Centre Internacional de Mètodes Numèrics a l'Enginyeria (CIMNE)
> Parc Mediterrani de la Tecnologia, UPC
> Esteve Terradas 5, Building C3, Office 215,
> 08860 Castelldefels (Barcelona, Spain)
> Tel.: (+34) 9341 34223
> e-mail: amartin at cimne.upc.edu
>
> FEMPAR project co-founder
> web: http://www.fempar.org
>
> ________________
> IMPORTANT NOTICE
> All personal data contained on this mail will be processed
> confidentially and registered in a file property of CIMNE in order to
> manage corporate communications.
> You may exercise the rights of access, rectification, erasure and object by
> letter sent to Ed. C1 Campus Norte UPC. Gran Capitán s/n Barcelona.

-------------- next part -------------- An HTML attachment was scrubbed... URL:

From bsmith at mcs.anl.gov Wed Nov 7 13:00:05 2018
From: bsmith at mcs.anl.gov (Smith, Barry F.)
Date: Wed, 7 Nov 2018 19:00:05 +0000
Subject: [petsc-users] PetscBarrier from fortran
In-Reply-To: References: Message-ID: <8AC4EEA1-228F-44DA-A425-EB99B072B1BB@anl.gov>

This comes from the new type checking for Fortran code. I suggest you just call MPI_Barrier() using MPI_COMM_WORLD from Fortran.

Barry

> On Nov 6, 2018, at 11:37 PM, Marius Buerkle via petsc-users wrote:
>
> Hi
>
> When calling PetscBarrier from fortran using "call PetscBarrier(PETSC_NULL_MAT,ierr)" with the latest petsc version 3.10.2, I get the following error: "Error: Type mismatch in argument 'a' at (1); passed TYPE(tmat) to INTEGER(8)". I did not compile petsc with integer(8). It works with previous versions.
>
> best,
> Marius

From dave.mayhem23 at gmail.com Wed Nov 7 17:25:03 2018
From: dave.mayhem23 at gmail.com (Dave May)
Date: Wed, 7 Nov 2018 23:25:03 +0000
Subject: [petsc-users] need help with vector interpolation on nonuniform DMDA grids
In-Reply-To: References: <27356998-760A-431D-BB28-9FA41CA0ED40@fastwebnet.it> <669D57CD-9870-4A79-94C9-5258442FEF4E@fastwebnet.it> Message-ID:

On Tue, 6 Nov 2018 at 16:01, Matthew Knepley wrote:
> On Tue, Nov 6, 2018 at 9:45 AM Francesco Magaletti < francesco_magaletti at fastwebnet.it> wrote:
>
>> Hi Matt,
>> thanks for the reply, but I didn't catch your point exactly. What do you mean with "first bin the points into cells, and then use the
>> interpolant from the discretization"? To me it sounds like my "naive" approach, but maybe I missed something in your suggestion.
>
> I did not read the whole mail before. Your naive approach is exactly right. However, in parallel, you need to keep track of the bounds of each
> process, send the points to the right process, and then do local location and interpolation. It is some bookkeeping.
>
> Explaining this gave me another idea. I think that DMSwarm might do this automatically. You create a DMSwarm, stick in
> all the points you need to interpolate at, set the cellDM to the DMDA used for interpolation. Then you call DMSwarmMigrate()
> to put the points on the correct process, and then interpolate. Tell me if this does not work. I am Cc'ing the author Dave May
> in case I have made an error.

It will work using dmswarm, but not exactly as Matt describes (although I might have missed some details in your naive approach). Here is why I think Matt's exact definition is not quite what you want.

When you set the dmda on the swarm object, the swarm uses it for point location and selects a particular communication pattern for the data exchange. The point location indicates if a point was contained in the sub-domain. If the answer is no, the point is scattered to all ranks which are neighbours of the current sub-domain. The point location does not identify which sub-domain contains each point. Hence I think this comm pattern may not be sufficient for what you want to do (in general).

The alternative approach, which will work, is to not attach the dmda to the swarm. In this case, you are responsible for determining where each point should go and for assigning the target rank value to an internal swarm variable (textual name is "DMSwarm_rank"). In this mode you define the communication pattern, cf.
inferring it from the dmda decomposition. Here is what I would do:

* I'll assume you want to interpolate from dmda2 to dmda1, and you have fields Vec F2 and Vec F1 defined on dmda2 and dmda1.
* I'll assume that you ultimately want the interpolated field values to live in F1. If that is not the case, the procedure described below will be slightly different.
* Below is some pseudo code. I'll write things swarm[i]->"xxx" to indicate registered entries in the swarm named "xxx" which live in the i-th slot of the dmswarm's internal storage.

[1] Create a swarm and register the following fields
      "coor" (PetscReal)
      "dmda_index" (PetscInt)

[2] Get the sub-domain bounding boxes from dmda2 and broadcast these to all ranks, so that every rank knows the bounds of all sub-domains associated with dmda2.

[3] Set the local size of the swarm to match the number of locally owned points in dmda1. Do not include the ghost points. Copy all the coordinates from dmda1 into the registered swarm field "coor". Copy the global index of each point in dmda1 into the registered field "dmda_index".

[4] Get the internally registered dmswarm field named "DMSwarm_rank".
      for all points in the swarm, i
        swarm[i]->"DMSwarm_rank" = comm.rank
        for all bounding boxes of dmda2, b[k]
          if swarm[i]->"coor" is contained in b[k]
            swarm[i]->"DMSwarm_rank" = k
            break

[5] Call DMSwarmMigrate(swarm,PETSC_TRUE) and your data will get shipped to where you indicated it should go. Now all the points you want to interpolate to should be located within the correct sub-domains of dmda2. Note we will remove the sent points from the local dmswarm storage.

[6] Get the registered swarm fields "coor" and "dmda_index"
      VecZeroEntries(F1)
      for all points in the swarm, i
        locate which cell in dmda2 contains swarm[i]->"coor"
        /* interpolate F2 defined on dmda2 to swarm[i]->"coor" - call this F_interp */
        F_interp = .....
        VecSetValue(F1,swarm[i]->"dmda_index",F_interp,INSERT_VALUES);
      VecAssemblyBegin(F1)
      VecAssemblyEnd(F1)

Hope this helps. (A minimal C sketch of steps [1]-[5] follows below.)

Thanks,
Dave

> Thanks,
>
> Matt
>
>> My problem comes from the solution of a 1D time-evolving PDE coupled with a hand-made mesh adapting algorithm. Therefore I'm already using a DMDA to
>> manage all the communication issues among processors. Every n timesteps I move the grid and then I need to evaluate the old solution on the new grid
>> points; this is why I need an efficient way to do it. Is it feasible to let DMDA objects speak with DMPlexes, in order to keep the previous code
>> structure unaltered and use DMPlex only for the interpolation routine?
>>
>> Thanks
>>
>> On Nov 6, 2018, at 15:14, Matthew Knepley wrote:
>>
>> On Tue, Nov 6, 2018 at 5:11 AM Francesco Magaletti via petsc-users < petsc-users at mcs.anl.gov> wrote:
>>
>>> Dear all,
>>>
>>> I would like to ask you if there is an easy and efficient way in parallel (by using PETSc functions) to interpolate a DMDA vector associated
>>> with a nonuniform 1D grid to another DMDA vector with the same length but associated with a different nonuniform grid.
>>>
>>> Let me rephrase it as clearly as I can:
>>>
>>> I have two structured nonuniform 1D grids with coordinate vectors x[i] and y[i]. Both the domains have been discretized with the same number of
>>> points, but the coordinate vectors x and y are different. I have a discretized field u[i] = u(x[i]) and I would like to use these point values
>>> to evaluate the values u(y[i]) in the points of the second grid.
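For readers who want to try Dave's recipe, here is a minimal, untested C sketch of steps [1]-[5] (field registration, target-rank assignment, migration). The inputs nlocal, xlocal (the locally owned dmda1 coordinates), gidx (their global indices) and bbox (the 2*nranks array of dmda2 sub-domain bounds gathered beforehand with, e.g., MPI_Allgather) are assumptions the caller must provide; the "DMSwarm_rank" field name is the one Dave mentions, and the exact signatures and field datatypes should be checked against your PETSc version:

    #include <petscdm.h>
    #include <petscdmswarm.h>

    PetscErrorCode ShipPointsToOwners(MPI_Comm comm,PetscInt nlocal,
                                      const PetscReal xlocal[],const PetscInt gidx[],
                                      PetscMPIInt nranks,const PetscReal bbox[],
                                      DM *swarm_out)
    {
      DM             swarm;
      PetscReal      *coor;
      PetscInt       *gfield,*rankval; /* "DMSwarm_rank" appears to be a PetscInt field; verify */
      PetscInt       p,r;
      PetscMPIInt    myrank;
      PetscErrorCode ierr;

      PetscFunctionBeginUser;
      ierr = MPI_Comm_rank(comm,&myrank);CHKERRQ(ierr);
      /* [1] create the swarm and register "coor" and "dmda_index" */
      ierr = DMCreate(comm,&swarm);CHKERRQ(ierr);
      ierr = DMSetType(swarm,DMSWARM);CHKERRQ(ierr);
      ierr = DMSwarmInitializeFieldRegister(swarm);CHKERRQ(ierr);
      ierr = DMSwarmRegisterPetscDatatypeField(swarm,"coor",1,PETSC_REAL);CHKERRQ(ierr);
      ierr = DMSwarmRegisterPetscDatatypeField(swarm,"dmda_index",1,PETSC_INT);CHKERRQ(ierr);
      ierr = DMSwarmFinalizeFieldRegister(swarm);CHKERRQ(ierr);
      /* [3] one swarm point per locally owned dmda1 node (no ghosts) */
      ierr = DMSwarmSetLocalSizes(swarm,nlocal,4);CHKERRQ(ierr);
      ierr = DMSwarmGetField(swarm,"coor",NULL,NULL,(void**)&coor);CHKERRQ(ierr);
      ierr = DMSwarmGetField(swarm,"dmda_index",NULL,NULL,(void**)&gfield);CHKERRQ(ierr);
      for (p = 0; p < nlocal; p++) { coor[p] = xlocal[p]; gfield[p] = gidx[p]; }
      ierr = DMSwarmRestoreField(swarm,"dmda_index",NULL,NULL,(void**)&gfield);CHKERRQ(ierr);
      ierr = DMSwarmRestoreField(swarm,"coor",NULL,NULL,(void**)&coor);CHKERRQ(ierr);
      /* [4] pick the target rank from the gathered bounding boxes */
      ierr = DMSwarmGetField(swarm,"DMSwarm_rank",NULL,NULL,(void**)&rankval);CHKERRQ(ierr);
      for (p = 0; p < nlocal; p++) {
        rankval[p] = (PetscInt)myrank; /* default: the point stays here */
        for (r = 0; r < (PetscInt)nranks; r++) {
          if (xlocal[p] >= bbox[2*r] && xlocal[p] <= bbox[2*r+1]) { rankval[p] = r; break; }
        }
      }
      ierr = DMSwarmRestoreField(swarm,"DMSwarm_rank",NULL,NULL,(void**)&rankval);CHKERRQ(ierr);
      /* [5] migrate; sent points are removed from local storage */
      ierr = DMSwarmMigrate(swarm,PETSC_TRUE);CHKERRQ(ierr);
      *swarm_out = swarm;
      PetscFunctionReturn(0);
    }

Step [6] is then purely local on each receiving rank: for a nonuniform 1D grid, locating the cell that contains a received point is a binary search over the local coordinate array, followed by linear interpolation between the two surrounding nodes.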
>>>
>>> I read on the manual pages that functions like DMCreateInterpolation or similar work only with different but uniform DMDAs. Did I understand
>>> correctly?
>>>
>>> A naive approach, with a serial code, could be to find the points x[i] and x[i+1] that surround the point y[j] for every j and then simply linearly
>>> interpolate the values u[i] and u[i+1]. I suspect that this is not the most efficient way to do it. Moreover, it won't work in parallel since, in
>>> principle, I do not know beforehand how many ghost nodes could be necessary to perform all the interpolations.
>>>
>>> Thank you in advance for your help!
>>
>> This has not been written, but is not that hard. You would first bin the points into cells, and then use the
>> interpolant from the discretization. This is how we do it in the unstructured case. Actually, if you wrote
>> your nonuniform 1D meshes as DMPlexes, I think it would work right now (have not tested).
>>
>> Thanks,
>>
>> Matt
>>
>>> Francesco
>>
>> --
>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/

-------------- next part -------------- An HTML attachment was scrubbed... URL:

From edoardo.alinovi at gmail.com Thu Nov 8 02:40:44 2018
From: edoardo.alinovi at gmail.com (Edoardo alinovi)
Date: Thu, 8 Nov 2018 09:40:44 +0100
Subject: [petsc-users] [petsc-maint] Correct use of PCFactorSetMatOrderingType
In-Reply-To: References: Message-ID:

Hello Mark,

Yes, there are 5 KSP calls within a time-step (3 for the solution of the momentum equation + 2 for the solution of pressure); this is the classical non-iterative PISO by Issa (the exact sequence of operations is: solve momentum implicitly, solve pressure correction, momentum explicitly, pressure correction). The pressure-correction equation, which is similar to a Poisson equation for incompressible flows, is the one that determines the overall performance in my code, as in others. Usually, when the pressure is being solved for the second time, the solution is faster since there is a better initial guess and, as in my case, the preconditioner is not recomputed (a short sketch of how this is typically expressed with KSP follows below). Do you have any advice on a non-default multigrid configuration for this scenario, in order to increase performance?

I do not know if this may impact the performance drastically, but I am running on an E4 workstation with 16 Intel Xeon processors (2.3 GHz/12 MB cache) and 128 GB of RAM.

Thank you very much for your helpful comments,

Edoardo

------

Edoardo Alinovi, Ph.D.

DICCA, Scuola Politecnica
Universita' di Genova
1, via Montallegro
16145 Genova, Italy

email: edoardo.alinovi at dicca.unige.it
Tel: +39 010 353 2540

On Wed, Nov 7, 2018 at 17:59 Mark Adams wrote:
> please respond to petsc-users.
>
> You are doing 5 solves here in 14 seconds. You seem to be saying that the two pressure solves are taking all of this time. I don't know why the two
> solves are different.
>
> You seem to be saying that OpenFOAM solves the problem in 10 seconds and PETSc solves it in 14 seconds. Is that correct? Hypre seems to be running
> fine.
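On Edoardo's observation that the second pressure solve is cheaper: with PETSc's KSP this behaviour is typically requested explicitly. A minimal C sketch (assuming a KSP named ksp_p for the pressure-correction system, with ierr, b and x declared in the surrounding code; this is not Edoardo's actual code):

    /* reuse the pressure PC across the PISO stages and warm-start from x */
    ierr = KSPSetReusePreconditioner(ksp_p,PETSC_TRUE);CHKERRQ(ierr);
    ierr = KSPSetInitialGuessNonzero(ksp_p,PETSC_TRUE);CHKERRQ(ierr);
    ierr = KSPSolve(ksp_p,b,x);CHKERRQ(ierr);   /* prediction */
    /* ... assemble the corrector right-hand side into b ... */
    ierr = KSPSolve(ksp_p,b,x);CHKERRQ(ierr);   /* correction: same PC, previous x as guess */

The same effect is available from the command line with -ksp_reuse_preconditioner and -ksp_initial_guess_nonzero.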
> On Wed, Nov 7, 2018 at 11:24 AM Edoardo alinovi wrote:
>
>> Thanks a lot, Mark, for your kind reply. The solver is mine and I use PETSc for the solution of momentum and pressure. The first is solved very
>> fast by a standard bcgs + bjacobi, but the pressure is the source of all evils and, unfortunately, I am pretty sure that almost all the time within
>> the time-step is needed by KSP to solve the pressure (see log attached). I have verified this also by putting a couple of MPI_Wtime calls around the
>> KSPSolve call. The pressure is solved 2 times (1 prediction + 1 correction); the prediction takes around 11s, the correction around 4s (here I avoid
>> recomputing the preconditioner), and all the rest of the code (flux assembling + momentum solution + others) around 1s. OpenFOAM does the same procedure
>> with the same tolerance in 10s using its gamg version (50 iterations to converge). The number of iterations required to solve the pressure with hypre
>> is 12. GAMG performs similarly to hypre in terms of speed, but with 50 iterations to converge. Am I missing something in the setup, in your opinion?
>>
>> thanks a lot,
>>
>> Edo
>>
>> ------
>>
>> Edoardo Alinovi, Ph.D.
>>
>> DICCA, Scuola Politecnica
>> Universita' di Genova
>> 1, via Montallegro
>> 16145 Genova, Italy
>>
>> email: edoardo.alinovi at dicca.unige.it
>> Tel: +39 010 353 2540
>>
>> On Wed, Nov 7, 2018 at 16:50 Mark Adams wrote:
>>
>>> You can try -pc_type gamg, but hypre is a pretty good solver for the Laplacian. If hypre is just a little faster than LU on a 3D problem (that
>>> takes 10 seconds to solve) then AMG is not doing well. I would expect that AMG is taking a lot of iterations (eg, >> 10). You can check that with
>>> -ksp_monitor.
>>>
>>> The PISO algorithm is a multistage algorithm with a pressure correction in it. It also has a solve for the velocity, from what I can tell. Are you
>>> building PISO yourself and using PETSc just for the pressure correction? Are you sure the time is spent in this solver? You can use -log_view to see
>>> performance numbers and look for KSPSolve to see how much time is spent in the PETSc solver.
>>>
>>> Mark
>>>
>>> On Wed, Nov 7, 2018 at 10:26 AM Zhang, Hong via petsc-maint < petsc-maint at mcs.anl.gov> wrote:
>>>
>>>> Edoardo:
>>>> Forwarding your request to petsc-maint where you can get fast and expert advice. I do not have a suggestion for your application, but someone
>>>> on our team likely will.
>>>> Hong
>>>>
>>>> Hello Hong,
>>>>>
>>>>> Well, using -sub_pc_type lu is super slow. I am desperately trying to enhance the performance of my code (CFD, finite volume, PISO algorithm); in
>>>>> particular, I have a strong bottleneck in the solution of the pressure-correction equation, which takes almost 90% of the computational time. Using
>>>>> multigrid as preconditioner (hypre with default options) is slightly better, but comparing the results against the multigrid used in OpenFOAM,
>>>>> my code is losing 10s/iteration, which is a huge amount of time. Now, since all the time is spent in KSPSolve, I feel a bit powerless. Do you
>>>>> have any helpful advice?
>>>>>
>>>>> Thank you very much!
>>>>> ------
>>>>>
>>>>> Edoardo Alinovi, Ph.D.
>>>>>
>>>>> DICCA, Scuola Politecnica
>>>>> Universita' di Genova
>>>>> 1, via Montallegro
>>>>> 16145 Genova, Italy
>>>>>
>>>>> email: edoardo.alinovi at dicca.unige.it
>>>>> Tel: +39 010 353 2540
>>>>>
>>>>> On Tue, Nov 6, 2018 at 17:15 Zhang, Hong < hzhang at mcs.anl.gov> wrote:
>>>>>
>>>>>> Edoardo:
>>>>>> Interesting. I thought it would not affect performance much. What happens if you use '-sub_pc_type lu'?
>>>>>> Hong
>>>>>>
>>>>>> Dear Hong and Matt,
>>>>>>>
>>>>>>> thank you for your kind reply. I have just tested your suggestions and applied "-sub_pc_type ilu -sub_pc_factor_mat_ordering_type nd/rcm"
>>>>>>> and, in both cases, I have found a deterioration of performance with respect to doing nothing (thus just using the default PCBJACOBI). Is that
>>>>>>> normal? However, I guess this is very problem dependent.
>>>>>>> ------
>>>>>>>
>>>>>>> Edoardo Alinovi, Ph.D.
>>>>>>>
>>>>>>> DICCA, Scuola Politecnica
>>>>>>> Universita' di Genova
>>>>>>> 1, via Montallegro
>>>>>>> 16145 Genova, Italy
>>>>>>>
>>>>>>> email: edoardo.alinovi at dicca.unige.it
>>>>>>> Tel: +39 010 353 2540
>>>>>>>
>>>>>>> On Tue, Nov 6, 2018 at 16:04 Zhang, Hong < hzhang at mcs.anl.gov> wrote:
>>>>>>>
>>>>>>>> Edoardo:
>>>>>>>> You can test the runtime option '-sub_pc_factor_mat_ordering_type' and use '-log_view' to get performance on different orderings,
>>>>>>>> e.g., petsc/src/ksp/ksp/examples/tutorials/ex2.c:
>>>>>>>> mpiexec -n 2 ./ex2 -ksp_view -sub_pc_type ilu -sub_pc_factor_mat_ordering_type nd
>>>>>>>>
>>>>>>>> I do not think the ordering inside the block for ILU would affect performance much. Let us know what you get.
>>>>>>>> Hong
>>>>>>>>
>>>>>>>> Dear users,
>>>>>>>>>
>>>>>>>>> I have a question about the correct use of the option "PCFactorSetMatOrderingType" in PETSc.
>>>>>>>>>
>>>>>>>>> I am solving a problem with 2.5M unknowns distributed across 16 processors, and I am using the block Jacobi preconditioner and the MPIAIJ
>>>>>>>>> matrix format. I cannot figure out if the above option can be useful or not in decreasing the computational time. Any suggestions or tips?
>>>>>>>>>
>>>>>>>>> Thank you very much for the kind help
>>>>>>>>>
>>>>>>>>> ------
>>>>>>>>>
>>>>>>>>> Edoardo Alinovi, Ph.D.
>>>>>>>>>
>>>>>>>>> DICCA, Scuola Politecnica
>>>>>>>>> Universita' di Genova
>>>>>>>>> 1, via Montallegro
>>>>>>>>> 16145 Genova, Italy
>>>>>>>>>
>>>>>>>>> email: edoardo.alinovi at dicca.unige.it
>>>>>>>>> Tel: +39 010 353 2540

-------------- next part -------------- An HTML attachment was scrubbed... URL:

From francesco_magaletti at fastwebnet.it Thu Nov 8 02:43:10 2018
From: francesco_magaletti at fastwebnet.it (Francesco Magaletti)
Date: Thu, 8 Nov 2018 09:43:10 +0100
Subject: [petsc-users] need help with vector interpolation on nonuniform DMDA grids
In-Reply-To: References: <27356998-760A-431D-BB28-9FA41CA0ED40@fastwebnet.it> <669D57CD-9870-4A79-94C9-5258442FEF4E@fastwebnet.it> Message-ID:

Dear Matt & Dave,

I really appreciate your support! I was not aware of the existence of the DMSwarm object and it has been a really exciting discovery, since in our research group we have people working with PIC-type methods and I suppose that they will be grateful for your work, Dave! Concerning my problem, I will give your suggestions a try! I'll write you back if I miss some technical points when using the new features of DMSwarm.
Thank you again, Francesco From tempohoper at gmail.com Thu Nov 8 03:54:49 2018 From: tempohoper at gmail.com (Sal Am) Date: Thu, 8 Nov 2018 09:54:49 +0000 Subject: [petsc-users] Vec, Mat and binaryfiles. In-Reply-To: <87muqkna5e.fsf@jedbrown.org> References: <87r2fzpj0m.fsf@jedbrown.org> <87muqkna5e.fsf@jedbrown.org> Message-ID: Thanks, I missed that, I do care about scalability, the long term plan is to make it fully compatible on a cluster so I guess I would need to convert it to PETSc binary format before reading it in. Is this the file you were referring to https://github.com/erdc/petsc-dev/blob/master/bin/pythonscripts/PetscBinaryIO.py Cannot seem to find docs on how to use it, do you happen to have any examples? All the best, S On Wed, Nov 7, 2018 at 4:07 PM Jed Brown wrote: > Please always use "reply-all" so that your messages go to the list. > This is standard mailing list etiquette. It is important to preserve > threading for people who find this discussion later and so that we do > not waste our time re-answering the same questions that have already > been answered in private side-conversations. You'll likely get an > answer faster that way too. > > Sal Am writes: > > > Thank you Jed for the quick response! > > > >> > >> Yes, of course the formats would have to match. I would recommend > >> writing the files in an existing format such as PETSc's binary format. > >> > > > > Unfortunately I do not think I can change the source code to output those > > two files in PETSc format and it would probably take a very long time > > converting everything into PETSc (it is not even my code). > > File-based workflows are very often bottlenecks. You can also use any > convenient software (e.g., Python or Matlab/Octave) to convert your > custom binary formats to PETSc binary format (see PetscBinaryIO provided > with PETSc), at which point you'll be able to read in parallel. If you > don't care about scalability or only need to read once, then you can > write code of the type you propose. > > > Do you have any other suggestions on how to read in those two complex > > binary files? Also why would it be more difficult to parallelise as > > I thought getting the two files in PETSc vector format would allow me to > > use the rest of PETSc library the usual way? > > > > Kind regards, > > S > > > > > > On Mon, Nov 5, 2018 at 4:49 PM Jed Brown wrote: > > > >> Sal Am via petsc-users writes: > >> > >> > Hi, > >> > > >> > I am trying to solve a Ax=b complex system. the vector b and "matrix" > A > >> are > >> > both binary and NOT created by PETSc. So I keep getting error messages > >> that > >> > they are not correct format when I read the files with > >> PetscViewBinaryOpen, > >> > after some digging it seems that one cannot just read a binary file > that > >> > was created by another software. > >> > >> Yes, of course the formats would have to match. I would recommend > >> writing the files in an existing format such as PETSc's binary format. > >> While the method you describe can be made to work, it will be more work > >> to make it parallel. > >> > >> > How would I go on to solve this problem? > >> > > >> > More info and trials: > >> > > >> > "matrix" A consists of two files, one that contains row column index > >> > numbers and one that contains the non-zero values. So what I would > have > >> to > >> > do is multiply the last term in a+b with PETSC_i to get a real + > >> imaginary > >> > vector A. 
>
> vector b is in binary, so what I have done so far (not sure if it works) is:
>
> std::ifstream input("Vector_b.bin", std::ios::binary );
> while (input.read(reinterpret_cast<char*>(&v), sizeof(float)))
>     ierr = VecSetValues(u,1,&iglobal,&v,INSERT_VALUES);CHKERRQ(ierr);
>
> where v is a PetscScalar.
>
> Once I am able to read both matrices I think I can figure out the solvers to solve the system.
>
> All the best,
> S

-------------- next part -------------- An HTML attachment was scrubbed... URL:

From knepley at gmail.com Thu Nov 8 04:09:54 2018
From: knepley at gmail.com (Matthew Knepley)
Date: Thu, 8 Nov 2018 05:09:54 -0500
Subject: [petsc-users] Vec, Mat and binaryfiles.
In-Reply-To: References: <87r2fzpj0m.fsf@jedbrown.org> <87muqkna5e.fsf@jedbrown.org> Message-ID:

On Thu, Nov 8, 2018 at 4:56 AM Sal Am via petsc-users < petsc-users at mcs.anl.gov> wrote:

> Thanks, I missed that,
>
> I do care about scalability; the long-term plan is to make it fully compatible on a cluster, so I guess I would need to convert it to PETSc
> binary format before reading it in.
>
> Is this the file you were referring to
> https://github.com/erdc/petsc-dev/blob/master/bin/pythonscripts/PetscBinaryIO.py
>
> Cannot seem to find docs on how to use it, do you happen to have any examples?

The documentation is at the top of the file. You can see it by clicking the link.

  Matt

> All the best,
> S
>
> On Wed, Nov 7, 2018 at 4:07 PM Jed Brown wrote:
>
>> Please always use "reply-all" so that your messages go to the list. This is standard mailing list etiquette. It is important to preserve
>> threading for people who find this discussion later and so that we do not waste our time re-answering the same questions that have already
>> been answered in private side-conversations. You'll likely get an answer faster that way too.
>>
>> Sal Am writes:
>>
>> > Thank you Jed for the quick response!
>> >
>> >> Yes, of course the formats would have to match. I would recommend
>> >> writing the files in an existing format such as PETSc's binary format.
>> >
>> > Unfortunately I do not think I can change the source code to output those two files in PETSc format and it would probably take a very long time
>> > converting everything into PETSc (it is not even my code).
>>
>> File-based workflows are very often bottlenecks. You can also use any convenient software (e.g., Python or Matlab/Octave) to convert your
>> custom binary formats to PETSc binary format (see PetscBinaryIO provided with PETSc), at which point you'll be able to read in parallel. If you
>> don't care about scalability or only need to read once, then you can write code of the type you propose.
>>
>> > Do you have any other suggestions on how to read in those two complex binary files? Also, why would it be more difficult to parallelise, as
>> > I thought getting the two files in PETSc vector format would allow me to use the rest of the PETSc library the usual way?
>> >
>> > Kind regards,
>> > S
>> >
>> > On Mon, Nov 5, 2018 at 4:49 PM Jed Brown wrote:
>> >
>> >> Sal Am via petsc-users writes:
>> >>
>> >> > Hi,
>> >> >
>> >> > I am trying to solve an Ax=b complex system. The vector b and "matrix" A are both binary and NOT created by PETSc.
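For reference, once the two files have been converted to PETSc binary format (e.g., with the PetscBinaryIO Python module linked above), reading them in parallel needs only a few calls. A minimal C sketch, with assumed file names "A.petsc" and "b.petsc" and a PETSc build configured for complex scalars (as this problem requires):

    #include <petscmat.h>

    /* each rank receives its own portion of A and b, so the read is parallel */
    Mat         A;
    Vec         b;
    PetscViewer viewer;

    ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"A.petsc",FILE_MODE_READ,&viewer);CHKERRQ(ierr);
    ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
    ierr = MatSetFromOptions(A);CHKERRQ(ierr);
    ierr = MatLoad(A,viewer);CHKERRQ(ierr);
    ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

    ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"b.petsc",FILE_MODE_READ,&viewer);CHKERRQ(ierr);
    ierr = VecCreate(PETSC_COMM_WORLD,&b);CHKERRQ(ierr);
    ierr = VecSetFromOptions(b);CHKERRQ(ierr);
    ierr = VecLoad(b,viewer);CHKERRQ(ierr);
    ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);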
>> >> > So I keep getting error messages that they are not in the correct format when I read the files with PetscViewerBinaryOpen;
>> >> > after some digging it seems that one cannot just read a binary file that was created by other software.
>> >>
>> >> Yes, of course the formats would have to match. I would recommend writing the files in an existing format such as PETSc's binary format.
>> >> While the method you describe can be made to work, it will be more work to make it parallel.
>> >>
>> >> > How would I go on to solve this problem?
>> >> >
>> >> > More info and trials:
>> >> >
>> >> > "matrix" A consists of two files, one that contains row-column index numbers and one that contains the non-zero values. So what I would have to
>> >> > do is multiply the last term in a+b with PETSC_i to get a real + imaginary vector A.
>> >> >
>> >> > vector b is in binary, so what I have done so far (not sure if it works) is:
>> >> >
>> >> > std::ifstream input("Vector_b.bin", std::ios::binary );
>> >> > while (input.read(reinterpret_cast<char*>(&v), sizeof(float)))
>> >> >     ierr = VecSetValues(u,1,&iglobal,&v,INSERT_VALUES);CHKERRQ(ierr);
>> >> >
>> >> > where v is a PetscScalar.
>> >> >
>> >> > Once I am able to read both matrices I think I can figure out the solvers to solve the system.
>> >> >
>> >> > All the best,
>> >> > S

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

-------------- next part -------------- An HTML attachment was scrubbed... URL:

From amartin at cimne.upc.edu Thu Nov 8 05:41:24 2018
From: amartin at cimne.upc.edu (Alberto F. Martín)
Date: Thu, 08 Nov 2018 12:41:24 +0100
Subject: [petsc-users] PETSc (3.9.0) GAMG weak scaling test issue
In-Reply-To: References: <5BE30C87.10204@cimne.upc.edu> Message-ID: <5BE420E4.8020800@cimne.upc.edu>

Dear Mark,

thanks for your quick and comprehensive reply.

Before moving to the results of the experiments that you suggested, let me clarify two points on my original e-mail and your answer:

(1) The raw timings and #iters. provided in my first e-mail were actually obtained with "-pc_gamg_square_graph 1" (and not 0); sorry about that, my mistake.
    (the logs, though, were consistent with the solver configuration provided).
    The raw figures with "-pc_gamg_square_graph 0" are actually as follows:

    (load3): [0.25074561, 0.3650926566, 0.6251466936, 0.8709517661, 15.52180776]
    (load3): [0.148803731, 0.325266364, 0.5538515123, 0.7537377281, 1.475100923]
    (load3): [8, 9, 11, 12, 12]

    Bottom line: significant improvement of the absolute times for the first 4x problems, marginal improvement for the largest problem (compared to "-pc_gamg_square_graph 1").

(2) <<The PC setup times are large (I see 48 seconds at 16K but you report 16). -pc_gamg_square_graph 10 should help that.>>

    This disagreement is justified by the following note on my original e-mail:

    <<Please note that within each run, I execute these two stages up to three times, and this influences the absolute timings given in -log_view.>>

I tried new configurations based on your suggestions. Find attached the results (legends indicate changes with respect to the solver configuration provided in my first e-mail).

Bottom lines: (1) the configuration provided in my original e-mail leads to the fastest execution and the lowest number of iterations for the first 4x problems. (2) The (new) parameter-value combinations suggested seem to have almost no impact on the preconditioner set-up time of the last problem.

I also tried HYPRE-BoomerAMG as suggested, with two different configurations.
*** SYMMETRIC CONFIGURATION *** -ksp_type cg -ksp_monitor -ksp_rtol 1.0e-6 -ksp_converged_reason -ksp_max_it 500 -ksp_norm_type unpreconditioned -ksp_view -log_view -pc_type hypre -pc_hypre_type boomeramg -pc_hypre_boomeramg_print_statistics 1 -pc_hypre_boomeramg_strong_threshold 0.25 -pc_hypre_boomeramg_coarsen_type HMIS -pc_hypre_boomeramg_relax_type_down symmetric-SOR/Jacobi -pc_hypre_boomeramg_relax_type_up symmetric-SOR/Jacobi -pc_hypre_boomeramg_relax_type_coarse Gaussian-elimination *** UNSYMMETRIC CONFIGURATION *** -ksp_type gmres -ksp_gmres_restart 500 -ksp_monitor -ksp_rtol 1.0e-6 -ksp_converged_reason -ksp_max_it 500 -ksp_pc_side right -ksp_norm_type unpreconditioned -pc_type hypre -pc_hypre_type boomeramg -pc_hypre_boomeramg_print_statistics 1 -pc_hypre_boomeramg_strong_threshold 0.25 -pc_hypre_boomeramg_coarsen_type HMIS -pc_hypre_boomeramg_relax_type_down SOR/Jacobi -pc_hypre_boomeramg_relax_type_up SOR/Jacobi -pc_hypre_boomeramg_relax_type_coarse Gaussian-elimination The raw results were: *** SYMMETRIC CONFIGURATION *** (load3): [0.1828534687, 0.3055133289, 0.3582984209, 0.4280304033, 1.343549139] (load3): [0.2102472978, 0.4572948301, 0.7153297188, 0.9989531627, N/A] (load3): [19, 23, 26, 28, 'DIVERGED_INDEFINITE_PC'] *** UNSYMMETRIC CONFIGURATION *** (load3): [0.1841227429, 0.3082743008, 0.3652294828, 0.4654760892, 1.331299786] (load3): [0.1194557019, 0.2830136018, 0.5046830242, 1.363314636, N/A] (load3): [15, 19, 24, 48, DIVERGED_ITS] Thus, the largest problem also seems to cause (even more severe) issues to HYPRE, in particular, INDEFINITE PRECONDITIONER with CG, and not convergence within 500 iterations for GMRES. The preconditioner set up stage time, though, scales reasonably well with the same data distribution that we used to feed GAMG (although the preconditioner computed for the largest problem seems to be totally useless). I have logs for all these results if required. Thanks for your help! Best regards, Alberto. On 07/11/18 19:46, Mark Adams wrote: > First I would add -gamg_est_ksp_type cg > > You seem to be converging well so I assume you are setting the null > space for GAMG. > > Note, you should test hypre also. > > You probably want a bigger "-pc_gamg_process_eq_limit 50". 200 at > least but you test your machine with a range on the largest problem. > This is a parameter for reducing the number of active processors (on > coarse grids). > > I would only worry about "load3". This has 16K equations per process, > which is where you start noticing "strong scaling" problems, depending > on the machine. > > An important parameter is "-pc_gamg_square_graph 0". I would probably > start with infinity (eg, 10). > > Now, I'm not sure about your domain, problem sizes, and thus the weak > scaling design. You seem to be scaling on the background mesh, but > that may not be a good proxy for complexity. > > You can look at the number of flops and scale it appropriately by the > number of solver iterations to get a relative size of the problem. I > would recommend scaling the number of processors with this. 
For > instance here the MatMult line for the 4 proc and 16K proc run: > > ------------------------------------------------------------------------------------------------------------------------ > Event Count Time (sec) Flop --- Global --- --- Stage --- > Total > Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F > %M %L %R %T %F %M %L %R Mflop/s > ------------------------------------------------------------------------------------------------------------------------ > MatMult 636 1.0 1.9035e-01 1.0 3.12e+08 1.1 7.6e+03 3.0e+03 0.0e+00 0 > 47 62 44 0 0 47 62 44 0 6275 [2 procs] > MatMult 1416 1.0 1.9601e+002744.6 4.82e+08 0.0 4.3e+08 7.2e+02 > 0.0e+00 0 48 50 48 0 0 48 50 48 0 2757975 [16K procs] > > Now, you have empty processors. See the massive load imbalance on time > and the zero on Flops. The "Ratio" is max/min and cleary min=0 so > PETSc reports a ratio of 0 (it is infinity really). > > Also, weak scaling on a thin body (I don't know your domain) is a > little funny because as the problem scales up the mesh becomes more 3D > and this causes the cost per equation to go up. That is why I prefer > to use the number of non-zeros as the processor scaling function but > number of equations is easier ... > > The PC setup times are large (I see 48 seconds at 16K bu you report > 16). -pc_gamg_square_graph 10 should help that. > > The max number of flops per processor in MatMult goes up by 50% and > the max time goes up by 10x and the number of iterations goes up by > 13/8. If I put all of this together I get that 75% of the time at 16K > is in communication at 16K. I think that and the absolute time can be > improved some by optimizing parameters as I've suggested. > > Mark > > > > > > On Wed, Nov 7, 2018 at 11:03 AM "Alberto F. Mart?n" via petsc-users > > wrote: > > Dear All, > > we are performing a weak scaling test of the PETSc (v3.9.0) GAMG > preconditioner when applied to the linear system arising > from the *conforming unfitted FE discretization *(using Q1 > Lagrangian FEs) of a 3D PDE Poisson problem, where > the boundary of the domain (a popcorn flake) is described as a > zero-level-set embedded within a uniform background > (Cartesian-like) hexahedral mesh. Details underlying the FEM > formulation can be made available on demand if you > believe that this might be helpful, but let me just point out that > it is designed such that it addresses the well-known > ill-conditioning issues of unfitted FE discretizations due to the > small cut cell problem. > > The weak scaling test is set up as follows. We start from a single > cube background mesh, and refine it uniformly several > steps, until we have approximately either 10**3 (load1), 20**3 > (load2), or 40**3 (load3) hexahedra/MPI task when > distributing it over 4 MPI tasks. The benchmark is scaled such > that the next larger scale problem to be tested is obtained > by uniformly refining the mesh from the previous scale and running > it on 8x times the number of MPI tasks that we used > in the previous scale. As a result, we obtain three weak scaling > curves for each of the three fixed loads per MPI task > above, on the following total number of MPI tasks: 4, 32, 262, > 2097, 16777. The underlying mesh is not partitioned among > MPI tasks using ParMETIS (unstructured multilevel graph > partitioning) nor optimally by hand, but following the so-called > z-shape space-filling curves provided by an underlying octree-like > mesh handler (i.e., p4est library). 
> > I configured the preconditioned linear solver as follows: > > -ksp_type cg > -ksp_monitor > -ksp_rtol 1.0e-6 > -ksp_converged_reason > -ksp_max_it 500 > -ksp_norm_type unpreconditioned > -ksp_view > -log_view > > -pc_type gamg > -pc_gamg_type agg > -mg_levels_esteig_ksp_type cg > -mg_coarse_sub_pc_type cholesky > -mg_coarse_sub_pc_factor_mat_ordering_type nd > -pc_gamg_process_eq_limit 50 > -pc_gamg_square_graph 0 > -pc_gamg_agg_nsmooths 1 > > Raw timings (in seconds) of the preconditioner set up and PCG > iterative solution stage, and number of iterations are as follows: > > **preconditioner set up** > (load1): [0.02542160451, 0.05169247743, 0.09266782179, > 0.2426272957, 13.64161944] > (load2): [0.1239175797 , 0.1885528499 , 0.2719282564 , > 0.4783878336, 13.37947339] > (load3): [0.6565349903 , 0.9435049873 , 1.299908397 , > 1.916243652 , 16.02904088] > > **PCG stage** > (load1): [0.003287350759, 0.008163803257, 0.03565631993, > 0.08343045413, 0.6937994603] > (load2): [0.0205939794 , 0.03594723623 , 0.07593298424, > 0.1212046621 , 0.6780373845] > (load3): [0.1310882876 , 0.3214917686 , 0.5532023879 , > 0.766881627 , 1.485446003] > > **number of PCG iterations** > (load1): [5, 8, 11, 13, 13] > (load2): [7, 10, 12, 13, 13] > (load3): [8, 10, 12, 13, 13] > > It can be observed that both the number of linear solver > iterations and the PCG stage timings (weakly) > scale remarkably, but t*here is a significant time increase when > scaling the problem from 2097 to 16777 MPI tasks ** > **for the preconditioner setup stage* (e.g., 1.916243652 vs > 16.02904088 sec. with 40**3 cells per MPI task). > I gathered the combined output of -ksp_view and -log_view (only) > for all the points involving the load3 weak scaling > test (find them attached to this message). Please note that within > each run, I execute the these two stages up-to > three times, and this influences absolute timings given in -log_view. > > Looking at the output of -log_view, it is very strange to me, > e.g., that the stage labelled as "Graph" > does not scale properly as it is just a call to MatDuplicate if > the block size of the matrix is 1 (our case), and > I guess that it is just a local operation that does not require > any communication. > What I am missing here? The load does not seem to be unbalanced > looking at the "Ratio" column. > > I wonder whether the observed behaviour is as expected, or this a > miss-configuration of the solver from our side. > I played (quite a lot) with several parameter-value combinations, > and the configuration above is the one that led to fastest > execution (from the ones tested, that might be incomplete, I can > also provide further feedback if helpful). > Any feedback that we can get from your experience in order to find > the cause(s) of this issue and a mitigating solution > will be of high added value. > > Thanks very much in advance! > Best regards, > Alberto. > > -- > Alberto F. Mart?n-Huertas > Senior Researcher, PhD. Computational Science > Centre Internacional de M?todes Num?rics a l'Enginyeria (CIMNE) > Parc Mediterrani de la Tecnologia, UPC > Esteve Terradas 5, Building C3, Office 215, > 08860 Castelldefels (Barcelona, Spain) > Tel.: (+34) 9341 34223 > e-mail:amartin at cimne.upc.edu > > FEMPAR project co-founder > web:http://www.fempar.org > > ________________ > IMPORTANT NOTICE > All personal data contained on this mail will be processed confidentially and registered in a file property of CIMNE in > order to manage corporate communications. 
You may exercise the rights of access, rectification, erasure and object by > letter sent to Ed. C1 Campus Norte UPC. Gran Capit?n s/n Barcelona. > -- Alberto F. Mart?n-Huertas Senior Researcher, PhD. Computational Science Centre Internacional de M?todes Num?rics a l'Enginyeria (CIMNE) Parc Mediterrani de la Tecnologia, UPC Esteve Terradas 5, Building C3, Office 215, 08860 Castelldefels (Barcelona, Spain) Tel.: (+34) 9341 34223 e-mail:amartin at cimne.upc.edu FEMPAR project co-founder web: http://www.fempar.org ________________ IMPORTANT NOTICE All personal data contained on this mail will be processed confidentially and registered in a file property of CIMNE in order to manage corporate communications. You may exercise the rights of access, rectification, erasure and object by letter sent to Ed. C1 Campus Norte UPC. Gran Capit?n s/n Barcelona. -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- (A) ** Added NearNullSpace to matrix (i.e. the constant vector) ** (load3): [0.2512322953, 0.3657070249, 0.6209384622, 0.8898622398, 16.37409958] (load3): [0.1474562958, 0.3245896269, 0.551462595 , 0.7768286369, 1.563904478] (load3): [8, 9, 11, 12, 12] (B) ** (A) + -gamg_est_ksp_type cg** (load3): [0.2532081502, 0.3669248847, 0.6215682998, 0.9122101571, 15.82921874] (load3): [0.1476225629, 0.3242742592, 0.5494060389, 0.793106758, 1.541510889] (load3): [8, 9, 11, 12, 12] (C) ** (B) + -pc_gamg_square_graph 10** (load3): [0.7063658834, 1.045530763, 1.403756126, 1.903321964, 16.91176975] (load3): [0.1308690757, 0.3190896986, 0.5635806862, 0.790503782, 1.528392129] (load3): [8, 10, 12, 14, 15] (D) ** (C) + -pc_gamg_process_eq_limit 200** (load3): [0.7066891911, 1.041900044, 1.438325046, 2.154289208, 15.54656001] (load3): [0.1325668963, 0.3205731977, 0.5486685866, 0.8334027417, 1.485407834] (load3): [8, 10, 12, 14, 15] (E) ** (C) + -pc_gamg_process_eq_limit 500** (load3): [0.7349723065, 1.084142983, 1.562717193, 2.198781526, 16.83547859] (load3): [0.1336050248, 0.3177526584, 0.5764533961, 0.8126104074, 1.661861523] (load3): [8, 10, 12, 14, 15] (F) ** (C) + -pc_gamg_process_eq_limit 1000** (3, 'a0b0c0d0e0f0g0h0i0'): [0.739308523, 1.117045472, 1.54470065, 2.845281176, 16.66935678] (3, 'a0b0c0d0e0f0g0h0i0'): [0.1373377964, 0.3255409142, 0.5619245535, 0.8124665194, 1.660140919] (3, 'a0b0c0d0e0f0g0h0i0'): [8, 10, 12, 13, 15] From knepley at gmail.com Thu Nov 8 06:01:13 2018 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 8 Nov 2018 07:01:13 -0500 Subject: [petsc-users] PETSc (3.9.0) GAMG weak scaling test issue In-Reply-To: <5BE420E4.8020800@cimne.upc.edu> References: <5BE30C87.10204@cimne.upc.edu> <5BE420E4.8020800@cimne.upc.edu> Message-ID: On Thu, Nov 8, 2018 at 6:41 AM "Alberto F. Mart?n" via petsc-users < petsc-users at mcs.anl.gov> wrote: > Dear Mark, > > thanks for your quick and comprehensive reply. > > Before moving to the results of the experiments that u suggested, let me > clarify two points > on my original e-mail and your answer: > > (1) The raw timings and #iters. provided in my first e-mail were actually > obtained with "-pc_gamg_square_graph 1" (and not 0); sorry about > that, my mistake. > (the logs, though, were consistent with the solver configuration > provided). 
> The raw figures with "-pc_gamg_square_graph 0" are actually as > follows: > > (load3): [0.25074561, 0.3650926566, 0.6251466936, 0.8709517661, > 15.52180776] > (load3): [0.148803731, 0.325266364, 0.5538515123, 0.7537377281, > 1.475100923] > (load3): [8, 9, 11, 12, 12] > > Bottom line: significant improvement of absolute times for the first > 4x problems, marginal improvement for > the largest problem (compared to > "-pc_gamg_square_graph 1") > > (2) <<*The PC setup times are large (I see 48 seconds at 16K but you > report 16). * > * -pc_gamg_square_graph 10 should help that.*>> > > This disagreement is justified by the following note on my original > e-mail: > > <<*Please note that within each run, I execute these two > stages up-to* > * three times, and this influences absolute timings given in > -log_view.*>> > > I tried new configurations based on your suggestions. Find attached the > results. > (legends indicate changes with respect to the solver configuration > provided > in my first e-mail). > > Bottom lines: (1) the configuration provided in my original e-mail leads > to fastest execution > and less number of iteration for the first 4x problems. (2) *The (new) > parameter-value combinations* > *suggested seem to have almost no impact into the preconditioner set up > time of the last problem.* > Mark, could this bad setup just be non-scalability in ParMetis? How do we see the ParMetis time? Thanks, Matt > > I also tried HYPRE-BoomerAMG as suggested, with two different > configurations. > > *** SYMMETRIC CONFIGURATION *** > -ksp_type cg > -ksp_monitor > -ksp_rtol 1.0e-6 > -ksp_converged_reason > -ksp_max_it 500 > -ksp_norm_type unpreconditioned > -ksp_view > -log_view > > -pc_type hypre > -pc_hypre_type boomeramg > -pc_hypre_boomeramg_print_statistics 1 > -pc_hypre_boomeramg_strong_threshold 0.25 > -pc_hypre_boomeramg_coarsen_type HMIS > -pc_hypre_boomeramg_relax_type_down symmetric-SOR/Jacobi > -pc_hypre_boomeramg_relax_type_up symmetric-SOR/Jacobi > -pc_hypre_boomeramg_relax_type_coarse Gaussian-elimination > > *** UNSYMMETRIC CONFIGURATION *** > -ksp_type gmres > -ksp_gmres_restart 500 > -ksp_monitor > -ksp_rtol 1.0e-6 > -ksp_converged_reason > -ksp_max_it 500 > -ksp_pc_side right > -ksp_norm_type unpreconditioned > > -pc_type hypre > -pc_hypre_type boomeramg > -pc_hypre_boomeramg_print_statistics 1 > -pc_hypre_boomeramg_strong_threshold 0.25 > -pc_hypre_boomeramg_coarsen_type HMIS > -pc_hypre_boomeramg_relax_type_down SOR/Jacobi > -pc_hypre_boomeramg_relax_type_up SOR/Jacobi > -pc_hypre_boomeramg_relax_type_coarse Gaussian-elimination > > The raw results were: > > *** SYMMETRIC CONFIGURATION *** > > (load3): [0.1828534687, 0.3055133289, 0.3582984209, 0.4280304033, > 1.343549139] > (load3): [0.2102472978, 0.4572948301, 0.7153297188, 0.9989531627, N/A] > (load3): [19, 23, 26, 28, 'DIVERGED_INDEFINITE_PC'] > > *** UNSYMMETRIC CONFIGURATION *** > > (load3): [0.1841227429, 0.3082743008, 0.3652294828, 0.4654760892, > 1.331299786] > (load3): [0.1194557019, 0.2830136018, 0.5046830242, 1.363314636, N/A] > (load3): [15, 19, 24, 48, DIVERGED_ITS] > > Thus, the largest problem also seems to cause (even more severe) issues to > HYPRE, in particular, > INDEFINITE PRECONDITIONER with CG, and not convergence within 500 > iterations for GMRES. > The preconditioner set up stage time, though, scales reasonably well with > the same data distribution > that we used to feed GAMG (although the preconditioner computed for the > largest problem seems to be > totally useless). 
> > I have logs for all these results if required. > > Thanks for your help! > Best regards, > Alberto. > > > > On 07/11/18 19:46, Mark Adams wrote: > > First I would add -gamg_est_ksp_type cg > > You seem to be converging well so I assume you are setting the null space > for GAMG. > > Note, you should test hypre also. > > You probably want a bigger "-pc_gamg_process_eq_limit 50". 200 at least > but you test your machine with a range on the largest problem. This is a > parameter for reducing the number of active processors (on coarse grids). > > I would only worry about "load3". This has 16K equations per process, > which is where you start noticing "strong scaling" problems, depending on > the machine. > > An important parameter is "-pc_gamg_square_graph 0". I would probably > start with infinity (eg, 10). > > Now, I'm not sure about your domain, problem sizes, and thus the weak > scaling design. You seem to be scaling on the background mesh, but that may > not be a good proxy for complexity. > > You can look at the number of flops and scale it appropriately by the > number of solver iterations to get a relative size of the problem. I would > recommend scaling the number of processors with this. For instance here the > MatMult line for the 4 proc and 16K proc run: > > > ------------------------------------------------------------------------------------------------------------------------ > Event Count Time (sec) Flop > --- Global --- --- Stage --- Total > Max Ratio Max Ratio Max Ratio Mess Avg len > Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s > > ------------------------------------------------------------------------------------------------------------------------ > MatMult 636 1.0 1.9035e-01 1.0 3.12e+08 1.1 7.6e+03 3.0e+03 > 0.0e+00 0 47 62 44 0 0 47 62 44 0 6275 [2 procs] > MatMult 1416 1.0 1.9601e+002744.6 4.82e+08 0.0 4.3e+08 > 7.2e+02 0.0e+00 0 48 50 48 0 0 48 50 48 0 2757975 [16K procs] > > Now, you have empty processors. See the massive load imbalance on time > and the zero on Flops. The "Ratio" is max/min and cleary min=0 so PETSc > reports a ratio of 0 (it is infinity really). > > Also, weak scaling on a thin body (I don't know your domain) is a little > funny because as the problem scales up the mesh becomes more 3D and this > causes the cost per equation to go up. That is why I prefer to use the > number of non-zeros as the processor scaling function but number of > equations is easier ... > > The PC setup times are large (I see 48 seconds at 16K bu you report 16). > -pc_gamg_square_graph 10 should help that. > > The max number of flops per processor in MatMult goes up by 50% and the > max time goes up by 10x and the number of iterations goes up by 13/8. If I > put all of this together I get that 75% of the time at 16K is in > communication at 16K. I think that and the absolute time can be improved > some by optimizing parameters as I've suggested. > > Mark > > > > > > On Wed, Nov 7, 2018 at 11:03 AM "Alberto F. Mart?n" via petsc-users < > petsc-users at mcs.anl.gov> wrote: > >> Dear All, >> >> we are performing a weak scaling test of the PETSc (v3.9.0) GAMG >> preconditioner when applied to the linear system arising >> from the *conforming unfitted FE discretization *(using Q1 Lagrangian >> FEs) of a 3D PDE Poisson problem, where >> the boundary of the domain (a popcorn flake) is described as a >> zero-level-set embedded within a uniform background >> (Cartesian-like) hexahedral mesh. 
Details underlying the FEM formulation >> can be made available on demand if you >> believe that this might be helpful, but let me just point out that it is >> designed such that it addresses the well-known >> ill-conditioning issues of unfitted FE discretizations due to the small >> cut cell problem. >> >> The weak scaling test is set up as follows. We start from a single cube >> background mesh, and refine it uniformly several >> steps, until we have approximately either 10**3 (load1), 20**3 (load2), >> or 40**3 (load3) hexahedra/MPI task when >> distributing it over 4 MPI tasks. The benchmark is scaled such that the >> next larger scale problem to be tested is obtained >> by uniformly refining the mesh from the previous scale and running it on >> 8x times the number of MPI tasks that we used >> in the previous scale. As a result, we obtain three weak scaling curves >> for each of the three fixed loads per MPI task >> above, on the following total number of MPI tasks: 4, 32, 262, 2097, >> 16777. The underlying mesh is not partitioned among >> MPI tasks using ParMETIS (unstructured multilevel graph partitioning) >> nor optimally by hand, but following the so-called >> z-shape space-filling curves provided by an underlying octree-like mesh >> handler (i.e., p4est library). >> >> I configured the preconditioned linear solver as follows: >> >> -ksp_type cg >> -ksp_monitor >> -ksp_rtol 1.0e-6 >> -ksp_converged_reason >> -ksp_max_it 500 >> -ksp_norm_type unpreconditioned >> -ksp_view >> -log_view >> >> -pc_type gamg >> -pc_gamg_type agg >> -mg_levels_esteig_ksp_type cg >> -mg_coarse_sub_pc_type cholesky >> -mg_coarse_sub_pc_factor_mat_ordering_type nd >> -pc_gamg_process_eq_limit 50 >> -pc_gamg_square_graph 0 >> -pc_gamg_agg_nsmooths 1 >> >> Raw timings (in seconds) of the preconditioner set up and PCG iterative >> solution stage, and number of iterations are as follows: >> >> **preconditioner set up** >> (load1): [0.02542160451, 0.05169247743, 0.09266782179, 0.2426272957, >> 13.64161944] >> (load2): [0.1239175797 , 0.1885528499 , 0.2719282564 , 0.4783878336, >> 13.37947339] >> (load3): [0.6565349903 , 0.9435049873 , 1.299908397 , 1.916243652 , >> 16.02904088] >> >> **PCG stage** >> (load1): [0.003287350759, 0.008163803257, 0.03565631993, 0.08343045413, >> 0.6937994603] >> (load2): [0.0205939794 , 0.03594723623 , 0.07593298424, 0.1212046621 >> , 0.6780373845] >> (load3): [0.1310882876 , 0.3214917686 , 0.5532023879 , >> 0.766881627 , 1.485446003] >> >> **number of PCG iterations** >> (load1): [5, 8, 11, 13, 13] >> (load2): [7, 10, 12, 13, 13] >> (load3): [8, 10, 12, 13, 13] >> >> It can be observed that both the number of linear solver iterations and >> the PCG stage timings (weakly) >> scale remarkably, but t*here is a significant time increase when scaling >> the problem from 2097 to 16777 MPI tasks * >> *for the preconditioner setup stage* (e.g., 1.916243652 vs 16.02904088 >> sec. with 40**3 cells per MPI task). >> I gathered the combined output of -ksp_view and -log_view (only) for all >> the points involving the load3 weak scaling >> test (find them attached to this message). Please note that within each >> run, I execute the these two stages up-to >> three times, and this influences absolute timings given in -log_view. 
>> >> Looking at the output of -log_view, it is very strange to me, e.g., that >> the stage labelled as "Graph" >> does not scale properly as it is just a call to MatDuplicate if the block >> size of the matrix is 1 (our case), and >> I guess that it is just a local operation that does not require any >> communication. >> What I am missing here? The load does not seem to be unbalanced looking >> at the "Ratio" column. >> >> I wonder whether the observed behaviour is as expected, or this a >> miss-configuration of the solver from our side. >> I played (quite a lot) with several parameter-value combinations, and the >> configuration above is the one that led to fastest >> execution (from the ones tested, that might be incomplete, I can also >> provide further feedback if helpful). >> Any feedback that we can get from your experience in order to find the >> cause(s) of this issue and a mitigating solution >> will be of high added value. >> >> Thanks very much in advance! >> Best regards, >> Alberto. >> >> -- >> Alberto F. Mart?n-Huertas >> Senior Researcher, PhD. Computational Science >> Centre Internacional de M?todes Num?rics a l'Enginyeria (CIMNE) >> Parc Mediterrani de la Tecnologia, UPC >> Esteve Terradas 5, Building C3, Office 215, >> 08860 Castelldefels (Barcelona, Spain) >> Tel.: (+34) 9341 34223e-mail:amartin at cimne.upc.edu >> >> FEMPAR project co-founder >> web: http://www.fempar.org >> >> ________________ >> IMPORTANT NOTICE >> All personal data contained on this mail will be processed confidentially and registered in a file property of CIMNE in >> order to manage corporate communications. You may exercise the rights of access, rectification, erasure and object by >> letter sent to Ed. C1 Campus Norte UPC. Gran Capit?n s/n Barcelona. >> >> > -- > Alberto F. Mart?n-Huertas > Senior Researcher, PhD. Computational Science > Centre Internacional de M?todes Num?rics a l'Enginyeria (CIMNE) > Parc Mediterrani de la Tecnologia, UPC > Esteve Terradas 5, Building C3, Office 215, > 08860 Castelldefels (Barcelona, Spain) > Tel.: (+34) 9341 34223e-mail:amartin at cimne.upc.edu > > FEMPAR project co-founder > web: http://www.fempar.org > > ________________ > IMPORTANT NOTICE > All personal data contained on this mail will be processed confidentially and registered in a file property of CIMNE in > order to manage corporate communications. You may exercise the rights of access, rectification, erasure and object by > letter sent to Ed. C1 Campus Norte UPC. Gran Capit?n s/n Barcelona. > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From amartin at cimne.upc.edu Thu Nov 8 06:08:09 2018 From: amartin at cimne.upc.edu (=?UTF-8?B?IkFsYmVydG8gRi4gTWFydMOtbiI=?=) Date: Thu, 08 Nov 2018 13:08:09 +0100 Subject: [petsc-users] PETSc (3.9.0) GAMG weak scaling test issue In-Reply-To: References: <5BE30C87.10204@cimne.upc.edu> <5BE420E4.8020800@cimne.upc.edu> Message-ID: <5BE42729.3090307@cimne.upc.edu> On 08/11/18 13:01, Matthew Knepley wrote: > On Thu, Nov 8, 2018 at 6:41 AM "Alberto F. Mart?n" via petsc-users > > wrote: > > Dear Mark, > > thanks for your quick and comprehensive reply. > > Before moving to the results of the experiments that u suggested, > let me clarify two points > on my original e-mail and your answer: > > (1) The raw timings and #iters. 
provided in my first e-mail were actually obtained with "-pc_gamg_square_graph 1" (and not 0); sorry about that, my mistake
> > (the logs, though, were consistent with the solver configuration provided).
> > The raw figures with "-pc_gamg_square_graph 0" are actually as follows:
> >
> > (load3): [0.25074561, 0.3650926566, 0.6251466936, 0.8709517661, 15.52180776]
> > (load3): [0.148803731, 0.325266364, 0.5538515123, 0.7537377281, 1.475100923]
> > (load3): [8, 9, 11, 12, 12]
> >
> > Bottom line: significant improvement of the absolute times for the first 4x problems, marginal improvement for
> > the largest problem (compared to "-pc_gamg_square_graph 1").
> >
> > (2) <<The PC setup times are large (I see 48 seconds at 16K but you report 16).
> > -pc_gamg_square_graph 10 should help that.>>
> >
> > This disagreement is justified by the following note on my original e-mail:
> >
> > <<Please note that within each run, I execute these two stages up to
> > three times, and this influences the absolute timings given in -log_view.>>
> >
> > I tried new configurations based on your suggestions. Find attached the results
> > (legends indicate changes with respect to the solver configuration provided in my first e-mail).
> >
> > Bottom lines: (1) the configuration provided in my original e-mail leads to the fastest execution
> > and the lowest number of iterations for the first 4x problems. (2) The (new) parameter-value combinations
> > suggested seem to have almost no impact on the preconditioner set-up time of the last problem.
>
> Mark, could this bad setup just be non-scalability in ParMetis? How do we see the ParMetis time?
>
> Thanks,
>
> Matt

Dear Matt,

I did not configure PETSc with ParMetis support. Should I?

I figured it out when I tried to use "-pc_gamg_repartition". PETSc complained that it was not compiled with ParMetis support.

Thanks!
Best regards,
Alberto.

> > I also tried HYPRE-BoomerAMG as suggested, with two different configurations.
> > *** SYMMETRIC CONFIGURATION *** > -ksp_type cg > -ksp_monitor > -ksp_rtol 1.0e-6 > -ksp_converged_reason > -ksp_max_it 500 > -ksp_norm_type unpreconditioned > -ksp_view > -log_view > > -pc_type hypre > -pc_hypre_type boomeramg > -pc_hypre_boomeramg_print_statistics 1 > -pc_hypre_boomeramg_strong_threshold 0.25 > -pc_hypre_boomeramg_coarsen_type HMIS > -pc_hypre_boomeramg_relax_type_down symmetric-SOR/Jacobi > -pc_hypre_boomeramg_relax_type_up symmetric-SOR/Jacobi > -pc_hypre_boomeramg_relax_type_coarse Gaussian-elimination > > *** UNSYMMETRIC CONFIGURATION *** > -ksp_type gmres > -ksp_gmres_restart 500 > -ksp_monitor > -ksp_rtol 1.0e-6 > -ksp_converged_reason > -ksp_max_it 500 > -ksp_pc_side right > -ksp_norm_type unpreconditioned > > -pc_type hypre > -pc_hypre_type boomeramg > -pc_hypre_boomeramg_print_statistics 1 > -pc_hypre_boomeramg_strong_threshold 0.25 > -pc_hypre_boomeramg_coarsen_type HMIS > -pc_hypre_boomeramg_relax_type_down SOR/Jacobi > -pc_hypre_boomeramg_relax_type_up SOR/Jacobi > -pc_hypre_boomeramg_relax_type_coarse Gaussian-elimination > > The raw results were: > > *** SYMMETRIC CONFIGURATION *** > > (load3): [0.1828534687, 0.3055133289, 0.3582984209, 0.4280304033, > 1.343549139] > (load3): [0.2102472978, 0.4572948301, 0.7153297188, 0.9989531627, > N/A] > (load3): [19, 23, 26, 28, 'DIVERGED_INDEFINITE_PC'] > > *** UNSYMMETRIC CONFIGURATION *** > > (load3): [0.1841227429, 0.3082743008, 0.3652294828, 0.4654760892, > 1.331299786] > (load3): [0.1194557019, 0.2830136018, 0.5046830242, 1.363314636, N/A] > (load3): [15, 19, 24, 48, DIVERGED_ITS] > > Thus, the largest problem also seems to cause (even more severe) > issues to HYPRE, in particular, > INDEFINITE PRECONDITIONER with CG, and not convergence within 500 > iterations for GMRES. > The preconditioner set up stage time, though, scales reasonably > well with the same data distribution > that we used to feed GAMG (although the preconditioner computed > for the largest problem seems to be > totally useless). > > I have logs for all these results if required. > > Thanks for your help! > Best regards, > Alberto. > > > > On 07/11/18 19:46, Mark Adams wrote: >> First I would add -gamg_est_ksp_type cg >> >> You seem to be converging well so I assume you are setting the >> null space for GAMG. >> >> Note, you should test hypre also. >> >> You probably want a bigger "-pc_gamg_process_eq_limit 50". 200 at >> least but you test your machine with a range on the largest >> problem. This is a parameter for reducing the number of active >> processors (on coarse grids). >> >> I would only worry about "load3". This has 16K equations per >> process, which is where you start noticing "strong scaling" >> problems, depending on the machine. >> >> An important parameter is "-pc_gamg_square_graph 0". I would >> probably start with infinity (eg, 10). >> >> Now, I'm not sure about your domain, problem sizes, and thus the >> weak scaling design. You seem to be scaling on the background >> mesh, but that may not be a good proxy for complexity. >> >> You can look at the number of flops and scale it appropriately by >> the number of solver iterations to get a relative size of the >> problem. I would recommend scaling the number of processors with >> this. 
For instance here the MatMult line for the 4 proc and 16K proc run:

>> ------------------------------------------------------------------------------------------------------------------------
>> Event      Count      Time (sec)      Flop                               --- Global ---  --- Stage ---   Total
>>            Max Ratio  Max     Ratio   Max  Ratio  Mess   Avg len  Reduct  %T %F %M %L %R  %T %F %M %L %R  Mflop/s
>> ------------------------------------------------------------------------------------------------------------------------
>> MatMult   636 1.0  1.9035e-01 1.0     3.12e+08 1.1  7.6e+03  3.0e+03  0.0e+00  0 47 62 44  0   0 47 62 44  0     6275  [2 procs]
>> MatMult  1416 1.0  1.9601e+00 2744.6  4.82e+08 0.0  4.3e+08  7.2e+02  0.0e+00  0 48 50 48  0   0 48 50 48  0  2757975  [16K procs]

>> Now, you have empty processors. See the massive load imbalance on time
>> and the zero on Flops. The "Ratio" is max/min and clearly min=0, so PETSc
>> reports a ratio of 0 (it is infinity really).
>>
>> Also, weak scaling on a thin body (I don't know your domain) is a little
>> funny because as the problem scales up the mesh becomes more 3D and this
>> causes the cost per equation to go up. That is why I prefer to use the
>> number of non-zeros as the processor scaling function, but number of
>> equations is easier ...
>>
>> The PC setup times are large (I see 48 seconds at 16K but you report 16).
>> -pc_gamg_square_graph 10 should help that.
>>
>> The max number of flops per processor in MatMult goes up by 50%, the max
>> time goes up by 10x, and the number of iterations goes up by 13/8. If I
>> put all of this together I get that 75% of the time at 16K is in
>> communication. I think that and the absolute time can be improved some by
>> optimizing parameters as I've suggested.
>>
>> Mark
>>
>> On Wed, Nov 7, 2018 at 11:03 AM "Alberto F. Martín" via petsc-users wrote:
>>
>> Dear All,
>>
>> we are performing a weak scaling test of the PETSc (v3.9.0) GAMG
>> preconditioner when applied to the linear system arising from the
>> conforming unfitted FE discretization (using Q1 Lagrangian FEs) of a
>> 3D PDE Poisson problem, where the boundary of the domain (a popcorn
>> flake) is described as a zero-level-set embedded within a uniform
>> background (Cartesian-like) hexahedral mesh. Details underlying the FEM
>> formulation can be made available on demand if you believe that this
>> might be helpful, but let me just point out that it is designed such
>> that it addresses the well-known ill-conditioning issues of unfitted FE
>> discretizations due to the small cut cell problem.
>>
>> The weak scaling test is set up as follows. We start from a single cube
>> background mesh, and refine it uniformly several steps, until we have
>> approximately either 10**3 (load1), 20**3 (load2), or 40**3 (load3)
>> hexahedra/MPI task when distributing it over 4 MPI tasks. The benchmark
>> is scaled such that the next larger scale problem to be tested is
>> obtained by uniformly refining the mesh from the previous scale and
>> running it on 8x the number of MPI tasks that we used in the previous
>> scale. As a result, we obtain three weak scaling curves for each of the
>> three fixed loads per MPI task above, on the following total number of
>> MPI tasks: 4, 32, 262, 2097, 16777.
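(For a sense of scale, taking the stated loads at face value: the largest
load3 case works out to roughly 40**3 = 64,000 cells per task x 16,777
tasks, i.e. about 1.1e9 background cells, and the task counts follow the
stated 8x growth per refinement step up to rounding.)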
The underlying mesh is not partitioned among >> MPI tasks using ParMETIS (unstructured multilevel graph >> partitioning) nor optimally by hand, but following the >> so-called >> z-shape space-filling curves provided by an underlying >> octree-like mesh handler (i.e., p4est library). >> >> I configured the preconditioned linear solver as follows: >> >> -ksp_type cg >> -ksp_monitor >> -ksp_rtol 1.0e-6 >> -ksp_converged_reason >> -ksp_max_it 500 >> -ksp_norm_type unpreconditioned >> -ksp_view >> -log_view >> >> -pc_type gamg >> -pc_gamg_type agg >> -mg_levels_esteig_ksp_type cg >> -mg_coarse_sub_pc_type cholesky >> -mg_coarse_sub_pc_factor_mat_ordering_type nd >> -pc_gamg_process_eq_limit 50 >> -pc_gamg_square_graph 0 >> -pc_gamg_agg_nsmooths 1 >> >> Raw timings (in seconds) of the preconditioner set up and PCG >> iterative solution stage, and number of iterations are as >> follows: >> >> **preconditioner set up** >> (load1): [0.02542160451, 0.05169247743, 0.09266782179, >> 0.2426272957, 13.64161944] >> (load2): [0.1239175797 , 0.1885528499 , 0.2719282564 , >> 0.4783878336, 13.37947339] >> (load3): [0.6565349903 , 0.9435049873 , 1.299908397 , >> 1.916243652 , 16.02904088] >> >> **PCG stage** >> (load1): [0.003287350759, 0.008163803257, 0.03565631993, >> 0.08343045413, 0.6937994603] >> (load2): [0.0205939794 , 0.03594723623 , 0.07593298424, >> 0.1212046621 , 0.6780373845] >> (load3): [0.1310882876 , 0.3214917686 , 0.5532023879 , >> 0.766881627 , 1.485446003] >> >> **number of PCG iterations** >> (load1): [5, 8, 11, 13, 13] >> (load2): [7, 10, 12, 13, 13] >> (load3): [8, 10, 12, 13, 13] >> >> It can be observed that both the number of linear solver >> iterations and the PCG stage timings (weakly) >> scale remarkably, but t*here is a significant time increase >> when scaling the problem from 2097 to 16777 MPI tasks ** >> **for the preconditioner setup stage* (e.g., 1.916243652 vs >> 16.02904088 sec. with 40**3 cells per MPI task). >> I gathered the combined output of -ksp_view and -log_view >> (only) for all the points involving the load3 weak scaling >> test (find them attached to this message). Please note that >> within each run, I execute the these two stages up-to >> three times, and this influences absolute timings given in >> -log_view. >> >> Looking at the output of -log_view, it is very strange to me, >> e.g., that the stage labelled as "Graph" >> does not scale properly as it is just a call to MatDuplicate >> if the block size of the matrix is 1 (our case), and >> I guess that it is just a local operation that does not >> require any communication. >> What I am missing here? The load does not seem to be >> unbalanced looking at the "Ratio" column. >> >> I wonder whether the observed behaviour is as expected, or >> this a miss-configuration of the solver from our side. >> I played (quite a lot) with several parameter-value >> combinations, and the configuration above is the one that led >> to fastest >> execution (from the ones tested, that might be incomplete, I >> can also provide further feedback if helpful). >> Any feedback that we can get from your experience in order to >> find the cause(s) of this issue and a mitigating solution >> will be of high added value. >> >> Thanks very much in advance! >> Best regards, >> Alberto. >> >> -- >> Alberto F. Mart?n-Huertas >> Senior Researcher, PhD. 
Computational Science
>> Centre Internacional de Mètodes Numèrics a l'Enginyeria (CIMNE)
>> Parc Mediterrani de la Tecnologia, UPC
>> Esteve Terradas 5, Building C3, Office 215,
>> 08860 Castelldefels (Barcelona, Spain)
>> Tel.: (+34) 9341 34223
>> e-mail: amartin at cimne.upc.edu
>>
>> FEMPAR project co-founder
>> web: http://www.fempar.org
>>
> --
> Alberto F. Martín-Huertas
> Senior Researcher, PhD. Computational Science
> Centre Internacional de Mètodes Numèrics a l'Enginyeria (CIMNE)
> Parc Mediterrani de la Tecnologia, UPC
> Esteve Terradas 5, Building C3, Office 215,
> 08860 Castelldefels (Barcelona, Spain)
> Tel.: (+34) 9341 34223
> e-mail: amartin at cimne.upc.edu
>
> FEMPAR project co-founder
> web: http://www.fempar.org
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which
> their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/

--
Alberto F. Martín-Huertas
Senior Researcher, PhD. Computational Science
Centre Internacional de Mètodes Numèrics a l'Enginyeria (CIMNE)
Parc Mediterrani de la Tecnologia, UPC
Esteve Terradas 5, Building C3, Office 215,
08860 Castelldefels (Barcelona, Spain)
Tel.: (+34) 9341 34223
e-mail: amartin at cimne.upc.edu

FEMPAR project co-founder
web: http://www.fempar.org

From yjwu16 at gmail.com Thu Nov 8 09:06:29 2018
From: yjwu16 at gmail.com (Yingjie Wu)
Date: Thu, 8 Nov 2018 23:06:29 +0800
Subject: [petsc-users] How to fix max linear steps in SNES
Message-ID:

Dear PETSc developers:
Hi,
I recently debugged my program, which solves a two-dimensional nonlinear
PDE problem with SNES, and I find that the residual drop in KSP is slow. I
want to fix the number of iterations in the linear solve, because I cannot
choose a suitable ksp_rtol. I use the command -ksp_max_it 200, wanting to
fix the number of iterations per KSP solve, but the program seems to stop
in the first nonlinear step:

Linear solve did not converge due to DIVERGED_ITS iterations 200
Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations 0

How can I fix the maximum number of iterations of the linear solver
without the program stopping before the nonlinear iteration has converged?

Thanks,
Yingjie
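A petscrc-style sketch of the option set this question boils down to (the
failure cap mirrors the -snes_max_linear_solve_fail suggestion Barry gives
further down this digest; the exact value is an arbitrary placeholder, not
a recommendation):

    # cap each inner linear solve at 200 iterations
    -ksp_max_it 200
    # let SNES tolerate that many DIVERGED_ITS linear solves instead of aborting
    -snes_max_linear_solve_fail 1000000
    # print the stopping reason of every linear solve while tuning
    -ksp_converged_reason
    -snes_monitor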
From mfadams at lbl.gov Thu Nov 8 10:29:33 2018
From: mfadams at lbl.gov (Mark Adams)
Date: Thu, 8 Nov 2018 11:29:33 -0500
Subject: [petsc-users] PETSc (3.9.0) GAMG weak scaling test issue
In-Reply-To: <5BE42729.3090307@cimne.upc.edu>
References: <5BE30C87.10204@cimne.upc.edu> <5BE420E4.8020800@cimne.upc.edu> <5BE42729.3090307@cimne.upc.edu>
Message-ID:

> I did not configure PETSc with ParMetis support. Should I?
>
> I figured it out when I tried to use "-pc_gamg_repartition". PETSc
> complained that it was not compiled with ParMetis support.

You need ParMetis, or some parallel mesh partitioner, configured to use
repartitioning. I would guess that "-pc_gamg_repartition" would not help
and might hurt, because it just does the coarse grids, not the fine grid.
But it is worth a try. Just configure with --download-parmetis

The problem is that you are using space filling curves on the background
grid and are getting empty processors. Right? The mesh setup phase is not
super optimized, but your times

And you said in your attachment that you added the near null space, but
just the constant vector. I trust you mean the three translational rigid
body modes. That is the default and so you should not see any difference.
If you added one vector of all 1s then that would be bad. You also want
the rotational rigid body modes. Now, you are converging pretty well and
if your solution does not have much rotation in it the rotational modes
are not needed, but they are required for optimality in general.

From mfadams at lbl.gov Thu Nov 8 10:32:04 2018
From: mfadams at lbl.gov (Mark Adams)
Date: Thu, 8 Nov 2018 11:32:04 -0500
Subject: [petsc-users] [petsc-maint] Correct use of PCFactorSetMatOrderingType
In-Reply-To:
References:
Message-ID:

To repeat:

You seem to be saying that OpenFOAM solves the problem in 10 seconds and
PETSc solves it in 14 seconds. Is that correct?

On Thu, Nov 8, 2018 at 3:42 AM Edoardo alinovi via petsc-users wrote:

> Hello Mark,
>
> Yes, there are 5 KSP calls within a time-step (3 for the solution of the
> momentum equation + 2 for the solution of pressure); this is the classical
> non-iterative PISO by Issa (the exact sequence of operations is: solve
> momentum implicitly, solve pressure-correction, momentum explicitly,
> pressure correction). The pressure correction equation, which is something
> similar to a Poisson equation for incompressible flows, is the one that
> determines the overall performance in my code, as in the others.
> Usually, when the pressure is being solved for the second time, the
> solution is faster since there is a better input guess and, as in my case,
> the preconditioner is not recomputed again.
>
> Have you got any advice for the multigrid configuration in this
> scenario, other than the defaults, in order to increase performance?
>
> I do not know if this may impact the performance drastically, but I am
> running on an E4 workstation with 16 Intel Xeon processors (2.3GHz/12MB
> cache) and 128GB of RAM.
>
> Thank you very much for your helpful comments,
>
> Edoardo
> ------
>
> Edoardo Alinovi, Ph.D.
>
> DICCA, Scuola Politecnica
> Universita' di Genova
> 1, via Montallegro
> 16145 Genova, Italy
>
> email: edoardo.alinovi at dicca.unige.it
> Tel: +39 010 353 2540
>
> Il giorno mer 7 nov 2018 alle ore 17:59 Mark Adams ha scritto:
>
>> please respond to petsc-users.
>>
>> You are doing 5 solves here in 14 seconds.
You seem to be saying that the >> two pressure solves are taking all of this time. I don't know why the two >> solves are different. >> >> You seem to be saying that OpenFOAM solves the problem in 10 seconds and >> PETSc solves it in 14 seconds. Is that correct? Hypre seems to be running >> fine. >> >> >> >> On Wed, Nov 7, 2018 at 11:24 AM Edoardo alinovi < >> edoardo.alinovi at gmail.com> wrote: >> >>> Thanks a lot Mark for your kind replay. The solver is mine and I use >>> PETSc for the solution of momentum and pressure. The first is solved very >>> fast by a standard bcgs + bjacobi, but the pressure is the source of all >>> evils and, unfortunately, I am pretty sure that almost all the time within >>> the time-step is needed by KSP to solve the pressure (see log attached). I >>> have verified this also putting a couple of mpi_wtime around the kspsolve >>> call. The pressure is solved 2 times (1 prediction + 1 correction), the >>> prediction takes around 11s , the correction around 4s (here I am avoiding >>> to recompute the preconditioner), all the rest of the code (flux assembling >>> + mometum solution + others) around 1s. Openfoam does the same procedure >>> with the same tolerance in 10s using its gamg version (50 it to converge). >>> The number of iteration required to solve the pressure with hypre are 12. >>> Gamg performs similarly to hypre in terms of speed, but with 50 iterations >>> to converge. Am I missing something in the setup in your opinion? >>> >>> thanks a lot, >>> >>> Edo >>> >>> ------ >>> >>> Edoardo Alinovi, Ph.D. >>> >>> DICCA, Scuola Politecnica >>> Universita' di Genova >>> 1, via Montallegro >>> 16145 Genova, Italy >>> >>> email: edoardo.alinovi at dicca.unige.it >>> Tel: +39 010 353 2540 >>> >>> >>> >>> >>> Il giorno mer 7 nov 2018 alle ore 16:50 Mark Adams ha >>> scritto: >>> >>>> You can try -pc_type gamg, but hypre is a pretty good solver for the >>>> Laplacian. If hypre is just a little faster than LU on a 3D problem (that >>>> takes 10 seconds to solve) then AMG is not doing well. I would expect that >>>> AMG is taking a lot of iterations (eg, >> 10). You can check that with >>>> -ksp_monitor. >>>> >>>> The PISO algorithm is a multistage algorithm with a pressure correction >>>> in it. It also has a solve for the velocity, from what I can tell. Are you >>>> building PISO yourself and using PETSc just for the pressure correction? >>>> Are you sure the time is spent in this solver? You can use -log_view to see >>>> performance numbers and look for KSPSolve to see how much time is spent in >>>> the PETSc solver. >>>> >>>> Mark >>>> >>>> >>>> On Wed, Nov 7, 2018 at 10:26 AM Zhang, Hong via petsc-maint < >>>> petsc-maint at mcs.anl.gov> wrote: >>>> >>>>> Edoardo: >>>>> Forwarding your request to petsc-maint where you can get fast and >>>>> expert advise. I do not have suggestion for your application, but someone >>>>> in our team likely will make suggestion. >>>>> Hong >>>>> >>>>> Hello Hong, >>>>>> >>>>>> Well, using -sub_pc_type lu it super slow. I am desperately triying >>>>>> to enhance performaces of my code (CFD, finite volume, PISO alghoritm), in >>>>>> particular I have a strong bottleneck in the solution of pressure >>>>>> correction equation which takes almost the 90% of computational time. Using >>>>>> multigrid as preconditoner (hypre with default options) is slighlty >>>>>> better, but comparing the results against the multigrid used in openFOAM, >>>>>> my code is losing 10s/iteration which a huge amount of time. 
Now, since >>>>>> that all the time is employed by KSPSolve, I feel a bit powerless. Do you >>>>>> have any helpful advice? >>>>>> >>>>>> Thank you very much! >>>>>> ------ >>>>>> >>>>>> Edoardo Alinovi, Ph.D. >>>>>> >>>>>> DICCA, Scuola Politecnica >>>>>> Universita' di Genova >>>>>> 1, via Montallegro >>>>>> 16145 Genova, Italy >>>>>> >>>>>> email: edoardo.alinovi at dicca.unige.it >>>>>> Tel: +39 010 353 2540 >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> Il giorno mar 6 nov 2018 alle ore 17:15 Zhang, Hong < >>>>>> hzhang at mcs.anl.gov> ha scritto: >>>>>> >>>>>>> Edoardo: >>>>>>> Interesting. I thought it would not affect performance much. What >>>>>>> happens if you use -sub_pc_type lu'? >>>>>>> Hong >>>>>>> >>>>>>> Dear Hong and Matt, >>>>>>>> >>>>>>>> thank you for your kind replay. I have just tested your suggestions >>>>>>>> and applied " -sub_pc_type ilu -sub_pc_factor_mat_ordering_type nd/rcm" >>>>>>>> and, in both cases, I have found a deterioration of performances >>>>>>>> with respect to doing nothing (thus just putting default PCBJACOBI). Is it >>>>>>>> normal? However, I guess this is very problem dependent. >>>>>>>> ------ >>>>>>>> >>>>>>>> Edoardo Alinovi, Ph.D. >>>>>>>> >>>>>>>> DICCA, Scuola Politecnica >>>>>>>> Universita' di Genova >>>>>>>> 1, via Montallegro >>>>>>>> 16145 Genova, Italy >>>>>>>> >>>>>>>> email: edoardo.alinovi at dicca.unige.it >>>>>>>> Tel: +39 010 353 2540 >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> Il giorno mar 6 nov 2018 alle ore 16:04 Zhang, Hong < >>>>>>>> hzhang at mcs.anl.gov> ha scritto: >>>>>>>> >>>>>>>>> Edoardo: >>>>>>>>> You can test runtime option '-sub_pc_factor_mat_ordering_type' and >>>>>>>>> use '-log_view' to get performance on different orderings, >>>>>>>>> e.g.,petsc/src/ksp/ksp/examples/tutorials/ex2.c: >>>>>>>>> mpiexec -n 2 ./ex2 -ksp_view -sub_pc_type ilu >>>>>>>>> -sub_pc_factor_mat_ordering_type nd >>>>>>>>> >>>>>>>>> I do not think the ordering inside block for ilu would affect >>>>>>>>> performance much. Let us know what you will get. >>>>>>>>> Hong >>>>>>>>> >>>>>>>>> Dear users, >>>>>>>>>> >>>>>>>>>> I have a question about the correct use of the option >>>>>>>>>> "PCFactorSetMatOrderingType" in PETSc. >>>>>>>>>> >>>>>>>>>> I am solving a problem with 2.5M of unknowns distributed along 16 >>>>>>>>>> processors and I am using the block jacobi preconditioner and MPIAIJ >>>>>>>>>> matrix format. I cannot figure out if the above option can be useful or not >>>>>>>>>> in decreasing the computational time. Any suggestion or tips? >>>>>>>>>> >>>>>>>>>> Thank you very much for the kind help >>>>>>>>>> >>>>>>>>>> ------ >>>>>>>>>> >>>>>>>>>> Edoardo Alinovi, Ph.D. >>>>>>>>>> >>>>>>>>>> DICCA, Scuola Politecnica >>>>>>>>>> Universita' di Genova >>>>>>>>>> 1, via Montallegro >>>>>>>>>> 16145 Genova, Italy >>>>>>>>>> >>>>>>>>>> email: edoardo.alinovi at dicca.unige.it >>>>>>>>>> Tel: +39 010 353 2540 >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> -------------- next part -------------- An HTML attachment was scrubbed... URL: From tempohoper at gmail.com Thu Nov 8 10:41:58 2018 From: tempohoper at gmail.com (Sal Am) Date: Thu, 8 Nov 2018 16:41:58 +0000 Subject: [petsc-users] Vec, Mat and binaryfiles. In-Reply-To: References: <87r2fzpj0m.fsf@jedbrown.org> <87muqkna5e.fsf@jedbrown.org> Message-ID: Yes I was just hoping there'd be more than that. 
I have tried using one of them as basis: import PetscBinaryIO import numpy as np import scipy.sparse b_vector = np.array(np.fromfile('Vector_b.bin' ,dtype=np.dtype((np.float64,2)))); A_matrix = np.array(np.fromfile('Matrix_A_cmplx.bin',dtype=np.dtype((np.float64,2)))); A_indexs = np.array(np.fromfile('Matrix_A_int.bin' ,dtype=np.dtype((np.int32 ,2)))); b_vector = b_vector[:,0] + 1j*b_vector[:,1] A_matrix = A_matrix[:,0] + 1j*A_matrix[:,1] A_matrix = scipy.sparse.csr_matrix((A_matrix, (A_indexs[:,0]-1, A_indexs[:,1]-1))) vec = b_vector.view(PetscBinaryIO.Vec) io = PetscBinaryIO.PetscBinaryIO() io.writeBinaryFile('Vector_b.dat', [vec,]) io.writeMatSciPy('Matrix_A.dat', A_matrix) I was able to do it convert and read the Vector_b in fine. So what I am having trouble with is actually converting the scipy.sparse.csr_matrix into the PETSc format. I thought the writeMatSciPy would do that by calling writeMatSparse? Maybe I have misunderstood something here? All the best, S On Thu, Nov 8, 2018 at 10:10 AM Matthew Knepley wrote: > On Thu, Nov 8, 2018 at 4:56 AM Sal Am via petsc-users < > petsc-users at mcs.anl.gov> wrote: > >> Thanks, I missed that, >> >> I do care about scalability, the long term plan is to make it fully >> compatible on a cluster so I guess I would need to convert it to PETSc >> binary format before reading it in. >> >> Is this the file you were referring to >> https://github.com/erdc/petsc-dev/blob/master/bin/pythonscripts/PetscBinaryIO.py >> >> Cannot seem to find docs on how to use it, do you happen to have any >> examples? >> > > The documentation is at the top of the file. You can see it by clicking > the link. > > Matt > > >> All the best, >> S >> >> >> >> >> On Wed, Nov 7, 2018 at 4:07 PM Jed Brown wrote: >> >>> Please always use "reply-all" so that your messages go to the list. >>> This is standard mailing list etiquette. It is important to preserve >>> threading for people who find this discussion later and so that we do >>> not waste our time re-answering the same questions that have already >>> been answered in private side-conversations. You'll likely get an >>> answer faster that way too. >>> >>> Sal Am writes: >>> >>> > Thank you Jed for the quick response! >>> > >>> >> >>> >> Yes, of course the formats would have to match. I would recommend >>> >> writing the files in an existing format such as PETSc's binary format. >>> >> >>> > >>> > Unfortunately I do not think I can change the source code to output >>> those >>> > two files in PETSc format and it would probably take a very long time >>> > converting everything into PETSc (it is not even my code). >>> >>> File-based workflows are very often bottlenecks. You can also use any >>> convenient software (e.g., Python or Matlab/Octave) to convert your >>> custom binary formats to PETSc binary format (see PetscBinaryIO provided >>> with PETSc), at which point you'll be able to read in parallel. If you >>> don't care about scalability or only need to read once, then you can >>> write code of the type you propose. >>> >>> > Do you have any other suggestions on how to read in those two complex >>> > binary files? Also why would it be more difficult to parallelise as >>> > I thought getting the two files in PETSc vector format would allow me >>> to >>> > use the rest of PETSc library the usual way? >>> > >>> > Kind regards, >>> > S >>> > >>> > >>> > On Mon, Nov 5, 2018 at 4:49 PM Jed Brown wrote: >>> > >>> >> Sal Am via petsc-users writes: >>> >> >>> >> > Hi, >>> >> > >>> >> > I am trying to solve a Ax=b complex system. 
the vector b and >>> "matrix" A >>> >> are >>> >> > both binary and NOT created by PETSc. So I keep getting error >>> messages >>> >> that >>> >> > they are not correct format when I read the files with >>> >> PetscViewBinaryOpen, >>> >> > after some digging it seems that one cannot just read a binary file >>> that >>> >> > was created by another software. >>> >> >>> >> Yes, of course the formats would have to match. I would recommend >>> >> writing the files in an existing format such as PETSc's binary format. >>> >> While the method you describe can be made to work, it will be more >>> work >>> >> to make it parallel. >>> >> >>> >> > How would I go on to solve this problem? >>> >> > >>> >> > More info and trials: >>> >> > >>> >> > "matrix" A consists of two files, one that contains row column index >>> >> > numbers and one that contains the non-zero values. So what I would >>> have >>> >> to >>> >> > do is multiply the last term in a+b with PETSC_i to get a real + >>> >> imaginary >>> >> > vector A. >>> >> > >>> >> > vector b is in binary, so what I have done so far (not sure if it >>> works) >>> >> is: >>> >> > >>> >> > std::ifstream input("Vector_b.bin", std::ios::binary ); >>> >> > while (input.read(reinterpret_cast(&v), sizeof(float))) >>> >> > ierr = >>> VecSetValues(u,1,&iglobal,&v,INSERT_VALUES);CHKERRQ(ierr); >>> >> > >>> >> > where v is a PetscScalar. >>> >> > >>> >> > Once I am able to read both matrices I think I can figure out the >>> solvers >>> >> > to solve the system. >>> >> > >>> >> > All the best, >>> >> > S >>> >> >>> >> > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From dave.mayhem23 at gmail.com Thu Nov 8 10:56:39 2018 From: dave.mayhem23 at gmail.com (Dave May) Date: Thu, 8 Nov 2018 17:56:39 +0100 Subject: [petsc-users] need help with vector interpolation on nonuniform DMDA grids In-Reply-To: References: <27356998-760A-431D-BB28-9FA41CA0ED40@fastwebnet.it> <669D57CD-9870-4A79-94C9-5258442FEF4E@fastwebnet.it> Message-ID: On Thu, 8 Nov 2018 at 09:43, Francesco Magaletti < francesco_magaletti at fastwebnet.it> wrote: > Dear Matt & Dave, > I really appreciate your support! > > I was not aware of the existence of DMSwarm object and it has been a > really exiting discovery, since in our research group we have people > working with kind of PIC methods and I suppose that they will be grateful > for your work Dave! If you try it out, I welcome feedback (good or bad). > > Concerning my problem, I will give your suggestions a try! I?ll write you > back if I miss some technical points with using the new features of DMSwarm. Good. I'd like to hear how it goes. Thanks, Dave > > Thank you again, > Francesco -------------- next part -------------- An HTML attachment was scrubbed... URL: From edoardo.alinovi at gmail.com Thu Nov 8 11:03:11 2018 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Thu, 8 Nov 2018 18:03:11 +0100 Subject: [petsc-users] [petsc-maint] Correct use of PCFactorSetMatOrderingType In-Reply-To: References: Message-ID: Yes, it is like you are saying. This is mostly due to the time employed by ksp to solve the pressure equation. However, I have worked a lot on the problem and I have found out that the default configuration is far to be the optimal one, at least in this case. 
Actually my cpu time is decreased by more than a factor of two with respect
to the default configuration. I put the changes here; maybe they can be
useful to other users in the future:

-pc_hypre_boomeramg_no_CF
-pc_hypre_boomeramg_agg_nl 1
-pc_hypre_boomeramg_coarsen_type HMIS
-pc_hypre_boomeramg_interp_type FF1

The last two seem to be essential in enhancing performance.

Do you know of other configurations that are worth testing? Also, I would
like to know if a list of command options for hypre is available somewhere.
Looking at hypre's doc there are a lot of options, but I do not know
exactly which ones are available in PETSc and under what names.

Thank you very much for the kind support, and sorry for the long email!

Il giorno gio 8 nov 2018, 17:32 Mark Adams ha scritto:

> To repeat:
>
> You seem to be saying that OpenFOAM solves the problem in 10 seconds and
> PETSc solves it in 14 seconds. Is that correct?
>
> On Thu, Nov 8, 2018 at 3:42 AM Edoardo alinovi via petsc-users wrote:
>
>> Hello Mark,
>>
>> Yes, there are 5 KSP calls within a time-step (3 for the solution of the
>> momentum equation + 2 for the solution of pressure); this is the classical
>> non-iterative PISO by Issa (the exact sequence of operations is: solve
>> momentum implicitly, solve pressure-correction, momentum explicitly,
>> pressure correction). The pressure correction equation, which is something
>> similar to a Poisson equation for incompressible flows, is the one that
>> determines the overall performance in my code, as in the others.
>> Usually, when the pressure is being solved for the second time, the
>> solution is faster since there is a better input guess and, as in my case,
>> the preconditioner is not recomputed again.
>>
>> Have you got any advice for the multigrid configuration in this
>> scenario, other than the defaults, in order to increase performance?
>>
>> I do not know if this may impact the performance drastically, but I am
>> running on an E4 workstation with 16 Intel Xeon processors (2.3GHz/12MB
>> cache) and 128GB of RAM.
>>
>> Thank you very much for your helpful comments,
>>
>> Edoardo
>> ------
>>
>> Edoardo Alinovi, Ph.D.
>>
>> DICCA, Scuola Politecnica
>> Universita' di Genova
>> 1, via Montallegro
>> 16145 Genova, Italy
>>
>> email: edoardo.alinovi at dicca.unige.it
>> Tel: +39 010 353 2540
>>
>> Il giorno mer 7 nov 2018 alle ore 17:59 Mark Adams ha scritto:
>>
>>> please respond to petsc-users.
>>>
>>> You are doing 5 solves here in 14 seconds. You seem to be saying that
>>> the two pressure solves are taking all of this time. I don't know why the
>>> two solves are different.
>>>
>>> You seem to be saying that OpenFOAM solves the problem in 10 seconds and
>>> PETSc solves it in 14 seconds. Is that correct? Hypre seems to be running
>>> fine.
>>>
>>> On Wed, Nov 7, 2018 at 11:24 AM Edoardo alinovi wrote:
>>>
>>>> Thanks a lot Mark for your kind reply. The solver is mine and I use
>>>> PETSc for the solution of momentum and pressure. The first is solved very
>>>> fast by a standard bcgs + bjacobi, but the pressure is the source of all
>>>> evils and, unfortunately, I am pretty sure that almost all the time within
>>>> the time-step is needed by KSP to solve the pressure (see log attached). I
>>>> have verified this also putting a couple of mpi_wtime around the kspsolve
>>>> call.
The pressure is solved 2 times (1 prediction + 1 correction), the >>>> prediction takes around 11s , the correction around 4s (here I am avoiding >>>> to recompute the preconditioner), all the rest of the code (flux assembling >>>> + mometum solution + others) around 1s. Openfoam does the same procedure >>>> with the same tolerance in 10s using its gamg version (50 it to converge). >>>> The number of iteration required to solve the pressure with hypre are 12. >>>> Gamg performs similarly to hypre in terms of speed, but with 50 iterations >>>> to converge. Am I missing something in the setup in your opinion? >>>> >>>> thanks a lot, >>>> >>>> Edo >>>> >>>> ------ >>>> >>>> Edoardo Alinovi, Ph.D. >>>> >>>> DICCA, Scuola Politecnica >>>> Universita' di Genova >>>> 1, via Montallegro >>>> 16145 Genova, Italy >>>> >>>> email: edoardo.alinovi at dicca.unige.it >>>> Tel: +39 010 353 2540 >>>> >>>> >>>> >>>> >>>> Il giorno mer 7 nov 2018 alle ore 16:50 Mark Adams >>>> ha scritto: >>>> >>>>> You can try -pc_type gamg, but hypre is a pretty good solver for the >>>>> Laplacian. If hypre is just a little faster than LU on a 3D problem (that >>>>> takes 10 seconds to solve) then AMG is not doing well. I would expect that >>>>> AMG is taking a lot of iterations (eg, >> 10). You can check that with >>>>> -ksp_monitor. >>>>> >>>>> The PISO algorithm is a multistage algorithm with a pressure >>>>> correction in it. It also has a solve for the velocity, from what I can >>>>> tell. Are you building PISO yourself and using PETSc just for the pressure >>>>> correction? Are you sure the time is spent in this solver? You can use >>>>> -log_view to see performance numbers and look for KSPSolve to see how much >>>>> time is spent in the PETSc solver. >>>>> >>>>> Mark >>>>> >>>>> >>>>> On Wed, Nov 7, 2018 at 10:26 AM Zhang, Hong via petsc-maint < >>>>> petsc-maint at mcs.anl.gov> wrote: >>>>> >>>>>> Edoardo: >>>>>> Forwarding your request to petsc-maint where you can get fast and >>>>>> expert advise. I do not have suggestion for your application, but someone >>>>>> in our team likely will make suggestion. >>>>>> Hong >>>>>> >>>>>> Hello Hong, >>>>>>> >>>>>>> Well, using -sub_pc_type lu it super slow. I am >>>>>>> desperately triying to enhance performaces of my code (CFD, finite volume, >>>>>>> PISO alghoritm), in particular I have a strong bottleneck in the solution >>>>>>> of pressure correction equation which takes almost the 90% of computational >>>>>>> time. Using multigrid as preconditoner (hypre with default options) is >>>>>>> slighlty better, but comparing the results against the multigrid used in >>>>>>> openFOAM, my code is losing 10s/iteration which a huge amount of time. Now, >>>>>>> since that all the time is employed by KSPSolve, I feel a bit powerless. >>>>>>> Do you have any helpful advice? >>>>>>> >>>>>>> Thank you very much! >>>>>>> ------ >>>>>>> >>>>>>> Edoardo Alinovi, Ph.D. >>>>>>> >>>>>>> DICCA, Scuola Politecnica >>>>>>> Universita' di Genova >>>>>>> 1, via Montallegro >>>>>>> 16145 Genova, Italy >>>>>>> >>>>>>> email: edoardo.alinovi at dicca.unige.it >>>>>>> Tel: +39 010 353 2540 >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> Il giorno mar 6 nov 2018 alle ore 17:15 Zhang, Hong < >>>>>>> hzhang at mcs.anl.gov> ha scritto: >>>>>>> >>>>>>>> Edoardo: >>>>>>>> Interesting. I thought it would not affect performance much. What >>>>>>>> happens if you use -sub_pc_type lu'? >>>>>>>> Hong >>>>>>>> >>>>>>>> Dear Hong and Matt, >>>>>>>>> >>>>>>>>> thank you for your kind replay. 
I have just tested your >>>>>>>>> suggestions and applied " -sub_pc_type ilu -sub_pc_factor_mat_ordering_type >>>>>>>>> nd/rcm" and, in both cases, I have found a deterioration of >>>>>>>>> performances with respect to doing nothing (thus just putting default >>>>>>>>> PCBJACOBI). Is it normal? However, I guess this is very problem dependent. >>>>>>>>> ------ >>>>>>>>> >>>>>>>>> Edoardo Alinovi, Ph.D. >>>>>>>>> >>>>>>>>> DICCA, Scuola Politecnica >>>>>>>>> Universita' di Genova >>>>>>>>> 1, via Montallegro >>>>>>>>> 16145 Genova, Italy >>>>>>>>> >>>>>>>>> email: edoardo.alinovi at dicca.unige.it >>>>>>>>> Tel: +39 010 353 2540 >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> Il giorno mar 6 nov 2018 alle ore 16:04 Zhang, Hong < >>>>>>>>> hzhang at mcs.anl.gov> ha scritto: >>>>>>>>> >>>>>>>>>> Edoardo: >>>>>>>>>> You can test runtime option '-sub_pc_factor_mat_ordering_type' >>>>>>>>>> and use '-log_view' to get performance on different orderings, >>>>>>>>>> e.g.,petsc/src/ksp/ksp/examples/tutorials/ex2.c: >>>>>>>>>> mpiexec -n 2 ./ex2 -ksp_view -sub_pc_type ilu >>>>>>>>>> -sub_pc_factor_mat_ordering_type nd >>>>>>>>>> >>>>>>>>>> I do not think the ordering inside block for ilu would affect >>>>>>>>>> performance much. Let us know what you will get. >>>>>>>>>> Hong >>>>>>>>>> >>>>>>>>>> Dear users, >>>>>>>>>>> >>>>>>>>>>> I have a question about the correct use of the option >>>>>>>>>>> "PCFactorSetMatOrderingType" in PETSc. >>>>>>>>>>> >>>>>>>>>>> I am solving a problem with 2.5M of unknowns distributed along >>>>>>>>>>> 16 processors and I am using the block jacobi preconditioner and MPIAIJ >>>>>>>>>>> matrix format. I cannot figure out if the above option can be useful or not >>>>>>>>>>> in decreasing the computational time. Any suggestion or tips? >>>>>>>>>>> >>>>>>>>>>> Thank you very much for the kind help >>>>>>>>>>> >>>>>>>>>>> ------ >>>>>>>>>>> >>>>>>>>>>> Edoardo Alinovi, Ph.D. >>>>>>>>>>> >>>>>>>>>>> DICCA, Scuola Politecnica >>>>>>>>>>> Universita' di Genova >>>>>>>>>>> 1, via Montallegro >>>>>>>>>>> 16145 Genova, Italy >>>>>>>>>>> >>>>>>>>>>> email: edoardo.alinovi at dicca.unige.it >>>>>>>>>>> Tel: +39 010 353 2540 >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> -------------- next part -------------- An HTML attachment was scrubbed... URL: From mfadams at lbl.gov Thu Nov 8 11:10:37 2018 From: mfadams at lbl.gov (Mark Adams) Date: Thu, 8 Nov 2018 12:10:37 -0500 Subject: [petsc-users] [petsc-maint] Correct use of PCFactorSetMatOrderingType In-Reply-To: References: Message-ID: I am not that familiar with hypre's options. AMG is complicated and I barely keep my options straight. OpenFOAM seems to have highly specialized solvers so being with 50% of them is decent. On Thu, Nov 8, 2018 at 12:03 PM Edoardo alinovi wrote: > Yes, it is like you are saying. This is mostly due to the time employed by > ksp to solve the pressure equation. However, I have worked a lot on the > problem and I have found out that the default configuration is far to be > the optimal one, at least in this case. > > Actually my cpu time is decreased by more than twice with respect to the > default configuration. I put here the changes, maybe they can be usefull to > other users in the future: > > -pc_hypre_boomeramg_no_CF > -pc_hypre_boomeramg_agg_nl 1 > pc_hypre_boomeramg_coarsen_type HMIS > -pc_hypre_boomeramg_interp_type FF1 > > The last two seem to be essential in enhancing performances. > > Do you know other configurations that worth to test? 
Also, I would like > to know if a list of command options for hypre is available somewhere. > Looking at hypre's doc there are a lot of options, but I do not exactly > know which one is available in petsc and their name. > > Thank you very much for the kind support and sorry for the long emal.! > > Il giorno gio 8 nov 2018, 17:32 Mark Adams ha scritto: > >> To repeat: >> >> You seem to be saying that OpenFOAM solves the problem in 10 seconds and >> PETSc solves it in 14 seconds. Is that correct? >> >> >> >> On Thu, Nov 8, 2018 at 3:42 AM Edoardo alinovi via petsc-users < >> petsc-users at mcs.anl.gov> wrote: >> >>> Hello Mark, >>> >>> Yes, there are 5 KSP calls within a time-step (3 for the solution of >>> momentum equation + 2 for the solution of pressure), this is the classical >>> non iterative PISO by Issa ( the exact sequence of operations is : solve >>> momentum implicitly, solve pressure-correction, momentum explicitly, >>> pressure correction). The pressure correction equation ,which is something >>> similar to a Poisson equation for incompressible flows, is the one that >>> determines the overall performance in my code such as in the others. >>> Usually, when the pressure is being solved for the second time, the >>> solution is faster since there is a better input guess and, as in my case, >>> the preconditioner is not recomputed again. >>> >>> Have you got some advices for the multigrid configuration in this >>> scenario, which are not the default one, in order to increase performances? >>> >>> I do not know if this may impact drastically the performance, but I am >>> running on a E4 workstation with 16 Intel's Xeon processors (2.3GH/12MB >>> cache) and 128GB of RAM . >>> >>> Thank you very much for your helpful comments, >>> >>> >>> Edoardo >>> ------ >>> >>> Edoardo Alinovi, Ph.D. >>> >>> DICCA, Scuola Politecnica >>> Universita' di Genova >>> 1, via Montallegro >>> 16145 Genova, Italy >>> >>> email: edoardo.alinovi at dicca.unige.it >>> Tel: +39 010 353 2540 >>> >>> >>> >>> >>> Il giorno mer 7 nov 2018 alle ore 17:59 Mark Adams ha >>> scritto: >>> >>>> please respond to petsc-users. >>>> >>>> You are doing 5 solves here in 14 seconds. You seem to be saying that >>>> the two pressure solves are taking all of this time. I don't know why the >>>> two solves are different. >>>> >>>> You seem to be saying that OpenFOAM solves the problem in 10 seconds >>>> and PETSc solves it in 14 seconds. Is that correct? Hypre seems to be >>>> running fine. >>>> >>>> >>>> >>>> On Wed, Nov 7, 2018 at 11:24 AM Edoardo alinovi < >>>> edoardo.alinovi at gmail.com> wrote: >>>> >>>>> Thanks a lot Mark for your kind replay. The solver is mine and I use >>>>> PETSc for the solution of momentum and pressure. The first is solved very >>>>> fast by a standard bcgs + bjacobi, but the pressure is the source of all >>>>> evils and, unfortunately, I am pretty sure that almost all the time within >>>>> the time-step is needed by KSP to solve the pressure (see log attached). I >>>>> have verified this also putting a couple of mpi_wtime around the kspsolve >>>>> call. The pressure is solved 2 times (1 prediction + 1 correction), the >>>>> prediction takes around 11s , the correction around 4s (here I am avoiding >>>>> to recompute the preconditioner), all the rest of the code (flux assembling >>>>> + mometum solution + others) around 1s. Openfoam does the same procedure >>>>> with the same tolerance in 10s using its gamg version (50 it to converge). 
>>>>> The number of iteration required to solve the pressure with hypre are 12. >>>>> Gamg performs similarly to hypre in terms of speed, but with 50 iterations >>>>> to converge. Am I missing something in the setup in your opinion? >>>>> >>>>> thanks a lot, >>>>> >>>>> Edo >>>>> >>>>> ------ >>>>> >>>>> Edoardo Alinovi, Ph.D. >>>>> >>>>> DICCA, Scuola Politecnica >>>>> Universita' di Genova >>>>> 1, via Montallegro >>>>> 16145 Genova, Italy >>>>> >>>>> email: edoardo.alinovi at dicca.unige.it >>>>> Tel: +39 010 353 2540 >>>>> >>>>> >>>>> >>>>> >>>>> Il giorno mer 7 nov 2018 alle ore 16:50 Mark Adams >>>>> ha scritto: >>>>> >>>>>> You can try -pc_type gamg, but hypre is a pretty good solver for the >>>>>> Laplacian. If hypre is just a little faster than LU on a 3D problem (that >>>>>> takes 10 seconds to solve) then AMG is not doing well. I would expect that >>>>>> AMG is taking a lot of iterations (eg, >> 10). You can check that with >>>>>> -ksp_monitor. >>>>>> >>>>>> The PISO algorithm is a multistage algorithm with a pressure >>>>>> correction in it. It also has a solve for the velocity, from what I can >>>>>> tell. Are you building PISO yourself and using PETSc just for the pressure >>>>>> correction? Are you sure the time is spent in this solver? You can use >>>>>> -log_view to see performance numbers and look for KSPSolve to see how much >>>>>> time is spent in the PETSc solver. >>>>>> >>>>>> Mark >>>>>> >>>>>> >>>>>> On Wed, Nov 7, 2018 at 10:26 AM Zhang, Hong via petsc-maint < >>>>>> petsc-maint at mcs.anl.gov> wrote: >>>>>> >>>>>>> Edoardo: >>>>>>> Forwarding your request to petsc-maint where you can get fast and >>>>>>> expert advise. I do not have suggestion for your application, but someone >>>>>>> in our team likely will make suggestion. >>>>>>> Hong >>>>>>> >>>>>>> Hello Hong, >>>>>>>> >>>>>>>> Well, using -sub_pc_type lu it super slow. I am >>>>>>>> desperately triying to enhance performaces of my code (CFD, finite volume, >>>>>>>> PISO alghoritm), in particular I have a strong bottleneck in the solution >>>>>>>> of pressure correction equation which takes almost the 90% of computational >>>>>>>> time. Using multigrid as preconditoner (hypre with default options) is >>>>>>>> slighlty better, but comparing the results against the multigrid used in >>>>>>>> openFOAM, my code is losing 10s/iteration which a huge amount of time. Now, >>>>>>>> since that all the time is employed by KSPSolve, I feel a bit powerless. >>>>>>>> Do you have any helpful advice? >>>>>>>> >>>>>>>> Thank you very much! >>>>>>>> ------ >>>>>>>> >>>>>>>> Edoardo Alinovi, Ph.D. >>>>>>>> >>>>>>>> DICCA, Scuola Politecnica >>>>>>>> Universita' di Genova >>>>>>>> 1, via Montallegro >>>>>>>> 16145 Genova, Italy >>>>>>>> >>>>>>>> email: edoardo.alinovi at dicca.unige.it >>>>>>>> Tel: +39 010 353 2540 >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> Il giorno mar 6 nov 2018 alle ore 17:15 Zhang, Hong < >>>>>>>> hzhang at mcs.anl.gov> ha scritto: >>>>>>>> >>>>>>>>> Edoardo: >>>>>>>>> Interesting. I thought it would not affect performance much. What >>>>>>>>> happens if you use -sub_pc_type lu'? >>>>>>>>> Hong >>>>>>>>> >>>>>>>>> Dear Hong and Matt, >>>>>>>>>> >>>>>>>>>> thank you for your kind replay. I have just tested your >>>>>>>>>> suggestions and applied " -sub_pc_type ilu -sub_pc_factor_mat_ordering_type >>>>>>>>>> nd/rcm" and, in both cases, I have found a deterioration of >>>>>>>>>> performances with respect to doing nothing (thus just putting default >>>>>>>>>> PCBJACOBI). Is it normal? 
However, I guess this is very problem dependent. >>>>>>>>>> ------ >>>>>>>>>> >>>>>>>>>> Edoardo Alinovi, Ph.D. >>>>>>>>>> >>>>>>>>>> DICCA, Scuola Politecnica >>>>>>>>>> Universita' di Genova >>>>>>>>>> 1, via Montallegro >>>>>>>>>> 16145 Genova, Italy >>>>>>>>>> >>>>>>>>>> email: edoardo.alinovi at dicca.unige.it >>>>>>>>>> Tel: +39 010 353 2540 >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> Il giorno mar 6 nov 2018 alle ore 16:04 Zhang, Hong < >>>>>>>>>> hzhang at mcs.anl.gov> ha scritto: >>>>>>>>>> >>>>>>>>>>> Edoardo: >>>>>>>>>>> You can test runtime option '-sub_pc_factor_mat_ordering_type' >>>>>>>>>>> and use '-log_view' to get performance on different orderings, >>>>>>>>>>> e.g.,petsc/src/ksp/ksp/examples/tutorials/ex2.c: >>>>>>>>>>> mpiexec -n 2 ./ex2 -ksp_view -sub_pc_type ilu >>>>>>>>>>> -sub_pc_factor_mat_ordering_type nd >>>>>>>>>>> >>>>>>>>>>> I do not think the ordering inside block for ilu would affect >>>>>>>>>>> performance much. Let us know what you will get. >>>>>>>>>>> Hong >>>>>>>>>>> >>>>>>>>>>> Dear users, >>>>>>>>>>>> >>>>>>>>>>>> I have a question about the correct use of the option >>>>>>>>>>>> "PCFactorSetMatOrderingType" in PETSc. >>>>>>>>>>>> >>>>>>>>>>>> I am solving a problem with 2.5M of unknowns distributed along >>>>>>>>>>>> 16 processors and I am using the block jacobi preconditioner and MPIAIJ >>>>>>>>>>>> matrix format. I cannot figure out if the above option can be useful or not >>>>>>>>>>>> in decreasing the computational time. Any suggestion or tips? >>>>>>>>>>>> >>>>>>>>>>>> Thank you very much for the kind help >>>>>>>>>>>> >>>>>>>>>>>> ------ >>>>>>>>>>>> >>>>>>>>>>>> Edoardo Alinovi, Ph.D. >>>>>>>>>>>> >>>>>>>>>>>> DICCA, Scuola Politecnica >>>>>>>>>>>> Universita' di Genova >>>>>>>>>>>> 1, via Montallegro >>>>>>>>>>>> 16145 Genova, Italy >>>>>>>>>>>> >>>>>>>>>>>> email: edoardo.alinovi at dicca.unige.it >>>>>>>>>>>> Tel: +39 010 353 2540 >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Thu Nov 8 11:14:26 2018 From: bsmith at mcs.anl.gov (Smith, Barry F.) Date: Thu, 8 Nov 2018 17:14:26 +0000 Subject: [petsc-users] How to fix max linear steps in SNES In-Reply-To: References: Message-ID: <700412EE-6786-4CA6-ACF5-15F78B17DC94@anl.gov> > On Nov 8, 2018, at 9:06 AM, Yingjie Wu via petsc-users wrote: > > Dear Petsc developer: > Hi, > I recently debugged my program, which is a two-dimensional nonlinear PDEs problem, and solved by SNES. I find that the residual drop in KSP is slow. I want to fix the number of steps in the linear step, because I can not choose a suitable ksp_rtol. I use the command: -ksp_max_it 200, and I want to fix the number of iterations per ksp. But the program seemed to stop in the first nonlinear step. > > Linear solve did not converge due to DIVERGED_ITS iterations 200 > Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations 0 > > How can I fix the maximum number of iterations of linear steps without stopping the program before the convergence of the non-linear steps? -snes_max_linear_solve_fail 10000000 (some large number) Barry > > Thanks, > Yingjie From jed at jedbrown.org Thu Nov 8 13:53:52 2018 From: jed at jedbrown.org (Jed Brown) Date: Thu, 08 Nov 2018 12:53:52 -0700 Subject: [petsc-users] Vec, Mat and binaryfiles. 
In-Reply-To: References: <87r2fzpj0m.fsf@jedbrown.org> <87muqkna5e.fsf@jedbrown.org> Message-ID: <87a7mjl50v.fsf@jedbrown.org> Sal Am writes: > Yes I was just hoping there'd be more than that. > > I have tried using one of them as basis: > > import PetscBinaryIO > import numpy as np > import scipy.sparse > > b_vector = np.array(np.fromfile('Vector_b.bin' > ,dtype=np.dtype((np.float64,2)))); > A_matrix = > np.array(np.fromfile('Matrix_A_cmplx.bin',dtype=np.dtype((np.float64,2)))); > A_indexs = np.array(np.fromfile('Matrix_A_int.bin' > ,dtype=np.dtype((np.int32 ,2)))); > > b_vector = b_vector[:,0] + 1j*b_vector[:,1] > A_matrix = A_matrix[:,0] + 1j*A_matrix[:,1] > > A_matrix = scipy.sparse.csr_matrix((A_matrix, (A_indexs[:,0]-1, > A_indexs[:,1]-1))) > > vec = b_vector.view(PetscBinaryIO.Vec) > > io = PetscBinaryIO.PetscBinaryIO() > io.writeBinaryFile('Vector_b.dat', [vec,]) > io.writeMatSciPy('Matrix_A.dat', A_matrix) The "fh" argument to writeMatSciPy is a file handle. You can use writeBinaryFile with matrices, in one file or separate files. > I was able to do it convert and read the Vector_b in fine. > So what I am having trouble with is actually converting the > scipy.sparse.csr_matrix into the PETSc format. I thought the writeMatSciPy > would do that by calling writeMatSparse? > Maybe I have misunderstood something here? > > All the best, > S > > On Thu, Nov 8, 2018 at 10:10 AM Matthew Knepley wrote: > >> On Thu, Nov 8, 2018 at 4:56 AM Sal Am via petsc-users < >> petsc-users at mcs.anl.gov> wrote: >> >>> Thanks, I missed that, >>> >>> I do care about scalability, the long term plan is to make it fully >>> compatible on a cluster so I guess I would need to convert it to PETSc >>> binary format before reading it in. >>> >>> Is this the file you were referring to >>> https://github.com/erdc/petsc-dev/blob/master/bin/pythonscripts/PetscBinaryIO.py >>> >>> Cannot seem to find docs on how to use it, do you happen to have any >>> examples? >>> >> >> The documentation is at the top of the file. You can see it by clicking >> the link. >> >> Matt >> >> >>> All the best, >>> S >>> >>> >>> >>> >>> On Wed, Nov 7, 2018 at 4:07 PM Jed Brown wrote: >>> >>>> Please always use "reply-all" so that your messages go to the list. >>>> This is standard mailing list etiquette. It is important to preserve >>>> threading for people who find this discussion later and so that we do >>>> not waste our time re-answering the same questions that have already >>>> been answered in private side-conversations. You'll likely get an >>>> answer faster that way too. >>>> >>>> Sal Am writes: >>>> >>>> > Thank you Jed for the quick response! >>>> > >>>> >> >>>> >> Yes, of course the formats would have to match. I would recommend >>>> >> writing the files in an existing format such as PETSc's binary format. >>>> >> >>>> > >>>> > Unfortunately I do not think I can change the source code to output >>>> those >>>> > two files in PETSc format and it would probably take a very long time >>>> > converting everything into PETSc (it is not even my code). >>>> >>>> File-based workflows are very often bottlenecks. You can also use any >>>> convenient software (e.g., Python or Matlab/Octave) to convert your >>>> custom binary formats to PETSc binary format (see PetscBinaryIO provided >>>> with PETSc), at which point you'll be able to read in parallel. If you >>>> don't care about scalability or only need to read once, then you can >>>> write code of the type you propose. 
>>>> >>>> > Do you have any other suggestions on how to read in those two complex >>>> > binary files? Also why would it be more difficult to parallelise as >>>> > I thought getting the two files in PETSc vector format would allow me >>>> to >>>> > use the rest of PETSc library the usual way? >>>> > >>>> > Kind regards, >>>> > S >>>> > >>>> > >>>> > On Mon, Nov 5, 2018 at 4:49 PM Jed Brown wrote: >>>> > >>>> >> Sal Am via petsc-users writes: >>>> >> >>>> >> > Hi, >>>> >> > >>>> >> > I am trying to solve a Ax=b complex system. the vector b and >>>> "matrix" A >>>> >> are >>>> >> > both binary and NOT created by PETSc. So I keep getting error >>>> messages >>>> >> that >>>> >> > they are not correct format when I read the files with >>>> >> PetscViewBinaryOpen, >>>> >> > after some digging it seems that one cannot just read a binary file >>>> that >>>> >> > was created by another software. >>>> >> >>>> >> Yes, of course the formats would have to match. I would recommend >>>> >> writing the files in an existing format such as PETSc's binary format. >>>> >> While the method you describe can be made to work, it will be more >>>> work >>>> >> to make it parallel. >>>> >> >>>> >> > How would I go on to solve this problem? >>>> >> > >>>> >> > More info and trials: >>>> >> > >>>> >> > "matrix" A consists of two files, one that contains row column index >>>> >> > numbers and one that contains the non-zero values. So what I would >>>> have >>>> >> to >>>> >> > do is multiply the last term in a+b with PETSC_i to get a real + >>>> >> imaginary >>>> >> > vector A. >>>> >> > >>>> >> > vector b is in binary, so what I have done so far (not sure if it >>>> works) >>>> >> is: >>>> >> > >>>> >> > std::ifstream input("Vector_b.bin", std::ios::binary ); >>>> >> > while (input.read(reinterpret_cast(&v), sizeof(float))) >>>> >> > ierr = >>>> VecSetValues(u,1,&iglobal,&v,INSERT_VALUES);CHKERRQ(ierr); >>>> >> > >>>> >> > where v is a PetscScalar. >>>> >> > >>>> >> > Once I am able to read both matrices I think I can figure out the >>>> solvers >>>> >> > to solve the system. >>>> >> > >>>> >> > All the best, >>>> >> > S >>>> >> >>>> >>> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> From aroli.marcellinus at gmail.com Thu Nov 8 19:05:36 2018 From: aroli.marcellinus at gmail.com (Aroli Marcellinus) Date: Fri, 9 Nov 2018 10:05:36 +0900 Subject: [petsc-users] Combine PETSc with CVode example Message-ID: Hi, Is there any simple example about using CVode in PETSc properly? Like solving ODE in each node at some 3D-mesh? Thank you. Aroli Marcellinus *Kumoh Institute of Technology**Computational Medicine Laboratory* 61 Daehak-ro (Sinpyeong-dong), Gumi, Gyeongbuk +82 10 9724 3957 KTalk ID: vondarkness -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Thu Nov 8 21:14:48 2018 From: jed at jedbrown.org (Jed Brown) Date: Thu, 08 Nov 2018 20:14:48 -0700 Subject: [petsc-users] Combine PETSc with CVode example In-Reply-To: References: Message-ID: <877ehnj61j.fsf@jedbrown.org> All PETSc TS examples can be run with -ts_type sundials. Aroli Marcellinus via petsc-users writes: > Hi, > > > Is there any simple example about using CVode in PETSc properly? Like > solving ODE in each node at some 3D-mesh? > > Thank you. 
>
> Aroli Marcellinus
>
> *Kumoh Institute of Technology**Computational Medicine Laboratory*
> 61 Daehak-ro (Sinpyeong-dong), Gumi, Gyeongbuk
> +82 10 9724 3957
> KTalk ID: vondarkness

From bsmith at mcs.anl.gov  Thu Nov 8 22:06:11 2018
From: bsmith at mcs.anl.gov (Smith, Barry F.)
Date: Fri, 9 Nov 2018 04:06:11 +0000
Subject: [petsc-users] Combine PETSc with CVode example
In-Reply-To: <877ehnj61j.fsf@jedbrown.org>
References: <877ehnj61j.fsf@jedbrown.org>
Message-ID:

> On Nov 8, 2018, at 9:14 PM, Jed Brown via petsc-users wrote:
>
> All PETSc TS examples can be run with -ts_type sundials.

   Note that in general PETSc uses its integrators across the entire grid, not a separate integrator at each point, though that is possible. You need to create a TS object for each grid point, provide the appropriate function (that operates on that one point) and then set the integrator to sundials.

   Barry

>
> Aroli Marcellinus via petsc-users writes:
>
>> Hi,
>>
>> Is there any simple example about using CVode in PETSc properly? Like
>> solving ODE in each node at some 3D-mesh?
>>
>> Thank you.
>>
>> Aroli Marcellinus
>>
>> *Kumoh Institute of Technology**Computational Medicine Laboratory*
>> 61 Daehak-ro (Sinpyeong-dong), Gumi, Gyeongbuk
>> +82 10 9724 3957
>> KTalk ID: vondarkness
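A minimal petsc4py sketch of the per-point recipe Barry describes above — one tiny TS driving a single-unknown ODE with CVODE. This assumes PETSc and petsc4py were built with SUNDIALS support (e.g. --download-sundials) and that PETSc.TS.Type.SUNDIALS is exposed by your petsc4py version; the decay equation and all names are illustrative, not code from this thread:

import sys
import petsc4py
petsc4py.init(sys.argv)
from petsc4py import PETSc

def rhs(ts, t, u, f):
    # du/dt = -u at one grid point (illustrative right-hand side)
    f[0] = -u[0]
    f.assemble()

u = PETSc.Vec().createSeq(1, comm=PETSc.COMM_SELF)  # one TS per point => length-1 state
u[0] = 1.0

ts = PETSc.TS().create(comm=PETSc.COMM_SELF)
ts.setType(PETSc.TS.Type.SUNDIALS)   # same as -ts_type sundials (CVODE)
ts.setRHSFunction(rhs, u.duplicate())
ts.setTimeStep(1.0e-2)
ts.setMaxTime(1.0)
ts.setFromOptions()                  # -ts_type on the command line still overrides
ts.solve(u)

Because of setFromOptions(), the same script can be pointed at any other PETSc integrator from the command line.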
From bsmith at mcs.anl.gov  Fri Nov 9 13:04:37 2018
From: bsmith at mcs.anl.gov (Smith, Barry F.)
Date: Fri, 9 Nov 2018 19:04:37 +0000
Subject: [petsc-users] KSPsolve performance tuning
In-Reply-To: References: <37A4F874-4BE1-4F14-9497-7EFFBD829716@mcs.anl.gov>
Message-ID: <26A5C06A-AFDF-4868-9028-2F28C77A5916@mcs.anl.gov>

   The code below looks ok (you don't need to call KSPSetOperators() repeatedly; having called it once is enough). You can run with -info | grep "Leaving PC with identical preconditioner since reuse preconditioner is set" to see if the flag is being respected. See src/ksp/pc/interface/precon.c and the function PCSetUp() for how it determines if the preconditioner should be rebuilt.

   Barry

> On Nov 9, 2018, at 4:52 AM, Edoardo alinovi wrote:
>
> Hello Barry,
>
> sorry for digging up this post again, but I have a doubt about how to reuse a preconditioner.
>
> Actually I am calling the functions in this order:
>
> - refill the new matrix
>
> - MatAssemblyBegin/end
>
> - KSPSetReusePreconditioner(ksp, PETSC_TRUE,ierr)
>
> - KSPSetOperators(ksp,A,A,ierr)
>
> - KSPSolve(ksp,rhs,x,ierr)
>
> Is this correct? Moreover, is there a way to check whether the preconditioner has been reused or not?
>
> Thank you very much for your kind comments
>
> ------
>
> Edoardo Alinovi, Ph.D.
>
> DICCA, Scuola Politecnica,
> Universita' degli Studi di Genova,
> 1, via Montallegro,
> 16145 Genova, Italy
>
> Email: edoardo.alinovi at dicca.unige.it
> Tel: +39 010 353 2540
>
> On Thu, Sep 13, 2018 at 9:25 PM Edoardo alinovi wrote:
> Yes, this is due to the fact that convection is treated implicitly in my code. This gives more stability, but at the same time the coefficients change every time step due to the change of mass fluxes. I can at least avoid the PC setup during the non-orthogonal correction, where only the rhs changes.
>
> On Sep 13, 2018, 21:19, "Smith, Barry F." wrote:
>
> > > On Sep 13, 2018, at 2:16 PM, Edoardo alinovi wrote:
> > >
> > > Yes, I need it or at least I think so. I am using the PISO algorithm, which requires solving the pressure equation twice in a time step. Since the coeffs of the matrix are different in the two cases, I have to set up the PC every time.
> >
> >    What about at the next time-step? Are the coefficients different yet again at each new timestep?
> >
> >    Barry
> >
> > > Thanks Barry!
> > >
> > > On Sep 13, 2018, 21:08, "Smith, Barry F." wrote:
> > >
> > > > On Sep 13, 2018, at 1:55 PM, Edoardo alinovi wrote:
> > > >
> > > > Hello Barry,
> > > >
> > > > Thank you very much for your reply! Maybe the best feature of PETSc is this mailing list :) You are right, CG + block Jacobi is twice as fast as hypre, but still a bit slower than OF. I definitely have to test a bigger case. It seems that hypre loses a lot of time in setting up the PC. Is this normal?
> > >
> > >    Yes, BoomerAMG (or any AMG method) has "large" setup times that don't scale as well as the solve time. Thus if you can reuse the setup for multiple solves you gain a great deal. The best case is, of course, where the same Poisson problem (with a different right hand side obviously) has to be solved many times. I noticed in your log_view that it seems to need to do a new setup for each solve.
> > >
> > >    Barry
> > >
> > > > Thank you!
> > > >
> > > > ------
> > > >
> > > > Edoardo Alinovi, Ph.D.
> > > >
> > > > DICCA, Scuola Politecnica
> > > > Universita' di Genova
> > > > 1, via Montallegro
> > > > 16145 Genova, Italy
> > > >
> > > > email: edoardo.alinovi at dicca.unige.it
> > > > Tel: +39 010 353 2540
> > > >
> > > > 2018-09-13 20:10 GMT+02:00 Smith, Barry F. :
> > > >
> > > >    What pressure solver is OpenFOAM using? Are you using the same convergence tolerance for the pressure solver for the two approaches? Have you tried PETSc with a simpler solver: -pc_type bjacobi or -pc_type asm; the problem is pretty small and maybe at this size hypre is overkill?
> > > >
> > > >    Barry
> > > >
> > > > > On Sep 13, 2018, at 12:58 PM, Edoardo alinovi wrote:
> > > > >
> > > > > Hello PETSc's friends,
> > > > >
> > > > > For a couple of weeks I have been trying to enhance the performance of my code. Actually I am solving the NS equations for a 3D problem of 220k cells with 4 procs on my laptop (i7-7800k @ 2.3GHz with dynamic overclocking). I have installed PETSc under Linux SUSE 15 in a virtual machine (I do not know if this is important or not).
> > > > >
> > > > > After some profiling, I can see that the bottleneck is inside KSPSolve while solving the pressure equation (solved with CG + hypre PC). For this reason my code is running twice as slow as OpenFOAM, and the gap is only due to the solution of the pressure. Have you got some hints for me? At this point I am sure I am doing something wrong! I have attached the log of a test simulation.
> > > > >
> > > > > ------
> > > > >
> > > > > Edoardo Alinovi, Ph.D.
> > > > >
> > > > > DICCA, Scuola Politecnica
> > > > > Universita' di Genova
> > > > > 1, via Montallegro
> > > > > 16145 Genova, Italy
> > > > >
> > > > > email: edoardo.alinovi at dicca.unige.it
> > > > > Tel: +39 010 353 2540
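The calling sequence discussed in this thread, written out as a self-contained petsc4py sketch. The drifting 1D Laplacian stands in for "refill the new matrix"; everything here is illustrative, and setReusePreconditioner is assumed to be the petsc4py spelling of KSPSetReusePreconditioner:

import sys
import petsc4py
petsc4py.init(sys.argv)
from petsc4py import PETSc

n = 100
A = PETSc.Mat().createAIJ([n, n], nnz=3)
rhs = PETSc.Vec().createSeq(n)
rhs.set(1.0)
x = rhs.duplicate()

def fill_matrix(A, shift):
    # Refill the coefficients; the sparsity pattern never changes.
    for i in range(n):
        if i > 0:
            A[i, i - 1] = -1.0
        if i < n - 1:
            A[i, i + 1] = -1.0
        A[i, i] = 2.0 + shift
    A.assemblyBegin()
    A.assemblyEnd()

ksp = PETSc.KSP().create()
ksp.setOperators(A)                # call once; the same Mat object is refilled
ksp.setReusePreconditioner(True)   # KSPSetReusePreconditioner(ksp, PETSC_TRUE)
ksp.setFromOptions()

for step in range(10):
    fill_matrix(A, 0.01 * step)
    ksp.solve(rhs, x)              # PC setup is skipped while the flag is set

Running this with -info and grepping for the "Leaving PC with identical preconditioner" message Barry quotes confirms whether the flag is being respected.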
From jychang48 at gmail.com  Fri Nov 9 17:46:29 2018
From: jychang48 at gmail.com (Justin Chang)
Date: Fri, 9 Nov 2018 16:46:29 -0700
Subject: [petsc-users] Turn off "Error on nonconvergence" for KSP solve
Message-ID:

Hi all,

When using KSP, is it possible to not throw an error (i.e., DIVERGED_ITS) when -ksp_max_it is reached?

Thanks,
Justin

From jed at jedbrown.org  Fri Nov 9 17:55:45 2018
From: jed at jedbrown.org (Jed Brown)
Date: Fri, 09 Nov 2018 16:55:45 -0700
Subject: [petsc-users] Turn off "Error on nonconvergence" for KSP solve
In-Reply-To: References: Message-ID: <8736s9iz5q.fsf@jedbrown.org>

Justin Chang via petsc-users writes:

> Hi all,
>
> When using KSP, is it possible to not throw an error (i.e., DIVERGED_ITS)
> when -ksp_max_it is reached?

You can use KSPConvergedSkip or KSP_NORM_NONE to exit with KSP_CONVERGED_ITS. Note that setting the reason to DIVERGED_ITS is not "throwing an error". There is -ksp_error_if_not_converged that can be used to actually raise an error. If you want to use a real convergence test, but not treat failing to converge as an error, you should tell the code that calls KSPSolve to accept KSP_DIVERGED_ITS. For example, there is -snes_max_linear_solve_fail.

From tempohoper at gmail.com  Mon Nov 12 04:02:46 2018
From: tempohoper at gmail.com (Sal Am)
Date: Mon, 12 Nov 2018 10:02:46 +0000
Subject: [petsc-users] Vec, Mat and binaryfiles.
In-Reply-To: <87a7mjl50v.fsf@jedbrown.org>
References: <87r2fzpj0m.fsf@jedbrown.org> <87muqkna5e.fsf@jedbrown.org> <87a7mjl50v.fsf@jedbrown.org>
Message-ID:

For anyone in the future: to read in binary files written using raw C++ write and convert them to PETSc format using the PetscBinaryIO.py script:

"""
Ax = b
Where mat A is divided into A index and A non-zero elements.
"""
import PetscBinaryIO
import numpy as np
import scipy.sparse

b_vector = np.array(np.fromfile('Vector_b.bin'      ,dtype=np.dtype((np.float64,2))));
A_matrix = np.array(np.fromfile('Matrix_A_cmplx.bin',dtype=np.dtype((np.float64,2))));  # non-zero elements
A_indexs = np.array(np.fromfile('Matrix_A_int.bin'  ,dtype=np.dtype((np.int32  ,2))));  # row/column indices of the non-zero elements

b_vector = b_vector[:,0] + 1j*b_vector[:,1]
A_matrix = A_matrix[:,0] + 1j*A_matrix[:,1]

A_matrix = scipy.sparse.csr_matrix((A_matrix, (A_indexs[:,0]-1, A_indexs[:,1]-1)))

vec = b_vector.view(PetscBinaryIO.Vec)

io = PetscBinaryIO.PetscBinaryIO()
io.writeBinaryFile('Vector_b.dat', [vec,])
io.writeBinaryFile('Matrix_A.dat', [A_matrix,])

This will output two PETSc-ready files, Vector_b.dat and Matrix_A.dat, that you can exploit further with VecLoad/MatLoad or similar.

Cheers,
S

On Thu, Nov 8, 2018 at 7:53 PM Jed Brown wrote:

> Sal Am writes:
>
> > Yes I was just hoping there'd be more than that.
> >
> > I have tried using one of them as basis:
> >
> > import PetscBinaryIO
> > import numpy as np
> > import scipy.sparse
> >
> > b_vector = np.array(np.fromfile('Vector_b.bin' ,dtype=np.dtype((np.float64,2))));
> > A_matrix = np.array(np.fromfile('Matrix_A_cmplx.bin',dtype=np.dtype((np.float64,2))));
> > A_indexs = np.array(np.fromfile('Matrix_A_int.bin' ,dtype=np.dtype((np.int32 ,2))));
> >
> > b_vector = b_vector[:,0] + 1j*b_vector[:,1]
> > A_matrix = A_matrix[:,0] + 1j*A_matrix[:,1]
> >
> > A_matrix = scipy.sparse.csr_matrix((A_matrix, (A_indexs[:,0]-1, A_indexs[:,1]-1)))
> >
> > vec = b_vector.view(PetscBinaryIO.Vec)
> >
> > io = PetscBinaryIO.PetscBinaryIO()
> > io.writeBinaryFile('Vector_b.dat', [vec,])
> > io.writeMatSciPy('Matrix_A.dat', A_matrix)
>
> The "fh" argument to writeMatSciPy is a file handle. You can use
> writeBinaryFile with matrices, in one file or separate files.
>
> > I was able to do it convert and read the Vector_b in fine.
> > So what I am having trouble with is actually converting the
> > scipy.sparse.csr_matrix into the PETSc format.
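Picking up the VecLoad/MatLoad pointer at the end of the script above: a minimal petsc4py sketch of reading the two converted files back and solving, assuming a complex-scalar PETSc build (the data is complex) and the file names the script produces. This runs in parallel as-is, with each rank receiving its share of rows:

import sys
import petsc4py
petsc4py.init(sys.argv)
from petsc4py import PETSc

viewer = PETSc.Viewer().createBinary('Matrix_A.dat', 'r')
A = PETSc.Mat().load(viewer)

viewer = PETSc.Viewer().createBinary('Vector_b.dat', 'r')
b = PETSc.Vec().load(viewer)

x = b.duplicate()
ksp = PETSc.KSP().create()
ksp.setOperators(A)
ksp.setFromOptions()   # choose solver/preconditioner from the command line
ksp.solve(b, x)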
I thought the > writeMatSciPy > > would do that by calling writeMatSparse? > > Maybe I have misunderstood something here? > > > > All the best, > > S > > > > On Thu, Nov 8, 2018 at 10:10 AM Matthew Knepley > wrote: > > > >> On Thu, Nov 8, 2018 at 4:56 AM Sal Am via petsc-users < > >> petsc-users at mcs.anl.gov> wrote: > >> > >>> Thanks, I missed that, > >>> > >>> I do care about scalability, the long term plan is to make it fully > >>> compatible on a cluster so I guess I would need to convert it to PETSc > >>> binary format before reading it in. > >>> > >>> Is this the file you were referring to > >>> > https://github.com/erdc/petsc-dev/blob/master/bin/pythonscripts/PetscBinaryIO.py > >>> > >>> Cannot seem to find docs on how to use it, do you happen to have any > >>> examples? > >>> > >> > >> The documentation is at the top of the file. You can see it by clicking > >> the link. > >> > >> Matt > >> > >> > >>> All the best, > >>> S > >>> > >>> > >>> > >>> > >>> On Wed, Nov 7, 2018 at 4:07 PM Jed Brown wrote: > >>> > >>>> Please always use "reply-all" so that your messages go to the list. > >>>> This is standard mailing list etiquette. It is important to preserve > >>>> threading for people who find this discussion later and so that we do > >>>> not waste our time re-answering the same questions that have already > >>>> been answered in private side-conversations. You'll likely get an > >>>> answer faster that way too. > >>>> > >>>> Sal Am writes: > >>>> > >>>> > Thank you Jed for the quick response! > >>>> > > >>>> >> > >>>> >> Yes, of course the formats would have to match. I would recommend > >>>> >> writing the files in an existing format such as PETSc's binary > format. > >>>> >> > >>>> > > >>>> > Unfortunately I do not think I can change the source code to output > >>>> those > >>>> > two files in PETSc format and it would probably take a very long > time > >>>> > converting everything into PETSc (it is not even my code). > >>>> > >>>> File-based workflows are very often bottlenecks. You can also use any > >>>> convenient software (e.g., Python or Matlab/Octave) to convert your > >>>> custom binary formats to PETSc binary format (see PetscBinaryIO > provided > >>>> with PETSc), at which point you'll be able to read in parallel. If > you > >>>> don't care about scalability or only need to read once, then you can > >>>> write code of the type you propose. > >>>> > >>>> > Do you have any other suggestions on how to read in those two > complex > >>>> > binary files? Also why would it be more difficult to parallelise as > >>>> > I thought getting the two files in PETSc vector format would allow > me > >>>> to > >>>> > use the rest of PETSc library the usual way? > >>>> > > >>>> > Kind regards, > >>>> > S > >>>> > > >>>> > > >>>> > On Mon, Nov 5, 2018 at 4:49 PM Jed Brown wrote: > >>>> > > >>>> >> Sal Am via petsc-users writes: > >>>> >> > >>>> >> > Hi, > >>>> >> > > >>>> >> > I am trying to solve a Ax=b complex system. the vector b and > >>>> "matrix" A > >>>> >> are > >>>> >> > both binary and NOT created by PETSc. So I keep getting error > >>>> messages > >>>> >> that > >>>> >> > they are not correct format when I read the files with > >>>> >> PetscViewBinaryOpen, > >>>> >> > after some digging it seems that one cannot just read a binary > file > >>>> that > >>>> >> > was created by another software. > >>>> >> > >>>> >> Yes, of course the formats would have to match. I would recommend > >>>> >> writing the files in an existing format such as PETSc's binary > format. 
> >>>> >> While the method you describe can be made to work, it will be more > >>>> work > >>>> >> to make it parallel. > >>>> >> > >>>> >> > How would I go on to solve this problem? > >>>> >> > > >>>> >> > More info and trials: > >>>> >> > > >>>> >> > "matrix" A consists of two files, one that contains row column > index > >>>> >> > numbers and one that contains the non-zero values. So what I > would > >>>> have > >>>> >> to > >>>> >> > do is multiply the last term in a+b with PETSC_i to get a real + > >>>> >> imaginary > >>>> >> > vector A. > >>>> >> > > >>>> >> > vector b is in binary, so what I have done so far (not sure if it > >>>> works) > >>>> >> is: > >>>> >> > > >>>> >> > std::ifstream input("Vector_b.bin", std::ios::binary ); > >>>> >> > while (input.read(reinterpret_cast(&v), sizeof(float))) > >>>> >> > ierr = > >>>> VecSetValues(u,1,&iglobal,&v,INSERT_VALUES);CHKERRQ(ierr); > >>>> >> > > >>>> >> > where v is a PetscScalar. > >>>> >> > > >>>> >> > Once I am able to read both matrices I think I can figure out the > >>>> solvers > >>>> >> > to solve the system. > >>>> >> > > >>>> >> > All the best, > >>>> >> > S > >>>> >> > >>>> > >>> > >> > >> -- > >> What most experimenters take for granted before they begin their > >> experiments is infinitely more interesting than any results to which > their > >> experiments lead. > >> -- Norbert Wiener > >> > >> https://www.cse.buffalo.edu/~knepley/ > >> > >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From heeho.park at gmail.com Mon Nov 12 17:27:27 2018 From: heeho.park at gmail.com (HeeHo Park) Date: Mon, 12 Nov 2018 17:27:27 -0600 Subject: [petsc-users] Can't figure out why I get inf solutions. In-Reply-To: References: Message-ID: Since I can't send binary files, I hardcoded the values on main.py. You only need main.py file to run the code. Thanks, On Mon, Nov 12, 2018 at 5:17 PM HeeHo Park wrote: > Hi PETSc, > > These Python looks complicated but it really isn't. please run main.py > > I am writing up a script that would read petsc binary files extracted from > one of PFLOTRAN newton iteration step and solve with petsc4py to > investigate linear solvers. > > But I am having trouble solving a matrix that converges PFLOTRAN, let > alone running a trimmed size (15x15) matrix of A. You'll see in the > script. > > I don't think there is any problem with loading sparse.csr_matrix onto > PETSc and same with array as I tested with very simple matrix which it > solves. > > Can you figure out why I'm getting inf solutions? > > If I use the sparse matrix package to solve it, it solves it very quickly. > > x = sla.spsolve(A,b) > > Thanks, > > -- > HeeHo Daniel Park > -- HeeHo Daniel Park -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: main.py Type: text/x-python Size: 7070 bytes Desc: not available URL: From sajidsyed2021 at u.northwestern.edu Mon Nov 12 20:52:50 2018 From: sajidsyed2021 at u.northwestern.edu (Sajid Ali) Date: Mon, 12 Nov 2018 20:52:50 -0600 Subject: [petsc-users] Question about DMDAGetElements Message-ID: Hi, I'm trying to understand this example from a tutorial. DM is used as an alternative to setting up the matrix and vector separately (as was done in the previous example ). Since DMDACreate1d was used to create the mesh, can someone explain this logic more clearly : /* Gets an array containing the indices (in local coordinates) of all the local elements. 
nel - number of local elements nen - number of one element's nodes e - the local indices of the elements' vertices */ ierr = DMDAGetElements(da, &nel, &nen, &e);CHKERRQ(ierr); /* Assemble matrix and RHS */ value[0] = 1.0; value[1] = -1.0; value[2] = -1.0; value[3] = 1.0; bvalue[0] = 1.0; bvalue[1] = 1.0; for (i=0; i From jed at jedbrown.org Mon Nov 12 22:50:45 2018 From: jed at jedbrown.org (Jed Brown) Date: Mon, 12 Nov 2018 21:50:45 -0700 Subject: [petsc-users] Question about DMDAGetElements In-Reply-To: References: Message-ID: <87in11inru.fsf@jedbrown.org> Sajid Ali via petsc-users writes: > Hi, > > I'm trying to understand this example > > from a tutorial. DM is used as an alternative to setting up the matrix and > vector separately (as was done in the previous example > ). > > > Since DMDACreate1d was used to create the mesh, can someone explain this > logic more clearly : > > /* > Gets an array containing the indices (in local coordinates) > of all the local elements. > nel - number of local elements > nen - number of one element's nodes > e - the local indices of the elements' vertices > */ > ierr = DMDAGetElements(da, &nel, &nen, &e);CHKERRQ(ierr); > > /* > Assemble matrix and RHS > */ > value[0] = 1.0; value[1] = -1.0; value[2] = -1.0; value[3] = 1.0; > bvalue[0] = 1.0; bvalue[1] = 1.0; > for (i=0; i ierr = MatSetValuesLocal(A, 2, e+nen*i, 2, e+nen*i, value, > ADD_VALUES);CHKERRQ(ierr); > //TODO use VecSetValuesLocal() to set the values > } > ierr = DMDARestoreElements(da, &nel, &nen, &e);CHKERRQ(ierr); > > > The dm was created using > > ierr = MPI_Allreduce(&n, &N, 1, MPI_INT, MPI_SUM, > PETSC_COMM_WORLD);CHKERRQ(ierr); > ierr = DMDACreate1d(PETSC_COMM_WORLD,DM_BOUNDARY_NONE,N,1,1,NULL,&da);CHKERRQ(ierr); So each process owns 3 vertices. The first process owns two elements, then each subsequent process gets three elements. Only the first nel*nen entries in that array are valid. nen=2 in all cases because these are 1D linear elements. Use DMPlex if you want more general finite element mesh handling. > > Using n = 3 and I get the following results when I print the values of > nel,nen and e using this statement > PetscPrintf(PETSC_COMM_SELF,"nel : %d,nen : %d, e elements : > %d,%d,%d,%d,%d,%d,%d,%d ,rank > %d\n",nel,nen,e[0],e[1],e[2],e[3],e[4],e[5],e[6],e[7],rank); > > The output comes out as : > > [sajid at xrm exercises]$ mpirun -np 4 ./ex6 > nel : 2,nen : 2, e elements : 0,1,1,2,1936617321,0,3,0 ,rank 0 > nel : 3,nen : 2, e elements : 0,1,1,2,2,3,4,0 ,rank 1 > nel : 3,nen : 2, e elements : 0,1,1,2,2,3,4,0 ,rank 2 > nel : 3,nen : 2, e elements : 0,1,1,2,2,3,3,0 ,rank 3 > > Does this mean that I'm supposed to read the first 4 elements for rank 0 as > row0, col0, row1, col1? > > And how does differ If I'm creating a vector using DMCreate1d ? > > > Thank You, > Sajid Ali > Applied Physics > Northwestern University From amfoggia at gmail.com Tue Nov 13 03:58:25 2018 From: amfoggia at gmail.com (Ale Foggia) Date: Tue, 13 Nov 2018 10:58:25 +0100 Subject: [petsc-users] [SLEPc] Krylov-Schur convergence Message-ID: Hello, I'm using SLEPc to get the smallest real eigenvalue (EPS_SMALLEST_REAL) of a Hermitian problem (EPS_HEP). The linear size of the matrices I'm solving is around 10**9 elements and they are sparse. I've asked a few questions before regarding the same problem setting and you suggested me to use Krylov-Schur (because I was using Lanczos). 
I tried KS and up to a certain matrix size the convergence (relative to the eigenvalue) is good, it's around 10**-9, like with Lanczos, but when I increase the size I start getting the eigenvalue with only 3 correct digits. I've used the options: -eps_tol 1e-9 -eps_mpd 100 (16 was the default), but the only thing I got is one more eigenvalue with the same big error, and the iterations performed were only 2. Why didn't it do more in order to reach the convergence? Should I set other parameters? I don't know how to work out this problem, can you help me with this please? I send the -eps_view output and the eigenvalues with its errors: EPS Object: 2048 MPI processes type: krylovschur 50% of basis vectors kept after restart using the locking variant problem type: symmetric eigenvalue problem selected portion of the spectrum: smallest real parts number of eigenvalues (nev): 1 number of column vectors (ncv): 101 maximum dimension of projected problem (mpd): 100 maximum number of iterations: 46210024 tolerance: 1e-09 convergence test: relative to the eigenvalue BV Object: 2048 MPI processes type: svec 102 columns of global length 2333606220 vector orthogonalization method: classical Gram-Schmidt orthogonalization refinement: if needed (eta: 0.7071) block orthogonalization method: GS doing matmult as a single matrix-matrix product DS Object: 2048 MPI processes type: hep parallel operation mode: REDUNDANT solving the problem with: Implicit QR method (_steqr) ST Object: 2048 MPI processes type: shift shift: 0. number of matrices: 1 k ||Ax-kx||/||kx|| ----------------- ------------------ -15.093051 0.00323917 (with KS) -15.087320 0.00265215 (with KS) -15.048025 8.67204e-09 (with Lanczos) Iterations performed 2 Ale -------------- next part -------------- An HTML attachment was scrubbed... URL: From jroman at dsic.upv.es Tue Nov 13 05:34:47 2018 From: jroman at dsic.upv.es (Jose E. Roman) Date: Tue, 13 Nov 2018 12:34:47 +0100 Subject: [petsc-users] [SLEPc] Krylov-Schur convergence In-Reply-To: References: Message-ID: <470A7599-766D-4477-A298-473A25B6B71D@dsic.upv.es> This is really strange. We cannot say what is going on, everything seems fine. Could you try solving the problem as non-Hermitian to see what happens? Just run with -eps_non_hermitian. Depending on the result, we can suggest other things to try. Jose > El 13 nov 2018, a las 10:58, Ale Foggia via petsc-users escribi?: > > Hello, > > I'm using SLEPc to get the smallest real eigenvalue (EPS_SMALLEST_REAL) of a Hermitian problem (EPS_HEP). The linear size of the matrices I'm solving is around 10**9 elements and they are sparse. I've asked a few questions before regarding the same problem setting and you suggested me to use Krylov-Schur (because I was using Lanczos). I tried KS and up to a certain matrix size the convergence (relative to the eigenvalue) is good, it's around 10**-9, like with Lanczos, but when I increase the size I start getting the eigenvalue with only 3 correct digits. I've used the options: -eps_tol 1e-9 -eps_mpd 100 (16 was the default), but the only thing I got is one more eigenvalue with the same big error, and the iterations performed were only 2. Why didn't it do more in order to reach the convergence? Should I set other parameters? I don't know how to work out this problem, can you help me with this please? 
I send the -eps_view output and the eigenvalues with its errors: > > EPS Object: 2048 MPI processes > type: krylovschur > 50% of basis vectors kept after restart > using the locking variant > problem type: symmetric eigenvalue problem > selected portion of the spectrum: smallest real parts > number of eigenvalues (nev): 1 > number of column vectors (ncv): 101 > maximum dimension of projected problem (mpd): 100 > maximum number of iterations: 46210024 > tolerance: 1e-09 > convergence test: relative to the eigenvalue > BV Object: 2048 MPI processes > type: svec > 102 columns of global length 2333606220 > vector orthogonalization method: classical Gram-Schmidt > orthogonalization refinement: if needed (eta: 0.7071) > block orthogonalization method: GS > doing matmult as a single matrix-matrix product > DS Object: 2048 MPI processes > type: hep > parallel operation mode: REDUNDANT > solving the problem with: Implicit QR method (_steqr) > ST Object: 2048 MPI processes > type: shift > shift: 0. > number of matrices: 1 > > k ||Ax-kx||/||kx|| > ----------------- ------------------ > -15.093051 0.00323917 (with KS) > -15.087320 0.00265215 (with KS) > -15.048025 8.67204e-09 (with Lanczos) > Iterations performed 2 > > Ale From griesser.jan at googlemail.com Tue Nov 13 05:40:05 2018 From: griesser.jan at googlemail.com (=?UTF-8?B?SmFuIEdyaWXDn2Vy?=) Date: Tue, 13 Nov 2018 12:40:05 +0100 Subject: [petsc-users] PetscInt overflow In-Reply-To: References: <97941479-55EC-4D8B-B8D4-13675EBBF628@dsic.upv.es> Message-ID: I am finally back from holidays and continued working on the problem. I think i figured out that the problem is not related to SLEPc but rather to PETSc. Until now i precomputed my matrix in python and saved it as a scipy.csr matrix( The matrix is of order 6 Mio x 6 Mio with around 0.004% nonzero elements). The code i used to convert the scipy sparse matrix to a PETSc matrix is appended below. But i am wondering if maybe the problem is that the memory of the individual processors(i use at the moment 60 cores with 24 GB each ) is simply dumped by the size of the matrix (It is around 13 Gb) or if the PETSc matrix is not correctly distributed. I am sorry for the late answer and hope someone has any suggestion. def construct_mat(): D_nn = scipy.sparse.load_npz("../PBC_DynamicalMatrix/DynamicalMatrixPB.npz") # Create a matrix Ds in parallel Dyn = PETSc.Mat().create() Dyn.setSizes(CSRMatrix.shape) Dyn.setFromOptions() Dyn.setUp() Rstart, Rend = Dyn.getOwnershipRange() # Fill the matrix csr = ( CSRMatrix.indptr[Rstart:Rend+1] - CSRMatrix.indptr[Rstart], CSRMatrix.indices[CSRMatrix.indptr[Rstart]:CSRMatrix.indptr[Rend]], CSRMatrix.data[CSRMatrix.indptr[Rstart]:CSRMatrix.indptr[Rend]] ) D = PETSc.Mat().createAIJ(size=CSRMatrix.shape, csr=csr) D.assemble() # Free the memory del CSRDynamicalMatrix_nn # Return the PETSc dynamical matrix return D Am Mi., 24. Okt. 2018 um 18:46 Uhr schrieb Matthew Knepley < knepley at gmail.com>: > As it says, if you are looking at performance, you should configure using > --with-debugging=0. > > It says SLEPc is using 8GB. Is this what you see? > > Matt > > On Wed, Oct 24, 2018 at 12:30 PM Jan Grie?er > wrote: > >> I also run it with the -log_summary : >> ---------------------------------------------- PETSc Performance Summary: >> ---------------------------------------------- >> >> >> >> ########################################################## >> # # >> # WARNING!!! 
# >> # # >> # This code was compiled with a debugging option, # >> # To get timing results run ./configure # >> # using --with-debugging=no, the performance will # >> # be generally two or three times faster. # >> # # >> ########################################################## >> >> >> /work/ws/nemo/fr_jg1080-FreeSurface_Glass-0/GlassSystems/PeriodicSystems/N500000T0.001/SolveEigenvalueProblem_par/Test/Eigensolver_petsc_slepc_no_argparse.py >> on a arch-linux2-c-debug named int02.nemo.privat with 20 processors, by >> fr_jg1080 Wed Oct 24 18:26:30 2018 >> Using Petsc Release Version 3.9.4, Sep, 11, 2018 >> >> Max Max/Min Avg Total >> Time (sec): 7.474e+02 1.00000 7.474e+02 >> Objects: 3.600e+01 1.00000 3.600e+01 >> Flop: 1.090e+11 1.00346 1.089e+11 2.177e+12 >> Flop/sec: 1.459e+08 1.00346 1.457e+08 2.913e+09 >> Memory: 3.950e+08 1.00296 7.891e+09 >> MPI Messages: 3.808e+04 1.00000 3.808e+04 7.615e+05 >> MPI Message Lengths: 2.211e+10 1.00217 5.802e+05 4.419e+11 >> MPI Reductions: 1.023e+05 1.00000 >> >> Flop counting convention: 1 flop = 1 real number operation of type >> (multiply/divide/add/subtract) >> e.g., VecAXPY() for real vectors of length N >> --> 2N flop >> and VecAXPY() for complex vectors of length N >> --> 8N flop >> >> Summary of Stages: ----- Time ------ ----- Flop ----- --- Messages >> --- -- Message Lengths -- -- Reductions -- >> Avg %Total Avg %Total counts >> %Total Avg %Total counts %Total >> 0: Main Stage: 7.4739e+02 100.0% 2.1773e+12 100.0% 7.615e+05 >> 100.0% 5.802e+05 100.0% 1.022e+05 100.0% >> >> >> ------------------------------------------------------------------------------------------------------------------------ >> See the 'Profiling' chapter of the users' manual for details on >> interpreting output. >> Phase summary info: >> Count: number of times phase was executed >> Time and Flop: Max - maximum over all processors >> Ratio - ratio of maximum to minimum over all processors >> Mess: number of messages sent >> Avg. len: average message length (bytes) >> Reduct: number of global reductions >> Global: entire computation >> Stage: stages of a computation. Set stages with PetscLogStagePush() >> and PetscLogStagePop(). >> %T - percent time in this phase %F - percent flop in this >> phase >> %M - percent messages in this phase %L - percent message >> lengths in this phase >> %R - percent reductions in this phase >> Total Mflop/s: 10e-6 * (sum of flop over all processors)/(max time >> over all processors) >> >> ------------------------------------------------------------------------------------------------------------------------ >> >> >> ########################################################## >> # # >> # WARNING!!! # >> # # >> # This code was compiled with a debugging option, # >> # To get timing results run ./configure # >> # using --with-debugging=no, the performance will # >> # be generally two or three times faster. 
# >> # # >> ########################################################## >> >> >> Event Count Time (sec) Flop >> --- Global --- --- Stage --- Total >> Max Ratio Max Ratio Max Ratio Mess Avg len >> Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s >> >> ------------------------------------------------------------------------------------------------------------------------ >> >> --- Event Stage 0: Main Stage >> >> BuildTwoSidedF 2 1.0 2.6670e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> VecSet 2 1.0 6.8650e-03 1.8 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> VecScatterBegin 2002 1.0 1.4380e+01 1.0 0.00e+00 0.0 7.6e+05 5.8e+05 >> 0.0e+00 2 0100100 0 2 0100100 0 0 >> VecScatterEnd 2002 1.0 3.7604e+01 1.5 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 4 0 0 0 0 4 0 0 0 0 0 >> VecSetRandom 1 1.0 1.6440e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> MatMult 2002 1.0 6.0846e+02 1.2 1.03e+11 1.0 7.6e+05 5.8e+05 >> 0.0e+00 71 94100100 0 71 94100100 0 3376 >> MatAssemblyBegin 3 1.0 2.8129e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 6.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> MatAssemblyEnd 3 1.0 8.5094e+00 1.0 0.00e+00 0.0 7.6e+02 1.5e+05 >> 3.6e+01 1 0 0 0 0 1 0 0 0 0 0 >> EPSSetUp 1 1.0 1.7351e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 7.6e+01 0 0 0 0 0 0 0 0 0 0 0 >> EPSSolve 1 1.0 6.7891e+02 1.0 1.09e+11 1.0 7.6e+05 5.8e+05 >> 1.0e+05 91100100100100 91100100100100 3207 >> STSetUp 1 1.0 2.2221e-04 1.3 0.00e+00 0.0 0.0e+00 0.0e+00 >> 6.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> STApply 2002 1.0 6.0879e+02 1.2 1.03e+11 1.0 7.6e+05 5.8e+05 >> 0.0e+00 71 94100100 0 71 94100100 0 3374 >> BVCopy 999 1.0 2.7157e-01 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> BVMultVec 4004 1.0 2.2918e+00 1.0 2.10e+09 1.0 0.0e+00 0.0e+00 >> 1.6e+04 0 2 0 0 16 0 2 0 0 16 18332 >> BVMultInPlace 999 1.0 4.8399e+01 1.0 1.20e+09 1.0 0.0e+00 0.0e+00 >> 0.0e+00 6 1 0 0 0 6 1 0 0 0 495 >> BVDotVec 4004 1.0 1.0835e+01 1.0 2.70e+09 1.0 0.0e+00 0.0e+00 >> 2.0e+04 1 2 0 0 20 1 2 0 0 20 4986 >> BVOrthogonalizeV 2003 1.0 1.3272e+01 1.0 4.80e+09 1.0 0.0e+00 0.0e+00 >> 5.2e+04 2 4 0 0 51 2 4 0 0 51 7236 >> BVScale 2003 1.0 2.3521e-01 1.0 1.50e+08 1.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 12773 >> BVSetRandom 1 1.0 1.6456e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 4.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> DSSolve 1000 1.0 3.3338e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> DSVectors 1000 1.0 6.0029e-03 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> DSOther 2999 1.0 7.8770e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> >> ------------------------------------------------------------------------------------------------------------------------ >> >> Memory usage is given in bytes: >> >> Object Type Creations Destructions Memory Descendants' >> Mem. >> Reports information only for process 0. >> >> --- Event Stage 0: Main Stage >> >> Viewer 2 1 840 0. >> PetscRandom 1 1 646 0. >> Index Set 4 4 5510472 0. >> Vector 9 9 11629608 0. >> Vec Scatter 2 2 1936 0. >> Matrix 10 10 331855732 0. >> Preconditioner 1 1 1000 0. >> Krylov Solver 1 1 1176 0. >> EPS Solver 1 1 1600 0. >> Spectral Transform 2 2 1624 0. >> Basis Vectors 1 1 2168 0. >> Direct Solver 1 1 2520 0. >> Region 1 1 672 0. 
>> >> ======================================================================================================================== >> Average time to get PetscTime(): 1.19209e-07 >> Average time for MPI_Barrier(): 2.67982e-05 >> Average time for zero size MPI_Send(): 1.08957e-05 >> #PETSc Option Table entries: >> -bv_type mat >> -eps_view_pre >> -log_summary >> #End of PETSc Option Table entries >> Compiled without FORTRAN kernels >> Compiled with full precision matrices (default) >> sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 >> sizeof(PetscScalar) 8 sizeof(PetscInt) 4 >> Configure options: --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 >> --download-mumps --with-shared-libraries=True --download-scalapack >> ----------------------------------------- >> Libraries compiled on 2018-10-17 20:02:31 on login2.nemo.privat >> Machine characteristics: >> Linux-3.10.0-693.21.1.el7.x86_64-x86_64-with-centos-7.4.1708-Core >> Using PETSc directory: /home/fr/fr_fr/fr_jg1080/Libaries/petsc-3.9.4 >> Using PETSc arch: arch-linux2-c-debug >> ----------------------------------------- >> >> Using C compiler: mpicc -fPIC -wd1572 -g >> Using Fortran compiler: mpif90 -fPIC -g >> ----------------------------------------- >> >> Using include paths: >> -I/home/fr/fr_fr/fr_jg1080/Libaries/petsc-3.9.4/include >> -I/home/fr/fr_fr/fr_jg1080/Libaries/petsc-3.9.4/arch-linux2-c-debug/include >> ----------------------------------------- >> >> Using C linker: mpicc >> Using Fortran linker: mpif90 >> Using libraries: >> -Wl,-rpath,/home/fr/fr_fr/fr_jg1080/Libaries/petsc-3.9.4/arch-linux2-c-debug/lib >> -L/home/fr/fr_fr/fr_jg1080/Libaries/petsc-3.9.4/arch-linux2-c-debug/lib >> -lpetsc >> -Wl,-rpath,/home/fr/fr_fr/fr_jg1080/Libaries/petsc-3.9.4/arch-linux2-c-debug/lib >> -L/home/fr/fr_fr/fr_jg1080/Libaries/petsc-3.9.4/arch-linux2-c-debug/lib >> -Wl,-rpath,/opt/bwhpc/common/compiler/intel/2018.3.222/compilers_and_libraries/linux/mpi/intel64/lib/debug_mt >> -L/opt/bwhpc/common/compiler/intel/2018.3.222/compilers_and_libraries/linux/mpi/intel64/lib/debug_mt >> -Wl,-rpath,/opt/bwhpc/common/compiler/intel/2018.3.222/compilers_and_libraries/linux/mpi/intel64/lib >> -L/opt/bwhpc/common/compiler/intel/2018.3.222/compilers_and_libraries/linux/mpi/intel64/lib >> -Wl,-rpath,/opt/bwhpc/common/compiler/intel/2018.3.222/compilers_and_libraries/linux/lib/intel64 >> -L/opt/bwhpc/common/compiler/intel/2018.3.222/compilers_and_libraries/linux/lib/intel64 >> -Wl,-rpath,/opt/bwhpc/common/compiler/intel/2018.3.222/compilers_and_libraries_2018.3.222/linux/compiler/lib/intel64_lin >> -L/opt/bwhpc/common/compiler/intel/2018.3.222/compilers_and_libraries_2018.3.222/linux/compiler/lib/intel64_lin >> -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/4.8.5 >> -L/usr/lib/gcc/x86_64-redhat-linux/4.8.5 >> -Wl,-rpath,/opt/intel/mpi-rt/2017.0.0/intel64/lib/debug_mt >> -Wl,-rpath,/opt/intel/mpi-rt/2017.0.0/intel64/lib -lcmumps -ldmumps >> -lsmumps -lzmumps -lmumps_common -lpord -lscalapack -llapack -lblas -lX11 >> -lstdc++ -ldl -lmpifort -lmpi -lmpigi -lrt -lpthread -lifport >> -lifcoremt_pic -limf -lsvml -lm -lipgo -lirc -lgcc_s -lirc_s -lstdc++ -ldl >> ----------------------------------------- >> >> >> >> ########################################################## >> # # >> # WARNING!!! # >> # # >> # This code was compiled with a debugging option, # >> # To get timing results run ./configure # >> # using --with-debugging=no, the performance will # >> # be generally two or three times faster. 
# >> # # >> ########################################################## >> >> >> Am Mi., 24. Okt. 2018 um 18:07 Uhr schrieb Jan Grie?er < >> griesser.jan at googlemail.com>: >> >>> For some reason i get only this error message, also when is use the >>> -eps_view_pre. I started the program with nev=1, there the output is (with >>> -bv_type vecs -bv_type mat -eps_view_pre) >>> EPS Object: 20 MPI processes >>> type: krylovschur >>> 50% of basis vectors kept after restart >>> using the locking variant >>> problem type: symmetric eigenvalue problem >>> selected portion of the spectrum: smallest real parts >>> number of eigenvalues (nev): 1 >>> number of column vectors (ncv): 3 >>> maximum dimension of projected problem (mpd): 2 >>> maximum number of iterations: 1000 >>> tolerance: 1e-08 >>> convergence test: relative to the eigenvalue >>> BV Object: 20 MPI processes >>> type: mat >>> 4 columns of global length 1500000 >>> vector orthogonalization method: classical Gram-Schmidt >>> orthogonalization refinement: if needed (eta: 0.7071) >>> block orthogonalization method: GS >>> doing matmult as a single matrix-matrix product >>> DS Object: 20 MPI processes >>> type: hep >>> parallel operation mode: REDUNDANT >>> solving the problem with: Implicit QR method (_steqr) >>> ST Object: 20 MPI processes >>> type: shift >>> shift: 0. >>> number of matrices: 1 >>> >>> >>> >>> >>> Am Mi., 24. Okt. 2018 um 16:14 Uhr schrieb Matthew Knepley < >>> knepley at gmail.com>: >>> >>>> On Wed, Oct 24, 2018 at 10:03 AM Jan Grie?er < >>>> griesser.jan at googlemail.com> wrote: >>>> >>>>> This is the error message i get from my program: >>>>> Shape of the dynamical matrix: (1500000, 1500000) >>>>> >>>> >>>> Petsc installs a signal handler, so there should be a PETSc-specific >>>> message before this one. Is something eating >>>> your output? >>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>>>> >>>>> =================================================================================== >>>>> = BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES >>>>> = PID 122676 RUNNING AT n3512.nemo.privat >>>>> = EXIT CODE: 9 >>>>> = CLEANING UP REMAINING PROCESSES >>>>> = YOU CAN IGNORE THE BELOW CLEANUP MESSAGES >>>>> >>>>> =================================================================================== >>>>> Intel(R) MPI Library troubleshooting guide: >>>>> https://software.intel.com/node/561764 >>>>> >>>>> =================================================================================== >>>>> >>>>> >>>>> Am Mi., 24. Okt. 2018 um 16:01 Uhr schrieb Matthew Knepley < >>>>> knepley at gmail.com>: >>>>> >>>>>> On Wed, Oct 24, 2018 at 9:38 AM Jan Grie?er < >>>>>> griesser.jan at googlemail.com> wrote: >>>>>> >>>>>>> Hey, >>>>>>> i tried to run my program as you said with -bv_type vecs and/or >>>>>>> -bv_type mat, but instead of the PETScInt overflow i now get an MPI Error 9 >>>>>>> >>>>>> >>>>>> Send the actual error. >>>>>> >>>>>> Thanks, >>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> message, which i assume (after googling a little bit around) should >>>>>>> be a memory problem. I tried to run it also on slightly bigger compute >>>>>>> nodes on our cluster with 20 cores with each 12 GB and 24 GB but the >>>>>>> problem still remains. The actual limit appears to be a 1.5 Million x 1.5 >>>>>>> Million where i searched for NEV = 1500 and MPD = 500/ 200 eigenvalues. >>>>>>> Do you have maybe an idea what the error could be? I appended also >>>>>>> the python method i used to solve the problem. 
I also tried to solve the >>>>>>> problem with spectrum solving but the error message remains the same. >>>>>>> >>>>>>> def solve_eigensystem(DynMatrix_nn, NEV, MPD, Dimension): >>>>>>> # Create the solver >>>>>>> # E is used as an acronym for the EPS solver (EPS = Eigenvalue >>>>>>> problem solver) >>>>>>> E = SLEPc.EPS().create() >>>>>>> >>>>>>> # Set the preconditioner >>>>>>> pc = PETSc.PC().create() >>>>>>> pc.setType(pc.Type.BJACOBI) >>>>>>> >>>>>>> # Set the linear solver >>>>>>> # Create the KSP object >>>>>>> ksp = PETSc.KSP().create() >>>>>>> # Create the solver, in this case GMRES >>>>>>> ksp.setType(ksp.Type.GMRES) >>>>>>> # Set the tolerances of the GMRES solver >>>>>>> # Link it to the PC >>>>>>> ksp.setPC(pc) >>>>>>> >>>>>>> # Set up the spectral transformations >>>>>>> st = SLEPc.ST().create() >>>>>>> st.setType("shift") >>>>>>> st.setKSP(ksp) >>>>>>> # MPD stands for "maximum projected dimension". It has to due with >>>>>>> computational cost, please read Chap. 2.6.5 of SLEPc docu for >>>>>>> # an explanation. At the moment mpd is only a guess >>>>>>> E.setDimensions(nev=NEV, mpd = MPD) >>>>>>> # Eigenvalues should be real, therefore we start to order them from >>>>>>> the smallest real value |l.real| >>>>>>> E.setWhichEigenpairs(E.Which.SMALLEST_REAL) >>>>>>> # Since the dynamical matrix is symmetric and real it is hermitian >>>>>>> E.setProblemType(SLEPc.EPS.ProblemType.HEP) >>>>>>> # Use the Krylov Schur method to solve the eigenvalue problem >>>>>>> E.setType(E.Type.KRYLOVSCHUR) >>>>>>> # Set the convergence criterion to relative to the eigenvalue and >>>>>>> the maximal number of iterations >>>>>>> E.setConvergenceTest(E.Conv.REL) >>>>>>> E.setTolerances(tol = 1e-8, max_it = 5000) >>>>>>> # Set the matrix in order to solve >>>>>>> E.setOperators(DynMatrix_nn, None) >>>>>>> # Sets EPS options from the options database. This routine must be >>>>>>> called before `setUp()` if the user is to be allowed to set dthe solver >>>>>>> type. >>>>>>> E.setFromOptions() >>>>>>> # Sets up all the internal data structures necessary for the >>>>>>> execution of the eigensolver. >>>>>>> E.setUp() >>>>>>> >>>>>>> # Solve eigenvalue problem >>>>>>> E.solve() >>>>>>> >>>>>>> Print = PETSc.Sys.Print >>>>>>> >>>>>>> Print() >>>>>>> Print("****************************") >>>>>>> Print("***SLEPc Solution Results***") >>>>>>> Print("****************************") >>>>>>> >>>>>>> its = E.getIterationNumber() >>>>>>> Print("Number of iterations of the method: ", its) >>>>>>> eps_type = E.getType() >>>>>>> Print("Solution method: ", eps_type) >>>>>>> nev, ncv, mpd = E.getDimensions() >>>>>>> Print("Number of requested eigenvalues: ", nev) >>>>>>> Print("Number of computeded eigenvectors: ", ncv) >>>>>>> tol, maxit = E.getTolerances() >>>>>>> Print("Stopping condition: (tol, maxit)", (tol, maxit)) >>>>>>> # Get the type of convergence >>>>>>> conv_test = E.getConvergenceTest() >>>>>>> Print("Selected convergence test: ", conv_test) >>>>>>> # Get the used spectral transformation >>>>>>> get_st = E.getST() >>>>>>> Print("Selected spectral transformation: ", get_st) >>>>>>> # Get the applied direct solver >>>>>>> get_ksp = E.getDS() >>>>>>> Print("Selected direct solver: ", get_ksp) >>>>>>> nconv = E.getConverged() >>>>>>> Print("Number of converged eigenpairs: ", nconv) >>>>>>> ..... >>>>>>> >>>>>>> >>>>>>> >>>>>>> Am Fr., 19. Okt. 2018 um 21:00 Uhr schrieb Smith, Barry F. 
< >>>>>>> bsmith at mcs.anl.gov>: >>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> > On Oct 19, 2018, at 7:56 AM, Zhang, Junchao >>>>>>>> wrote: >>>>>>>> > >>>>>>>> > >>>>>>>> > On Fri, Oct 19, 2018 at 4:02 AM Jan Grie?er < >>>>>>>> griesser.jan at googlemail.com> wrote: >>>>>>>> > With more than 1 MPI process you mean i should use spectrum >>>>>>>> slicing in divide the full problem in smaller subproblems? >>>>>>>> > The --with-64-bit-indices is not a possibility for me since i >>>>>>>> configured petsc with mumps, which does not allow to use the 64-bit version >>>>>>>> (At least this was the error message when i tried to configure PETSc ) >>>>>>>> > >>>>>>>> > MUMPS 5.1.2 manual chapter 2.4.2 says it supports "Selective >>>>>>>> 64-bit integer feature" and "full 64-bit integer version" as well. >>>>>>>> >>>>>>>> They use to achieve this by compiling with special Fortran >>>>>>>> flags to promote integers to 64 bit; this is too fragile for our taste so >>>>>>>> we never hooked PETSc up wit it. If they have a dependable way of using 64 >>>>>>>> bit integers we should add that to our mumps.py and test it. >>>>>>>> >>>>>>>> Barry >>>>>>>> >>>>>>>> > >>>>>>>> > Am Mi., 17. Okt. 2018 um 18:24 Uhr schrieb Jose E. Roman < >>>>>>>> jroman at dsic.upv.es>: >>>>>>>> > To use BVVECS just add the command-line option -bv_type vecs >>>>>>>> > This causes to use a separate Vec for each column, instead of a >>>>>>>> single long Vec of size n*m. But it is considerably slower than the default. >>>>>>>> > >>>>>>>> > Anyway, for such large problems you should consider using more >>>>>>>> than 1 MPI process. In that case the error may disappear because the local >>>>>>>> size is smaller than 768000. >>>>>>>> > >>>>>>>> > Jose >>>>>>>> > >>>>>>>> > >>>>>>>> > > El 17 oct 2018, a las 17:58, Matthew Knepley >>>>>>>> escribi?: >>>>>>>> > > >>>>>>>> > > On Wed, Oct 17, 2018 at 11:54 AM Jan Grie?er < >>>>>>>> griesser.jan at googlemail.com> wrote: >>>>>>>> > > Hi all, >>>>>>>> > > i am using slepc4py and petsc4py to solve for the smallest real >>>>>>>> eigenvalues and eigenvectors. For my test cases with a matrix A of the size >>>>>>>> 30k x 30k solving for the smallest soutions works quite well, but when i >>>>>>>> increase the dimension of my system to around A = 768000 x 768000 or 3 >>>>>>>> million x 3 million and ask for the smallest real 3000 (the number is >>>>>>>> increasing with increasing system size) eigenvalues and eigenvectors i get >>>>>>>> the output (for the 768000): >>>>>>>> > > The product 4001 times 768000 overflows the size of PetscInt; >>>>>>>> consider reducing the number of columns, or use BVVECS instead >>>>>>>> > > i understand that the requested number of eigenvectors and >>>>>>>> eigenvalues is causing an overflow but i do not understand the solution of >>>>>>>> the problem which is stated in the error message. Can someone tell me what >>>>>>>> exactly BVVECS is and how i can use it? Or is there any other solution to >>>>>>>> my problem ? >>>>>>>> > > >>>>>>>> > > You can also reconfigure with 64-bit integers: >>>>>>>> --with-64-bit-indices >>>>>>>> > > >>>>>>>> > > Thanks, >>>>>>>> > > >>>>>>>> > > Matt >>>>>>>> > > >>>>>>>> > > Thank you very much in advance, >>>>>>>> > > Jan >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > -- >>>>>>>> > > What most experimenters take for granted before they begin >>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>> their experiments lead. 
>>>>>>>> > > -- Norbert Wiener >>>>>>>> > > >>>>>>>> > > https://www.cse.buffalo.edu/~knepley/ >>>>>>>> > >>>>>>>> >>>>>>>> >>>>>> >>>>>> -- >>>>>> What most experimenters take for granted before they begin their >>>>>> experiments is infinitely more interesting than any results to which their >>>>>> experiments lead. >>>>>> -- Norbert Wiener >>>>>> >>>>>> https://www.cse.buffalo.edu/~knepley/ >>>>>> >>>>>> >>>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>>> https://www.cse.buffalo.edu/~knepley/ >>>> >>>> >>> > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From amfoggia at gmail.com Tue Nov 13 06:47:13 2018 From: amfoggia at gmail.com (Ale Foggia) Date: Tue, 13 Nov 2018 13:47:13 +0100 Subject: [petsc-users] [SLEPc] Krylov-Schur convergence In-Reply-To: <470A7599-766D-4477-A298-473A25B6B71D@dsic.upv.es> References: <470A7599-766D-4477-A298-473A25B6B71D@dsic.upv.es> Message-ID: Thanks Jose for the answer. I tried it an it worked! I send the output of -eps_view again, just in case: EPS Object: 1024 MPI processes type: krylovschur 50% of basis vectors kept after restart using the locking variant problem type: non-symmetric eigenvalue problem selected portion of the spectrum: smallest real parts number of eigenvalues (nev): 1 number of column vectors (ncv): 16 maximum dimension of projected problem (mpd): 16 maximum number of iterations: 291700777 tolerance: 1e-09 convergence test: relative to the eigenvalue BV Object: 1024 MPI processes type: svec 17 columns of global length 2333606220 vector orthogonalization method: classical Gram-Schmidt orthogonalization refinement: if needed (eta: 0.7071) block orthogonalization method: GS doing matmult as a single matrix-matrix product DS Object: 1024 MPI processes type: nhep parallel operation mode: REDUNDANT ST Object: 1024 MPI processes type: shift shift: 0. number of matrices: 1 k ||Ax-kx||/||kx|| ----------------- ------------------ -15.048025 1.85112e-10 -15.047159 3.13104e-10 Iterations performed 18 Why does treating it as non-Hermitian help the convergence? Why doesn't this happen with Lanczos? I'm lost :/ Ale El mar., 13 nov. 2018 a las 12:34, Jose E. Roman () escribi?: > This is really strange. We cannot say what is going on, everything seems > fine. > Could you try solving the problem as non-Hermitian to see what happens? > Just run with -eps_non_hermitian. Depending on the result, we can suggest > other things to try. > Jose > > > > El 13 nov 2018, a las 10:58, Ale Foggia via petsc-users < > petsc-users at mcs.anl.gov> escribi?: > > > > Hello, > > > > I'm using SLEPc to get the smallest real eigenvalue (EPS_SMALLEST_REAL) > of a Hermitian problem (EPS_HEP). The linear size of the matrices I'm > solving is around 10**9 elements and they are sparse. I've asked a few > questions before regarding the same problem setting and you suggested me to > use Krylov-Schur (because I was using Lanczos). 
I tried KS and up to a > certain matrix size the convergence (relative to the eigenvalue) is good, > it's around 10**-9, like with Lanczos, but when I increase the size I start > getting the eigenvalue with only 3 correct digits. I've used the options: > -eps_tol 1e-9 -eps_mpd 100 (16 was the default), but the only thing I got > is one more eigenvalue with the same big error, and the iterations > performed were only 2. Why didn't it do more in order to reach the > convergence? Should I set other parameters? I don't know how to work out > this problem, can you help me with this please? I send the -eps_view output > and the eigenvalues with its errors: > > > > EPS Object: 2048 MPI processes > > type: krylovschur > > 50% of basis vectors kept after restart > > using the locking variant > > problem type: symmetric eigenvalue problem > > selected portion of the spectrum: smallest real parts > > number of eigenvalues (nev): 1 > > number of column vectors (ncv): 101 > > maximum dimension of projected problem (mpd): 100 > > maximum number of iterations: 46210024 > > tolerance: 1e-09 > > convergence test: relative to the eigenvalue > > BV Object: 2048 MPI processes > > type: svec > > 102 columns of global length 2333606220 > > vector orthogonalization method: classical Gram-Schmidt > > orthogonalization refinement: if needed (eta: 0.7071) > > block orthogonalization method: GS > > doing matmult as a single matrix-matrix product > > DS Object: 2048 MPI processes > > type: hep > > parallel operation mode: REDUNDANT > > solving the problem with: Implicit QR method (_steqr) > > ST Object: 2048 MPI processes > > type: shift > > shift: 0. > > number of matrices: 1 > > > > k ||Ax-kx||/||kx|| > > ----------------- ------------------ > > -15.093051 0.00323917 (with KS) > > -15.087320 0.00265215 (with KS) > > -15.048025 8.67204e-09 (with Lanczos) > > Iterations performed 2 > > > > Ale > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Nov 13 06:56:42 2018 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 13 Nov 2018 07:56:42 -0500 Subject: [petsc-users] [SLEPc] Krylov-Schur convergence In-Reply-To: References: <470A7599-766D-4477-A298-473A25B6B71D@dsic.upv.es> Message-ID: On Tue, Nov 13, 2018 at 7:48 AM Ale Foggia via petsc-users < petsc-users at mcs.anl.gov> wrote: > Thanks Jose for the answer. I tried it an it worked! I send the output of > -eps_view again, just in case: > > EPS Object: 1024 MPI processes > type: krylovschur > 50% of basis vectors kept after restart > using the locking variant > problem type: non-symmetric eigenvalue problem > selected portion of the spectrum: smallest real parts > number of eigenvalues (nev): 1 > number of column vectors (ncv): 16 > maximum dimension of projected problem (mpd): 16 > maximum number of iterations: 291700777 > tolerance: 1e-09 > convergence test: relative to the eigenvalue > BV Object: 1024 MPI processes > type: svec > 17 columns of global length 2333606220 > vector orthogonalization method: classical Gram-Schmidt > orthogonalization refinement: if needed (eta: 0.7071) > block orthogonalization method: GS > doing matmult as a single matrix-matrix product > DS Object: 1024 MPI processes > type: nhep > parallel operation mode: REDUNDANT > ST Object: 1024 MPI processes > type: shift > shift: 0. 
> number of matrices: 1 > > k ||Ax-kx||/||kx|| > ----------------- ------------------ > -15.048025 1.85112e-10 > -15.047159 3.13104e-10 > > Iterations performed 18 > > Why does treating it as non-Hermitian help the convergence? Why doesn't > this happen with Lanczos? I'm lost :/ > Are you sure your matrix is Hermitian? Lanczos will work for non-Hermitian things (I believe). Thanks, Matt > > Ale > > El mar., 13 nov. 2018 a las 12:34, Jose E. Roman () > escribi?: > >> This is really strange. We cannot say what is going on, everything seems >> fine. >> Could you try solving the problem as non-Hermitian to see what happens? >> Just run with -eps_non_hermitian. Depending on the result, we can suggest >> other things to try. >> Jose >> >> >> > El 13 nov 2018, a las 10:58, Ale Foggia via petsc-users < >> petsc-users at mcs.anl.gov> escribi?: >> > >> > Hello, >> > >> > I'm using SLEPc to get the smallest real eigenvalue (EPS_SMALLEST_REAL) >> of a Hermitian problem (EPS_HEP). The linear size of the matrices I'm >> solving is around 10**9 elements and they are sparse. I've asked a few >> questions before regarding the same problem setting and you suggested me to >> use Krylov-Schur (because I was using Lanczos). I tried KS and up to a >> certain matrix size the convergence (relative to the eigenvalue) is good, >> it's around 10**-9, like with Lanczos, but when I increase the size I start >> getting the eigenvalue with only 3 correct digits. I've used the options: >> -eps_tol 1e-9 -eps_mpd 100 (16 was the default), but the only thing I got >> is one more eigenvalue with the same big error, and the iterations >> performed were only 2. Why didn't it do more in order to reach the >> convergence? Should I set other parameters? I don't know how to work out >> this problem, can you help me with this please? I send the -eps_view output >> and the eigenvalues with its errors: >> > >> > EPS Object: 2048 MPI processes >> > type: krylovschur >> > 50% of basis vectors kept after restart >> > using the locking variant >> > problem type: symmetric eigenvalue problem >> > selected portion of the spectrum: smallest real parts >> > number of eigenvalues (nev): 1 >> > number of column vectors (ncv): 101 >> > maximum dimension of projected problem (mpd): 100 >> > maximum number of iterations: 46210024 >> > tolerance: 1e-09 >> > convergence test: relative to the eigenvalue >> > BV Object: 2048 MPI processes >> > type: svec >> > 102 columns of global length 2333606220 >> > vector orthogonalization method: classical Gram-Schmidt >> > orthogonalization refinement: if needed (eta: 0.7071) >> > block orthogonalization method: GS >> > doing matmult as a single matrix-matrix product >> > DS Object: 2048 MPI processes >> > type: hep >> > parallel operation mode: REDUNDANT >> > solving the problem with: Implicit QR method (_steqr) >> > ST Object: 2048 MPI processes >> > type: shift >> > shift: 0. >> > number of matrices: 1 >> > >> > k ||Ax-kx||/||kx|| >> > ----------------- ------------------ >> > -15.093051 0.00323917 (with KS) >> > -15.087320 0.00265215 (with KS) >> > -15.048025 8.67204e-09 (with Lanczos) >> > Iterations performed 2 >> > >> > Ale >> >> -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: 

From knepley at gmail.com  Tue Nov 13 06:58:05 2018
From: knepley at gmail.com (Matthew Knepley)
Date: Tue, 13 Nov 2018 07:58:05 -0500
Subject: [petsc-users] PetscInt overflow
In-Reply-To: 
References: <97941479-55EC-4D8B-B8D4-13675EBBF628@dsic.upv.es>
Message-ID: 

On Tue, Nov 13, 2018 at 6:40 AM Jan Grießer wrote:

> I am finally back from holidays and continued working on the problem. I
> think I figured out that the problem is not related to SLEPc but rather to
> PETSc. Until now I precomputed my matrix in Python and saved it as a
> scipy.csr matrix (the matrix is of order 6 Mio x 6 Mio with around 0.004%
> nonzero elements). The code I used to convert the SciPy sparse matrix to a
> PETSc matrix is appended below. But I am wondering if maybe the problem is
> that the memory of the individual processors (I use at the moment 60 cores
> with 24 GB each) is simply exhausted by the size of the matrix (it is
> around 13 GB), or if the PETSc matrix is not correctly distributed.
> I am sorry for the late answer and hope someone has a suggestion.

The code below makes it look like you are loading the entire CSR matrix on
every process (load_npz) and then selecting a portion of it to stick into
PETSc. The memory footprint of this strategy will explode.

  Thanks,

    Matt

> def construct_mat():
>     # Load the precomputed SciPy CSR matrix (note: load_npz runs on
>     # every rank, so each process holds the full matrix)
>     CSRMatrix = scipy.sparse.load_npz("../PBC_DynamicalMatrix/DynamicalMatrixPB.npz")
>     # Create a helper matrix in parallel, only to obtain this rank's
>     # row ownership range
>     Dyn = PETSc.Mat().create()
>     Dyn.setSizes(CSRMatrix.shape)
>     Dyn.setFromOptions()
>     Dyn.setUp()
>     Rstart, Rend = Dyn.getOwnershipRange()
>     # Fill the matrix with the locally owned block of rows
>     csr = (
>         CSRMatrix.indptr[Rstart:Rend+1] - CSRMatrix.indptr[Rstart],
>         CSRMatrix.indices[CSRMatrix.indptr[Rstart]:CSRMatrix.indptr[Rend]],
>         CSRMatrix.data[CSRMatrix.indptr[Rstart]:CSRMatrix.indptr[Rend]]
>     )
>     D = PETSc.Mat().createAIJ(size=CSRMatrix.shape, csr=csr)
>     D.assemble()
>
>     # Free the memory
>     del CSRMatrix
>
>     # Return the PETSc dynamical matrix
>     return D
>
> On Wed, 24 Oct 2018 at 18:46, Matthew Knepley <knepley at gmail.com> wrote:
>
>> As it says, if you are looking at performance, you should configure using
>> --with-debugging=0.
>>
>> It says SLEPc is using 8GB. Is this what you see?
>>
>>    Matt
>>
>> On Wed, Oct 24, 2018 at 12:30 PM Jan Grießer
>> wrote:
>>
>>> I also ran it with -log_summary :
>>> ---------------------------------------------- PETSc Performance
>>> Summary: ----------------------------------------------
>>>
>>> ##########################################################
>>> #                                                        #
>>> #                       WARNING!!!                       #
>>> #                                                        #
>>> #  This code was compiled with a debugging option,       #
>>> #  To get timing results run ./configure                 #
>>> #  using --with-debugging=no, the performance will       #
>>> #  be generally two or three times faster.
# >>> # # >>> ########################################################## >>> >>> >>> /work/ws/nemo/fr_jg1080-FreeSurface_Glass-0/GlassSystems/PeriodicSystems/N500000T0.001/SolveEigenvalueProblem_par/Test/Eigensolver_petsc_slepc_no_argparse.py >>> on a arch-linux2-c-debug named int02.nemo.privat with 20 processors, by >>> fr_jg1080 Wed Oct 24 18:26:30 2018 >>> Using Petsc Release Version 3.9.4, Sep, 11, 2018 >>> >>> Max Max/Min Avg Total >>> Time (sec): 7.474e+02 1.00000 7.474e+02 >>> Objects: 3.600e+01 1.00000 3.600e+01 >>> Flop: 1.090e+11 1.00346 1.089e+11 2.177e+12 >>> Flop/sec: 1.459e+08 1.00346 1.457e+08 2.913e+09 >>> Memory: 3.950e+08 1.00296 7.891e+09 >>> MPI Messages: 3.808e+04 1.00000 3.808e+04 7.615e+05 >>> MPI Message Lengths: 2.211e+10 1.00217 5.802e+05 4.419e+11 >>> MPI Reductions: 1.023e+05 1.00000 >>> >>> Flop counting convention: 1 flop = 1 real number operation of type >>> (multiply/divide/add/subtract) >>> e.g., VecAXPY() for real vectors of length N >>> --> 2N flop >>> and VecAXPY() for complex vectors of length >>> N --> 8N flop >>> >>> Summary of Stages: ----- Time ------ ----- Flop ----- --- Messages >>> --- -- Message Lengths -- -- Reductions -- >>> Avg %Total Avg %Total counts >>> %Total Avg %Total counts %Total >>> 0: Main Stage: 7.4739e+02 100.0% 2.1773e+12 100.0% 7.615e+05 >>> 100.0% 5.802e+05 100.0% 1.022e+05 100.0% >>> >>> >>> ------------------------------------------------------------------------------------------------------------------------ >>> See the 'Profiling' chapter of the users' manual for details on >>> interpreting output. >>> Phase summary info: >>> Count: number of times phase was executed >>> Time and Flop: Max - maximum over all processors >>> Ratio - ratio of maximum to minimum over all >>> processors >>> Mess: number of messages sent >>> Avg. len: average message length (bytes) >>> Reduct: number of global reductions >>> Global: entire computation >>> Stage: stages of a computation. Set stages with PetscLogStagePush() >>> and PetscLogStagePop(). >>> %T - percent time in this phase %F - percent flop in this >>> phase >>> %M - percent messages in this phase %L - percent message >>> lengths in this phase >>> %R - percent reductions in this phase >>> Total Mflop/s: 10e-6 * (sum of flop over all processors)/(max time >>> over all processors) >>> >>> ------------------------------------------------------------------------------------------------------------------------ >>> >>> >>> ########################################################## >>> # # >>> # WARNING!!! # >>> # # >>> # This code was compiled with a debugging option, # >>> # To get timing results run ./configure # >>> # using --with-debugging=no, the performance will # >>> # be generally two or three times faster. 
# >>> # # >>> ########################################################## >>> >>> >>> Event Count Time (sec) Flop >>> --- Global --- --- Stage --- Total >>> Max Ratio Max Ratio Max Ratio Mess Avg len >>> Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s >>> >>> ------------------------------------------------------------------------------------------------------------------------ >>> >>> --- Event Stage 0: Main Stage >>> >>> BuildTwoSidedF 2 1.0 2.6670e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> VecSet 2 1.0 6.8650e-03 1.8 0.00e+00 0.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> VecScatterBegin 2002 1.0 1.4380e+01 1.0 0.00e+00 0.0 7.6e+05 5.8e+05 >>> 0.0e+00 2 0100100 0 2 0100100 0 0 >>> VecScatterEnd 2002 1.0 3.7604e+01 1.5 0.00e+00 0.0 0.0e+00 0.0e+00 >>> 0.0e+00 4 0 0 0 0 4 0 0 0 0 0 >>> VecSetRandom 1 1.0 1.6440e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> MatMult 2002 1.0 6.0846e+02 1.2 1.03e+11 1.0 7.6e+05 5.8e+05 >>> 0.0e+00 71 94100100 0 71 94100100 0 3376 >>> MatAssemblyBegin 3 1.0 2.8129e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >>> 6.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> MatAssemblyEnd 3 1.0 8.5094e+00 1.0 0.00e+00 0.0 7.6e+02 1.5e+05 >>> 3.6e+01 1 0 0 0 0 1 0 0 0 0 0 >>> EPSSetUp 1 1.0 1.7351e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >>> 7.6e+01 0 0 0 0 0 0 0 0 0 0 0 >>> EPSSolve 1 1.0 6.7891e+02 1.0 1.09e+11 1.0 7.6e+05 5.8e+05 >>> 1.0e+05 91100100100100 91100100100100 3207 >>> STSetUp 1 1.0 2.2221e-04 1.3 0.00e+00 0.0 0.0e+00 0.0e+00 >>> 6.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> STApply 2002 1.0 6.0879e+02 1.2 1.03e+11 1.0 7.6e+05 5.8e+05 >>> 0.0e+00 71 94100100 0 71 94100100 0 3374 >>> BVCopy 999 1.0 2.7157e-01 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> BVMultVec 4004 1.0 2.2918e+00 1.0 2.10e+09 1.0 0.0e+00 0.0e+00 >>> 1.6e+04 0 2 0 0 16 0 2 0 0 16 18332 >>> BVMultInPlace 999 1.0 4.8399e+01 1.0 1.20e+09 1.0 0.0e+00 0.0e+00 >>> 0.0e+00 6 1 0 0 0 6 1 0 0 0 495 >>> BVDotVec 4004 1.0 1.0835e+01 1.0 2.70e+09 1.0 0.0e+00 0.0e+00 >>> 2.0e+04 1 2 0 0 20 1 2 0 0 20 4986 >>> BVOrthogonalizeV 2003 1.0 1.3272e+01 1.0 4.80e+09 1.0 0.0e+00 0.0e+00 >>> 5.2e+04 2 4 0 0 51 2 4 0 0 51 7236 >>> BVScale 2003 1.0 2.3521e-01 1.0 1.50e+08 1.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 12773 >>> BVSetRandom 1 1.0 1.6456e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >>> 4.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> DSSolve 1000 1.0 3.3338e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> DSVectors 1000 1.0 6.0029e-03 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> DSOther 2999 1.0 7.8770e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> >>> ------------------------------------------------------------------------------------------------------------------------ >>> >>> Memory usage is given in bytes: >>> >>> Object Type Creations Destructions Memory Descendants' >>> Mem. >>> Reports information only for process 0. >>> >>> --- Event Stage 0: Main Stage >>> >>> Viewer 2 1 840 0. >>> PetscRandom 1 1 646 0. >>> Index Set 4 4 5510472 0. >>> Vector 9 9 11629608 0. >>> Vec Scatter 2 2 1936 0. >>> Matrix 10 10 331855732 0. >>> Preconditioner 1 1 1000 0. >>> Krylov Solver 1 1 1176 0. >>> EPS Solver 1 1 1600 0. >>> Spectral Transform 2 2 1624 0. >>> Basis Vectors 1 1 2168 0. >>> Direct Solver 1 1 2520 0. >>> Region 1 1 672 0. 
>>> >>> ======================================================================================================================== >>> Average time to get PetscTime(): 1.19209e-07 >>> Average time for MPI_Barrier(): 2.67982e-05 >>> Average time for zero size MPI_Send(): 1.08957e-05 >>> #PETSc Option Table entries: >>> -bv_type mat >>> -eps_view_pre >>> -log_summary >>> #End of PETSc Option Table entries >>> Compiled without FORTRAN kernels >>> Compiled with full precision matrices (default) >>> sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 >>> sizeof(PetscScalar) 8 sizeof(PetscInt) 4 >>> Configure options: --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 >>> --download-mumps --with-shared-libraries=True --download-scalapack >>> ----------------------------------------- >>> Libraries compiled on 2018-10-17 20:02:31 on login2.nemo.privat >>> Machine characteristics: >>> Linux-3.10.0-693.21.1.el7.x86_64-x86_64-with-centos-7.4.1708-Core >>> Using PETSc directory: /home/fr/fr_fr/fr_jg1080/Libaries/petsc-3.9.4 >>> Using PETSc arch: arch-linux2-c-debug >>> ----------------------------------------- >>> >>> Using C compiler: mpicc -fPIC -wd1572 -g >>> Using Fortran compiler: mpif90 -fPIC -g >>> ----------------------------------------- >>> >>> Using include paths: >>> -I/home/fr/fr_fr/fr_jg1080/Libaries/petsc-3.9.4/include >>> -I/home/fr/fr_fr/fr_jg1080/Libaries/petsc-3.9.4/arch-linux2-c-debug/include >>> ----------------------------------------- >>> >>> Using C linker: mpicc >>> Using Fortran linker: mpif90 >>> Using libraries: >>> -Wl,-rpath,/home/fr/fr_fr/fr_jg1080/Libaries/petsc-3.9.4/arch-linux2-c-debug/lib >>> -L/home/fr/fr_fr/fr_jg1080/Libaries/petsc-3.9.4/arch-linux2-c-debug/lib >>> -lpetsc >>> -Wl,-rpath,/home/fr/fr_fr/fr_jg1080/Libaries/petsc-3.9.4/arch-linux2-c-debug/lib >>> -L/home/fr/fr_fr/fr_jg1080/Libaries/petsc-3.9.4/arch-linux2-c-debug/lib >>> -Wl,-rpath,/opt/bwhpc/common/compiler/intel/2018.3.222/compilers_and_libraries/linux/mpi/intel64/lib/debug_mt >>> -L/opt/bwhpc/common/compiler/intel/2018.3.222/compilers_and_libraries/linux/mpi/intel64/lib/debug_mt >>> -Wl,-rpath,/opt/bwhpc/common/compiler/intel/2018.3.222/compilers_and_libraries/linux/mpi/intel64/lib >>> -L/opt/bwhpc/common/compiler/intel/2018.3.222/compilers_and_libraries/linux/mpi/intel64/lib >>> -Wl,-rpath,/opt/bwhpc/common/compiler/intel/2018.3.222/compilers_and_libraries/linux/lib/intel64 >>> -L/opt/bwhpc/common/compiler/intel/2018.3.222/compilers_and_libraries/linux/lib/intel64 >>> -Wl,-rpath,/opt/bwhpc/common/compiler/intel/2018.3.222/compilers_and_libraries_2018.3.222/linux/compiler/lib/intel64_lin >>> -L/opt/bwhpc/common/compiler/intel/2018.3.222/compilers_and_libraries_2018.3.222/linux/compiler/lib/intel64_lin >>> -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/4.8.5 >>> -L/usr/lib/gcc/x86_64-redhat-linux/4.8.5 >>> -Wl,-rpath,/opt/intel/mpi-rt/2017.0.0/intel64/lib/debug_mt >>> -Wl,-rpath,/opt/intel/mpi-rt/2017.0.0/intel64/lib -lcmumps -ldmumps >>> -lsmumps -lzmumps -lmumps_common -lpord -lscalapack -llapack -lblas -lX11 >>> -lstdc++ -ldl -lmpifort -lmpi -lmpigi -lrt -lpthread -lifport >>> -lifcoremt_pic -limf -lsvml -lm -lipgo -lirc -lgcc_s -lirc_s -lstdc++ -ldl >>> ----------------------------------------- >>> >>> >>> >>> ########################################################## >>> # # >>> # WARNING!!! 
# >>> # # >>> # This code was compiled with a debugging option, # >>> # To get timing results run ./configure # >>> # using --with-debugging=no, the performance will # >>> # be generally two or three times faster. # >>> # # >>> ########################################################## >>> >>> >>> Am Mi., 24. Okt. 2018 um 18:07 Uhr schrieb Jan Grie?er < >>> griesser.jan at googlemail.com>: >>> >>>> For some reason i get only this error message, also when is use the >>>> -eps_view_pre. I started the program with nev=1, there the output is (with >>>> -bv_type vecs -bv_type mat -eps_view_pre) >>>> EPS Object: 20 MPI processes >>>> type: krylovschur >>>> 50% of basis vectors kept after restart >>>> using the locking variant >>>> problem type: symmetric eigenvalue problem >>>> selected portion of the spectrum: smallest real parts >>>> number of eigenvalues (nev): 1 >>>> number of column vectors (ncv): 3 >>>> maximum dimension of projected problem (mpd): 2 >>>> maximum number of iterations: 1000 >>>> tolerance: 1e-08 >>>> convergence test: relative to the eigenvalue >>>> BV Object: 20 MPI processes >>>> type: mat >>>> 4 columns of global length 1500000 >>>> vector orthogonalization method: classical Gram-Schmidt >>>> orthogonalization refinement: if needed (eta: 0.7071) >>>> block orthogonalization method: GS >>>> doing matmult as a single matrix-matrix product >>>> DS Object: 20 MPI processes >>>> type: hep >>>> parallel operation mode: REDUNDANT >>>> solving the problem with: Implicit QR method (_steqr) >>>> ST Object: 20 MPI processes >>>> type: shift >>>> shift: 0. >>>> number of matrices: 1 >>>> >>>> >>>> >>>> >>>> Am Mi., 24. Okt. 2018 um 16:14 Uhr schrieb Matthew Knepley < >>>> knepley at gmail.com>: >>>> >>>>> On Wed, Oct 24, 2018 at 10:03 AM Jan Grie?er < >>>>> griesser.jan at googlemail.com> wrote: >>>>> >>>>>> This is the error message i get from my program: >>>>>> Shape of the dynamical matrix: (1500000, 1500000) >>>>>> >>>>> >>>>> Petsc installs a signal handler, so there should be a PETSc-specific >>>>> message before this one. Is something eating >>>>> your output? >>>>> >>>>> Thanks, >>>>> >>>>> Matt >>>>> >>>>> >>>>>> >>>>>> =================================================================================== >>>>>> = BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES >>>>>> = PID 122676 RUNNING AT n3512.nemo.privat >>>>>> = EXIT CODE: 9 >>>>>> = CLEANING UP REMAINING PROCESSES >>>>>> = YOU CAN IGNORE THE BELOW CLEANUP MESSAGES >>>>>> >>>>>> =================================================================================== >>>>>> Intel(R) MPI Library troubleshooting guide: >>>>>> https://software.intel.com/node/561764 >>>>>> >>>>>> =================================================================================== >>>>>> >>>>>> >>>>>> Am Mi., 24. Okt. 2018 um 16:01 Uhr schrieb Matthew Knepley < >>>>>> knepley at gmail.com>: >>>>>> >>>>>>> On Wed, Oct 24, 2018 at 9:38 AM Jan Grie?er < >>>>>>> griesser.jan at googlemail.com> wrote: >>>>>>> >>>>>>>> Hey, >>>>>>>> i tried to run my program as you said with -bv_type vecs and/or >>>>>>>> -bv_type mat, but instead of the PETScInt overflow i now get an MPI Error 9 >>>>>>>> >>>>>>> >>>>>>> Send the actual error. >>>>>>> >>>>>>> Thanks, >>>>>>> >>>>>>> Matt >>>>>>> >>>>>>> >>>>>>>> message, which i assume (after googling a little bit around) should >>>>>>>> be a memory problem. I tried to run it also on slightly bigger compute >>>>>>>> nodes on our cluster with 20 cores with each 12 GB and 24 GB but the >>>>>>>> problem still remains. 
The actual limit appears to be a 1.5 Million x 1.5 >>>>>>>> Million where i searched for NEV = 1500 and MPD = 500/ 200 eigenvalues. >>>>>>>> Do you have maybe an idea what the error could be? I appended also >>>>>>>> the python method i used to solve the problem. I also tried to solve the >>>>>>>> problem with spectrum solving but the error message remains the same. >>>>>>>> >>>>>>>> def solve_eigensystem(DynMatrix_nn, NEV, MPD, Dimension): >>>>>>>> # Create the solver >>>>>>>> # E is used as an acronym for the EPS solver (EPS = Eigenvalue >>>>>>>> problem solver) >>>>>>>> E = SLEPc.EPS().create() >>>>>>>> >>>>>>>> # Set the preconditioner >>>>>>>> pc = PETSc.PC().create() >>>>>>>> pc.setType(pc.Type.BJACOBI) >>>>>>>> >>>>>>>> # Set the linear solver >>>>>>>> # Create the KSP object >>>>>>>> ksp = PETSc.KSP().create() >>>>>>>> # Create the solver, in this case GMRES >>>>>>>> ksp.setType(ksp.Type.GMRES) >>>>>>>> # Set the tolerances of the GMRES solver >>>>>>>> # Link it to the PC >>>>>>>> ksp.setPC(pc) >>>>>>>> >>>>>>>> # Set up the spectral transformations >>>>>>>> st = SLEPc.ST().create() >>>>>>>> st.setType("shift") >>>>>>>> st.setKSP(ksp) >>>>>>>> # MPD stands for "maximum projected dimension". It has to due with >>>>>>>> computational cost, please read Chap. 2.6.5 of SLEPc docu for >>>>>>>> # an explanation. At the moment mpd is only a guess >>>>>>>> E.setDimensions(nev=NEV, mpd = MPD) >>>>>>>> # Eigenvalues should be real, therefore we start to order them from >>>>>>>> the smallest real value |l.real| >>>>>>>> E.setWhichEigenpairs(E.Which.SMALLEST_REAL) >>>>>>>> # Since the dynamical matrix is symmetric and real it is hermitian >>>>>>>> E.setProblemType(SLEPc.EPS.ProblemType.HEP) >>>>>>>> # Use the Krylov Schur method to solve the eigenvalue problem >>>>>>>> E.setType(E.Type.KRYLOVSCHUR) >>>>>>>> # Set the convergence criterion to relative to the eigenvalue and >>>>>>>> the maximal number of iterations >>>>>>>> E.setConvergenceTest(E.Conv.REL) >>>>>>>> E.setTolerances(tol = 1e-8, max_it = 5000) >>>>>>>> # Set the matrix in order to solve >>>>>>>> E.setOperators(DynMatrix_nn, None) >>>>>>>> # Sets EPS options from the options database. This routine must be >>>>>>>> called before `setUp()` if the user is to be allowed to set dthe solver >>>>>>>> type. >>>>>>>> E.setFromOptions() >>>>>>>> # Sets up all the internal data structures necessary for the >>>>>>>> execution of the eigensolver. 
>>>>>>>> E.setUp() >>>>>>>> >>>>>>>> # Solve eigenvalue problem >>>>>>>> E.solve() >>>>>>>> >>>>>>>> Print = PETSc.Sys.Print >>>>>>>> >>>>>>>> Print() >>>>>>>> Print("****************************") >>>>>>>> Print("***SLEPc Solution Results***") >>>>>>>> Print("****************************") >>>>>>>> >>>>>>>> its = E.getIterationNumber() >>>>>>>> Print("Number of iterations of the method: ", its) >>>>>>>> eps_type = E.getType() >>>>>>>> Print("Solution method: ", eps_type) >>>>>>>> nev, ncv, mpd = E.getDimensions() >>>>>>>> Print("Number of requested eigenvalues: ", nev) >>>>>>>> Print("Number of computeded eigenvectors: ", ncv) >>>>>>>> tol, maxit = E.getTolerances() >>>>>>>> Print("Stopping condition: (tol, maxit)", (tol, maxit)) >>>>>>>> # Get the type of convergence >>>>>>>> conv_test = E.getConvergenceTest() >>>>>>>> Print("Selected convergence test: ", conv_test) >>>>>>>> # Get the used spectral transformation >>>>>>>> get_st = E.getST() >>>>>>>> Print("Selected spectral transformation: ", get_st) >>>>>>>> # Get the applied direct solver >>>>>>>> get_ksp = E.getDS() >>>>>>>> Print("Selected direct solver: ", get_ksp) >>>>>>>> nconv = E.getConverged() >>>>>>>> Print("Number of converged eigenpairs: ", nconv) >>>>>>>> ..... >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> Am Fr., 19. Okt. 2018 um 21:00 Uhr schrieb Smith, Barry F. < >>>>>>>> bsmith at mcs.anl.gov>: >>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> > On Oct 19, 2018, at 7:56 AM, Zhang, Junchao >>>>>>>>> wrote: >>>>>>>>> > >>>>>>>>> > >>>>>>>>> > On Fri, Oct 19, 2018 at 4:02 AM Jan Grie?er < >>>>>>>>> griesser.jan at googlemail.com> wrote: >>>>>>>>> > With more than 1 MPI process you mean i should use spectrum >>>>>>>>> slicing in divide the full problem in smaller subproblems? >>>>>>>>> > The --with-64-bit-indices is not a possibility for me since i >>>>>>>>> configured petsc with mumps, which does not allow to use the 64-bit version >>>>>>>>> (At least this was the error message when i tried to configure PETSc ) >>>>>>>>> > >>>>>>>>> > MUMPS 5.1.2 manual chapter 2.4.2 says it supports "Selective >>>>>>>>> 64-bit integer feature" and "full 64-bit integer version" as well. >>>>>>>>> >>>>>>>>> They use to achieve this by compiling with special Fortran >>>>>>>>> flags to promote integers to 64 bit; this is too fragile for our taste so >>>>>>>>> we never hooked PETSc up wit it. If they have a dependable way of using 64 >>>>>>>>> bit integers we should add that to our mumps.py and test it. >>>>>>>>> >>>>>>>>> Barry >>>>>>>>> >>>>>>>>> > >>>>>>>>> > Am Mi., 17. Okt. 2018 um 18:24 Uhr schrieb Jose E. Roman < >>>>>>>>> jroman at dsic.upv.es>: >>>>>>>>> > To use BVVECS just add the command-line option -bv_type vecs >>>>>>>>> > This causes to use a separate Vec for each column, instead of a >>>>>>>>> single long Vec of size n*m. But it is considerably slower than the default. >>>>>>>>> > >>>>>>>>> > Anyway, for such large problems you should consider using more >>>>>>>>> than 1 MPI process. In that case the error may disappear because the local >>>>>>>>> size is smaller than 768000. >>>>>>>>> > >>>>>>>>> > Jose >>>>>>>>> > >>>>>>>>> > >>>>>>>>> > > El 17 oct 2018, a las 17:58, Matthew Knepley < >>>>>>>>> knepley at gmail.com> escribi?: >>>>>>>>> > > >>>>>>>>> > > On Wed, Oct 17, 2018 at 11:54 AM Jan Grie?er < >>>>>>>>> griesser.jan at googlemail.com> wrote: >>>>>>>>> > > Hi all, >>>>>>>>> > > i am using slepc4py and petsc4py to solve for the smallest >>>>>>>>> real eigenvalues and eigenvectors. 
For my test cases with a matrix A of the >>>>>>>>> size 30k x 30k solving for the smallest solutions works quite well, but when >>>>>>>>> I increase the dimension of my system to around A = 768000 x 768000 or 3 >>>>>>>>> million x 3 million and ask for the smallest real 3000 (the number is >>>>>>>>> increasing with increasing system size) eigenvalues and eigenvectors I get >>>>>>>>> the output (for the 768000): >>>>>>>>> > > The product 4001 times 768000 overflows the size of PetscInt; >>>>>>>>> consider reducing the number of columns, or use BVVECS instead >>>>>>>>> > > I understand that the requested number of eigenvectors and >>>>>>>>> eigenvalues is causing an overflow, but I do not understand the solution of >>>>>>>>> the problem which is stated in the error message. Can someone tell me what >>>>>>>>> exactly BVVECS is and how I can use it? Or is there any other solution to >>>>>>>>> my problem? >>>>>>>>> > > >>>>>>>>> > > You can also reconfigure with 64-bit integers: >>>>>>>>> --with-64-bit-indices >>>>>>>>> > > >>>>>>>>> > > Thanks, >>>>>>>>> > > >>>>>>>>> > > Matt >>>>>>>>> > > >>>>>>>>> > > Thank you very much in advance, >>>>>>>>> > > Jan >>>>>>>>> > > -- >>>>>>>>> > > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. >>>>>>>>> > > -- Norbert Wiener >>>>>>>>> > > https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From jroman at dsic.upv.es  Tue Nov 13 07:52:40 2018
From: jroman at dsic.upv.es (Jose E. Roman)
Date: Tue, 13 Nov 2018 14:52:40 +0100
Subject: [petsc-users] [SLEPc] Krylov-Schur convergence
In-Reply-To: 
References: <470A7599-766D-4477-A298-473A25B6B71D@dsic.upv.es>
Message-ID: 

As Matt points out, the probable cause is that your matrix is not exactly
Hermitian. In our implementation of Lanczos we assume symmetry, so if
||A-A^*|| is non-negligible then some information is being discarded, which
may confuse Lanczos, giving some eigenpair as converged when it is not.

Why doesn't this happen with the basic Lanczos solver? I don't know,
probably due to the restart scheme. In symmetric Krylov-Schur, restart is
done with eigenvectors of the projected problem (thick-restart Lanczos),
while non-symmetric Krylov-Schur uses Schur vectors. In the basic Lanczos,
restart is done building a new initial vector explicitly, which is harmless
in this case.

Suggestion: either use non-Hermitian Krylov-Schur or symmetrize your matrix
as (A+A^*)/2 (or, better, when constructing it; note that transposing a
parallel matrix with 1024 processes requires a lot of communication).

Jose
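A minimal petsc4py sketch of both suggestions (illustrative only: the
helper names and the small 4x4 test matrix are made up for this example,
and real arithmetic is assumed):

from petsc4py import PETSc

def symmetry_residual(A, ntrials=5):
    # Probe ||(A - A^T) x|| / ||A x|| for a few random x; for a truly
    # symmetric matrix this should be near machine precision.
    worst = 0.0
    x = A.createVecRight()
    y = A.createVecLeft()
    z = A.createVecLeft()
    for _ in range(ntrials):
        x.setRandom()
        A.mult(x, y)            # y = A x
        A.multTranspose(x, z)   # z = A^T x
        z.aypx(-1.0, y)         # z = y - z = (A - A^T) x
        worst = max(worst, z.norm() / y.norm())
    return worst

def add_symmetric(A, i, j, v):
    # Symmetrize at construction time: half of the value goes to (i, j)
    # and half to (j, i), so the assembled matrix is symmetric without
    # transposing the parallel matrix afterwards.
    A.setValue(i, j, 0.5 * v, addv=PETSc.InsertMode.ADD_VALUES)
    A.setValue(j, i, 0.5 * v, addv=PETSc.InsertMode.ADD_VALUES)

if __name__ == "__main__":
    A = PETSc.Mat().createAIJ([4, 4], nnz=4, comm=PETSc.COMM_WORLD)
    if PETSc.COMM_WORLD.getRank() == 0:  # off-process values are shipped at assembly
        add_symmetric(A, 0, 1, 2.0)
        add_symmetric(A, 2, 3, -1.0)
        for k in range(4):
            A.setValue(k, k, 1.0, addv=PETSc.InsertMode.ADD_VALUES)
    A.assemble()
    PETSc.Sys.Print("symmetry residual:", symmetry_residual(A))

The random-vector probe can only certify asymmetry, not symmetry: a
residual far above machine precision is conclusive, while a small one may
just mean the asymmetric part was not excited by the sampled vectors.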
> On 13 Nov 2018, at 13:56, Matthew Knepley wrote:
>
> On Tue, Nov 13, 2018 at 7:48 AM Ale Foggia via petsc-users wrote:
> Thanks Jose for the answer. I tried it and it worked! I send the output of -eps_view again, just in case:
>
> EPS Object: 1024 MPI processes
>   type: krylovschur
>     50% of basis vectors kept after restart
>     using the locking variant
>   problem type: non-symmetric eigenvalue problem
>   selected portion of the spectrum: smallest real parts
>   number of eigenvalues (nev): 1
>   number of column vectors (ncv): 16
>   maximum dimension of projected problem (mpd): 16
>   maximum number of iterations: 291700777
>   tolerance: 1e-09
>   convergence test: relative to the eigenvalue
> BV Object: 1024 MPI processes
>   type: svec
>   17 columns of global length 2333606220
>   vector orthogonalization method: classical Gram-Schmidt
>   orthogonalization refinement: if needed (eta: 0.7071)
>   block orthogonalization method: GS
>   doing matmult as a single matrix-matrix product
> DS Object: 1024 MPI processes
>   type: nhep
>   parallel operation mode: REDUNDANT
> ST Object: 1024 MPI processes
>   type: shift
>   shift: 0.
>   number of matrices: 1
>
>            k             ||Ax-kx||/||kx||
>    -----------------  ------------------
>       -15.048025         1.85112e-10
>       -15.047159         3.13104e-10
>
> Iterations performed 18
>
> Why does treating it as non-Hermitian help the convergence? Why doesn't this happen with Lanczos? I'm lost :/
>
> Are you sure your matrix is Hermitian? Lanczos will work for non-Hermitian things (I believe).
>
>   Thanks,
>
>     Matt
>
> Ale
>
> On Tue, 13 Nov 2018 at 12:34, Jose E. Roman () wrote:
> This is really strange. We cannot say what is going on, everything seems fine.
> Could you try solving the problem as non-Hermitian to see what happens? Just run with -eps_non_hermitian. Depending on the result, we can suggest other things to try.
> Jose
>
> > On 13 Nov 2018, at 10:58, Ale Foggia via petsc-users wrote:
> >
> > Hello,
> >
> > I'm using SLEPc to get the smallest real eigenvalue (EPS_SMALLEST_REAL) of a Hermitian problem (EPS_HEP). The linear size of the matrices I'm solving is around 10**9 elements and they are sparse. I've asked a few questions before regarding the same problem setting and you suggested I use Krylov-Schur (because I was using Lanczos). I tried KS and up to a certain matrix size the convergence (relative to the eigenvalue) is good, it's around 10**-9, like with Lanczos, but when I increase the size I start getting the eigenvalue with only 3 correct digits. I've used the options: -eps_tol 1e-9 -eps_mpd 100 (16 was the default), but the only thing I got is one more eigenvalue with the same big error, and the iterations performed were only 2. Why didn't it do more in order to reach the convergence? Should I set other parameters? I don't know how to work out this problem, can you help me with this please?
> > I send the -eps_view output and the eigenvalues with their errors:
> >
> > EPS Object: 2048 MPI processes
> >   type: krylovschur
> >     50% of basis vectors kept after restart
> >     using the locking variant
> >   problem type: symmetric eigenvalue problem
> >   selected portion of the spectrum: smallest real parts
> >   number of eigenvalues (nev): 1
> >   number of column vectors (ncv): 101
> >   maximum dimension of projected problem (mpd): 100
> >   maximum number of iterations: 46210024
> >   tolerance: 1e-09
> >   convergence test: relative to the eigenvalue
> > BV Object: 2048 MPI processes
> >   type: svec
> >   102 columns of global length 2333606220
> >   vector orthogonalization method: classical Gram-Schmidt
> >   orthogonalization refinement: if needed (eta: 0.7071)
> >   block orthogonalization method: GS
> >   doing matmult as a single matrix-matrix product
> > DS Object: 2048 MPI processes
> >   type: hep
> >   parallel operation mode: REDUNDANT
> >   solving the problem with: Implicit QR method (_steqr)
> > ST Object: 2048 MPI processes
> >   type: shift
> >   shift: 0.
> >   number of matrices: 1
> >
> >            k             ||Ax-kx||/||kx||
> >    -----------------  ------------------
> >       -15.093051         0.00323917  (with KS)
> >       -15.087320         0.00265215  (with KS)
> >       -15.048025         8.67204e-09 (with Lanczos)
> > Iterations performed 2
> >
> > Ale

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From timothee.nicolas at gmail.com  Tue Nov 13 08:13:31 2018
From: timothee.nicolas at gmail.com (=?UTF-8?Q?Timoth=C3=A9e_Nicolas?=)
Date: Tue, 13 Nov 2018 15:13:31 +0100
Subject: [petsc-users] Sequential VecSetValues and VecAssembly
Message-ID: 

Dear all,

I realized our code has some calls to VecSetValues not followed by calls
to VecAssemblyBegin/VecAssemblyEnd, and no errors are thrown, although the
manual says they must be called. The corresponding vectors are sequential
vectors created with VecCreateSeq. Is this normal behaviour? Do we still
need to call VecAssemblyBegin/End in this case? And btw what does assembly
exactly correspond to?

Best regards

Timothee
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From stefano.zampini at gmail.com  Tue Nov 13 08:45:54 2018
From: stefano.zampini at gmail.com (Stefano Zampini)
Date: Tue, 13 Nov 2018 17:45:54 +0300
Subject: [petsc-users] Sequential VecSetValues and VecAssembly
In-Reply-To: 
References: 
Message-ID: 

With sequential vectors (i.e. the VECSEQ type), you don't need to call
VecAssembly.
The VecAssembly calls are designed to communicate any off-process value
set to the owner process (e.g. with VECMPI) or to send the values to the
GPU (VECCUDA).
With VECSEQ, VecSetValues just writes the values you have specified into
the raw memory of the vector.
Anyway, it is good practice to add those calls, if in the future you would
like to try a different class (parallel or one that uses the GPU) for
which the VecAssembly call is needed.

On Tue, 13 Nov 2018 at 17:15, Timothée Nicolas via petsc-users <
petsc-users at mcs.anl.gov> wrote:

> Dear all,
>
> I realized our code has some calls to VecSetValues not followed by calls
> to VecAssemblyBegin/VecAssemblyEnd, and no errors are thrown, although the
> manual says they must be called. The corresponding vectors are sequential
> vectors created with VecCreateSeq. Is this normal behaviour? Do we still
> need to call VecAssemblyBegin/End in this case? And btw what does assembly
> exactly correspond to?
>
> Best regards
>
> Timothee

-- 
Stefano
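A small made-up illustration of the behaviour described above (not taken
from the code under discussion; it runs unchanged on any number of
processes):

from petsc4py import PETSc

comm = PETSc.COMM_WORLD
rank, size = comm.getRank(), comm.getSize()

# A parallel vector with 10 entries owned by each process.
v = PETSc.Vec().createMPI(10 * size, comm=comm)
start, end = v.getOwnershipRange()

# Set one entry this rank does NOT own (wrapping to row 0 on the last
# rank). For VECMPI the value is only cached locally at this point.
v.setValue(end % (10 * size), float(rank))

# The assembly pair ships the cached off-process values to their owners.
# For a VECSEQ vector the same calls are essentially no-ops, which is why
# omitting them goes unnoticed in purely sequential code.
v.assemblyBegin()
v.assemblyEnd()

With a single process the "off-process" entry happens to be owned locally,
and the example degenerates to the sequential case, where the assembly
calls have nothing to communicate.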
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From timothee.nicolas at gmail.com  Tue Nov 13 08:48:51 2018
From: timothee.nicolas at gmail.com (=?UTF-8?Q?Timoth=C3=A9e_Nicolas?=)
Date: Tue, 13 Nov 2018 15:48:51 +0100
Subject: [petsc-users] Sequential VecSetValues and VecAssembly
In-Reply-To: 
References: 
Message-ID: 

Thank you for the clarification. I did use parallel vectors before and was
using those calls indeed.

Cheers

Timothée

On Tue, 13 Nov 2018 at 15:46, Stefano Zampini wrote:

> With sequential vectors (i.e. the VECSEQ type), you don't need to call
> VecAssembly.
> The VecAssembly calls are designed to communicate any off-process value
> set to the owner process (e.g. with VECMPI) or to send the values to the
> GPU (VECCUDA).
> With VECSEQ, VecSetValues just writes the values you have specified into
> the raw memory of the vector.
> Anyway, it is good practice to add those calls, if in the future you
> would like to try a different class (parallel or one that uses the GPU)
> for which the VecAssembly call is needed.
>
> On Tue, 13 Nov 2018 at 17:15, Timothée Nicolas via petsc-users <
> petsc-users at mcs.anl.gov> wrote:
>
>> Dear all,
>>
>> I realized our code has some calls to VecSetValues not followed by calls
>> to VecAssemblyBegin/VecAssemblyEnd, and no errors are thrown, although the
>> manual says they must be called. The corresponding vectors are sequential
>> vectors created with VecCreateSeq. Is this normal behaviour? Do we still
>> need to call VecAssemblyBegin/End in this case? And btw what does assembly
>> exactly correspond to?
>>
>> Best regards
>>
>> Timothee
>>
>
> -- 
> Stefano
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From amfoggia at gmail.com  Tue Nov 13 10:14:02 2018
From: amfoggia at gmail.com (Ale Foggia)
Date: Tue, 13 Nov 2018 17:14:02 +0100
Subject: [petsc-users] [SLEPc] Krylov-Schur convergence
In-Reply-To: 
References: <470A7599-766D-4477-A298-473A25B6B71D@dsic.upv.es>
Message-ID: 

Thanks for the answers. I've checked with the PETSc function MatIsSymmetric
(mpiaij does not support MatIsHermitian) for the size that was giving the
wrong eigenvalue, and indeed it is not symmetric, for a tolerance of 1e-14.
I also wanted to check for smaller sizes, for which the eigenvalue was
correct, but I get a "Bus error, possibly illegal memory access" when the
program arrives at that function call. Any idea why it works for one case
and not for the other? I'm using PETSc 3.9.3 and SLEPc 3.9.2.

The matrix has to be symmetric at all times, by construction and because
the physics problem I'm dealing with implies a symmetric matrix. What is
strange to me is that I do unit testing of the construction for smaller
sizes and I obtain real symmetric matrices, as expected. So, maybe when
growing in size it gets asymmetric for some reason? Are there any more
tests I can do to check this?

On Tue, 13 Nov 2018 at 14:52, Jose E. Roman () wrote:

> As Matt points out, the probable cause is that your matrix is not exactly
> Hermitian. In our implementation of Lanczos we assume symmetry, so if
> ||A-A^*|| is non-negligible then some information is being discarded, which
> may confuse Lanczos giving some eigenpair as converged when it is not.
>
> Why doesn't this happen with the basic Lanczos solver? I don't know,
> probably due to the restart scheme. In symmetric Krylov-Schur, restart is
> done with eigenvectors of the projected problem (thick-restart Lanczos),
> while non-symmetric Krylov-Schur uses Schur vectors.
In symmetric Krylov-Schur, restart is > done with eigenvectors of the projected problem (thick-restart Lanczos), > while non-symmetric Krylov-Schur uses Schur vectors. In the basic Lanczos, > restart is done building a new initial vector explicitly, which is harmless > in this case. > > Suggestion: either use non-Hermitian Krylov-Schur or symmetrize your > matrix as (A+A^*)/2 (or better when contructing it, note that transposing a > parallel matrix with 1024 processes requires a lot of communication). > > Jose > > > > El 13 nov 2018, a las 13:56, Matthew Knepley > escribi?: > > > > On Tue, Nov 13, 2018 at 7:48 AM Ale Foggia via petsc-users < > petsc-users at mcs.anl.gov> wrote: > > Thanks Jose for the answer. I tried it an it worked! I send the output > of -eps_view again, just in case: > > > > EPS Object: 1024 MPI processes > > type: krylovschur > > 50% of basis vectors kept after restart > > using the locking variant > > problem type: non-symmetric eigenvalue problem > > selected portion of the spectrum: smallest real parts > > number of eigenvalues (nev): 1 > > number of column vectors (ncv): 16 > > maximum dimension of projected problem (mpd): 16 > > maximum number of iterations: 291700777 > > tolerance: 1e-09 > > convergence test: relative to the eigenvalue > > BV Object: 1024 MPI processes > > type: svec > > 17 columns of global length 2333606220 > > vector orthogonalization method: classical Gram-Schmidt > > orthogonalization refinement: if needed (eta: 0.7071) > > block orthogonalization method: GS > > doing matmult as a single matrix-matrix product > > DS Object: 1024 MPI processes > > type: nhep > > parallel operation mode: REDUNDANT > > ST Object: 1024 MPI processes > > type: shift > > shift: 0. > > number of matrices: 1 > > > > k ||Ax-kx||/||kx|| > > ----------------- ------------------ > > -15.048025 1.85112e-10 > > -15.047159 3.13104e-10 > > > > Iterations performed 18 > > > > Why does treating it as non-Hermitian help the convergence? Why doesn't > this happen with Lanczos? I'm lost :/ > > > > Are you sure your matrix is Hermitian? Lanczos will work for > non-Hermitian things (I believe). > > > > Thanks, > > > > Matt > > > > > > Ale > > > > El mar., 13 nov. 2018 a las 12:34, Jose E. Roman () > escribi?: > > This is really strange. We cannot say what is going on, everything seems > fine. > > Could you try solving the problem as non-Hermitian to see what happens? > Just run with -eps_non_hermitian. Depending on the result, we can suggest > other things to try. > > Jose > > > > > > > El 13 nov 2018, a las 10:58, Ale Foggia via petsc-users < > petsc-users at mcs.anl.gov> escribi?: > > > > > > Hello, > > > > > > I'm using SLEPc to get the smallest real eigenvalue > (EPS_SMALLEST_REAL) of a Hermitian problem (EPS_HEP). The linear size of > the matrices I'm solving is around 10**9 elements and they are sparse. I've > asked a few questions before regarding the same problem setting and you > suggested me to use Krylov-Schur (because I was using Lanczos). I tried KS > and up to a certain matrix size the convergence (relative to the > eigenvalue) is good, it's around 10**-9, like with Lanczos, but when I > increase the size I start getting the eigenvalue with only 3 correct > digits. I've used the options: -eps_tol 1e-9 -eps_mpd 100 (16 was the > default), but the only thing I got is one more eigenvalue with the same big > error, and the iterations performed were only 2. Why didn't it do more in > order to reach the convergence? Should I set other parameters? 
I don't know > how to work out this problem, can you help me with this please? I send the > -eps_view output and the eigenvalues with its errors: > > > > > > EPS Object: 2048 MPI processes > > > type: krylovschur > > > 50% of basis vectors kept after restart > > > using the locking variant > > > problem type: symmetric eigenvalue problem > > > selected portion of the spectrum: smallest real parts > > > number of eigenvalues (nev): 1 > > > number of column vectors (ncv): 101 > > > maximum dimension of projected problem (mpd): 100 > > > maximum number of iterations: 46210024 > > > tolerance: 1e-09 > > > convergence test: relative to the eigenvalue > > > BV Object: 2048 MPI processes > > > type: svec > > > 102 columns of global length 2333606220 > > > vector orthogonalization method: classical Gram-Schmidt > > > orthogonalization refinement: if needed (eta: 0.7071) > > > block orthogonalization method: GS > > > doing matmult as a single matrix-matrix product > > > DS Object: 2048 MPI processes > > > type: hep > > > parallel operation mode: REDUNDANT > > > solving the problem with: Implicit QR method (_steqr) > > > ST Object: 2048 MPI processes > > > type: shift > > > shift: 0. > > > number of matrices: 1 > > > > > > k ||Ax-kx||/||kx|| > > > ----------------- ------------------ > > > -15.093051 0.00323917 (with KS) > > > -15.087320 0.00265215 (with KS) > > > -15.048025 8.67204e-09 (with Lanczos) > > > Iterations performed 2 > > > > > > Ale > > > > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Nov 13 12:00:25 2018 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 13 Nov 2018 13:00:25 -0500 Subject: [petsc-users] [SLEPc] Krylov-Schur convergence In-Reply-To: References: <470A7599-766D-4477-A298-473A25B6B71D@dsic.upv.es> Message-ID: On Tue, Nov 13, 2018 at 11:14 AM Ale Foggia wrote: > Thanks for the answers. I've checked with PETSc function MatIsSymmetric > (mpiaij does not support MatIsHermitian) for the size that was giving the > wrong eigenvalue and indeed it is not symmetric, for a tolerance of 1e-14. > I also wanted to check for smaller sizes, for which the eigenvalue was > correct, but I get a "Bus error, possibly illegal memory access" when the > program arrives to that function call. Any idea why it works for one case > and not for the other? I'm using PETSc 3.9.3 and SLEPc 3.9.2. > Looking at the failure in the debugger would really help us, for example with a stack trace. Thanks, Matt > The matrix has to be symmetric at all times, by construction and because > the physics problem I'm dealing with implies a symmetric matrix. What it's > strange for me is that I do unit testing for the construction for smaller > sizes and I obtain real symmetric matrices, as expected. So, maybe when > growing in size it gets asymmetric for some reason? Are there any more test > I can do to check this? > > El mar., 13 nov. 2018 a las 14:52, Jose E. Roman () > escribi?: > >> As Matt points out, the probable cause is that your matrix is not exactly >> Hermitian. In our implementation of Lanczos we assume symmetry, so if >> ||A-A^*|| is non-negligible then some information is being discarded, which >> may confuse Lanczos giving some eigenpair as converged when it is not. 
>> >> Why doesn't this happen with the basic Lanczos solver? I don't know, >> probably due to the restart scheme. In symmetric Krylov-Schur, restart is >> done with eigenvectors of the projected problem (thick-restart Lanczos), >> while non-symmetric Krylov-Schur uses Schur vectors. In the basic Lanczos, >> restart is done building a new initial vector explicitly, which is harmless >> in this case. >> >> Suggestion: either use non-Hermitian Krylov-Schur or symmetrize your >> matrix as (A+A^*)/2 (or better when contructing it, note that transposing a >> parallel matrix with 1024 processes requires a lot of communication). >> >> Jose >> >> >> > El 13 nov 2018, a las 13:56, Matthew Knepley >> escribi?: >> > >> > On Tue, Nov 13, 2018 at 7:48 AM Ale Foggia via petsc-users < >> petsc-users at mcs.anl.gov> wrote: >> > Thanks Jose for the answer. I tried it an it worked! I send the output >> of -eps_view again, just in case: >> > >> > EPS Object: 1024 MPI processes >> > type: krylovschur >> > 50% of basis vectors kept after restart >> > using the locking variant >> > problem type: non-symmetric eigenvalue problem >> > selected portion of the spectrum: smallest real parts >> > number of eigenvalues (nev): 1 >> > number of column vectors (ncv): 16 >> > maximum dimension of projected problem (mpd): 16 >> > maximum number of iterations: 291700777 >> > tolerance: 1e-09 >> > convergence test: relative to the eigenvalue >> > BV Object: 1024 MPI processes >> > type: svec >> > 17 columns of global length 2333606220 >> > vector orthogonalization method: classical Gram-Schmidt >> > orthogonalization refinement: if needed (eta: 0.7071) >> > block orthogonalization method: GS >> > doing matmult as a single matrix-matrix product >> > DS Object: 1024 MPI processes >> > type: nhep >> > parallel operation mode: REDUNDANT >> > ST Object: 1024 MPI processes >> > type: shift >> > shift: 0. >> > number of matrices: 1 >> > >> > k ||Ax-kx||/||kx|| >> > ----------------- ------------------ >> > -15.048025 1.85112e-10 >> > -15.047159 3.13104e-10 >> > >> > Iterations performed 18 >> > >> > Why does treating it as non-Hermitian help the convergence? Why doesn't >> this happen with Lanczos? I'm lost :/ >> > >> > Are you sure your matrix is Hermitian? Lanczos will work for >> non-Hermitian things (I believe). >> > >> > Thanks, >> > >> > Matt >> > >> > >> > Ale >> > >> > El mar., 13 nov. 2018 a las 12:34, Jose E. Roman () >> escribi?: >> > This is really strange. We cannot say what is going on, everything >> seems fine. >> > Could you try solving the problem as non-Hermitian to see what happens? >> Just run with -eps_non_hermitian. Depending on the result, we can suggest >> other things to try. >> > Jose >> > >> > >> > > El 13 nov 2018, a las 10:58, Ale Foggia via petsc-users < >> petsc-users at mcs.anl.gov> escribi?: >> > > >> > > Hello, >> > > >> > > I'm using SLEPc to get the smallest real eigenvalue >> (EPS_SMALLEST_REAL) of a Hermitian problem (EPS_HEP). The linear size of >> the matrices I'm solving is around 10**9 elements and they are sparse. I've >> asked a few questions before regarding the same problem setting and you >> suggested me to use Krylov-Schur (because I was using Lanczos). I tried KS >> and up to a certain matrix size the convergence (relative to the >> eigenvalue) is good, it's around 10**-9, like with Lanczos, but when I >> increase the size I start getting the eigenvalue with only 3 correct >> digits. 
I've used the options: -eps_tol 1e-9 -eps_mpd 100 (16 was the >> default), but the only thing I got is one more eigenvalue with the same big >> error, and the iterations performed were only 2. Why didn't it do more in >> order to reach the convergence? Should I set other parameters? I don't know >> how to work out this problem, can you help me with this please? I send the >> -eps_view output and the eigenvalues with its errors: >> > > >> > > EPS Object: 2048 MPI processes >> > > type: krylovschur >> > > 50% of basis vectors kept after restart >> > > using the locking variant >> > > problem type: symmetric eigenvalue problem >> > > selected portion of the spectrum: smallest real parts >> > > number of eigenvalues (nev): 1 >> > > number of column vectors (ncv): 101 >> > > maximum dimension of projected problem (mpd): 100 >> > > maximum number of iterations: 46210024 >> > > tolerance: 1e-09 >> > > convergence test: relative to the eigenvalue >> > > BV Object: 2048 MPI processes >> > > type: svec >> > > 102 columns of global length 2333606220 >> > > vector orthogonalization method: classical Gram-Schmidt >> > > orthogonalization refinement: if needed (eta: 0.7071) >> > > block orthogonalization method: GS >> > > doing matmult as a single matrix-matrix product >> > > DS Object: 2048 MPI processes >> > > type: hep >> > > parallel operation mode: REDUNDANT >> > > solving the problem with: Implicit QR method (_steqr) >> > > ST Object: 2048 MPI processes >> > > type: shift >> > > shift: 0. >> > > number of matrices: 1 >> > > >> > > k ||Ax-kx||/||kx|| >> > > ----------------- ------------------ >> > > -15.093051 0.00323917 (with KS) >> > > -15.087320 0.00265215 (with KS) >> > > -15.048025 8.67204e-09 (with Lanczos) >> > > Iterations performed 2 >> > > >> > > Ale >> > >> > >> > >> > -- >> > What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> > -- Norbert Wiener >> > >> > https://www.cse.buffalo.edu/~knepley/ >> >> -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jroman at dsic.upv.es Tue Nov 13 12:17:24 2018 From: jroman at dsic.upv.es (Jose E. Roman) Date: Tue, 13 Nov 2018 19:17:24 +0100 Subject: [petsc-users] [SLEPc] Krylov-Schur convergence In-Reply-To: References: <470A7599-766D-4477-A298-473A25B6B71D@dsic.upv.es> Message-ID: <44298504-2FB3-4CC1-94D5-9021915E295E@dsic.upv.es> One thing you can do is use a symmetric matrix format: -mat_type sbaij In this way, your matrix will always be symmetric because only the upper triangular part is stored. The drawback is that efficiency will likely decrease. To check symmetry, one (possibly bad) way is to take a bunch of random vectors X and check that X'*A*X is symmetric. This can be easily done with SLEPc's BVMatProject, see test9.c under $SLEPC_DIR/src/sys/classes/bv/examples/tests Jose > El 13 nov 2018, a las 19:00, Matthew Knepley escribi?: > > On Tue, Nov 13, 2018 at 11:14 AM Ale Foggia wrote: > Thanks for the answers. I've checked with PETSc function MatIsSymmetric (mpiaij does not support MatIsHermitian) for the size that was giving the wrong eigenvalue and indeed it is not symmetric, for a tolerance of 1e-14. 
I also wanted to check for smaller sizes, for which the eigenvalue was correct, but I get a "Bus error, possibly illegal memory access" when the program arrives to that function call. Any idea why it works for one case and not for the other? I'm using PETSc 3.9.3 and SLEPc 3.9.2. > > Looking at the failure in the debugger would really help us, for example with a stack trace. > > Thanks, > > Matt > > The matrix has to be symmetric at all times, by construction and because the physics problem I'm dealing with implies a symmetric matrix. What it's strange for me is that I do unit testing for the construction for smaller sizes and I obtain real symmetric matrices, as expected. So, maybe when growing in size it gets asymmetric for some reason? Are there any more test I can do to check this? > > El mar., 13 nov. 2018 a las 14:52, Jose E. Roman () escribi?: > As Matt points out, the probable cause is that your matrix is not exactly Hermitian. In our implementation of Lanczos we assume symmetry, so if ||A-A^*|| is non-negligible then some information is being discarded, which may confuse Lanczos giving some eigenpair as converged when it is not. > > Why doesn't this happen with the basic Lanczos solver? I don't know, probably due to the restart scheme. In symmetric Krylov-Schur, restart is done with eigenvectors of the projected problem (thick-restart Lanczos), while non-symmetric Krylov-Schur uses Schur vectors. In the basic Lanczos, restart is done building a new initial vector explicitly, which is harmless in this case. > > Suggestion: either use non-Hermitian Krylov-Schur or symmetrize your matrix as (A+A^*)/2 (or better when contructing it, note that transposing a parallel matrix with 1024 processes requires a lot of communication). > > Jose > > > > El 13 nov 2018, a las 13:56, Matthew Knepley escribi?: > > > > On Tue, Nov 13, 2018 at 7:48 AM Ale Foggia via petsc-users wrote: > > Thanks Jose for the answer. I tried it an it worked! I send the output of -eps_view again, just in case: > > > > EPS Object: 1024 MPI processes > > type: krylovschur > > 50% of basis vectors kept after restart > > using the locking variant > > problem type: non-symmetric eigenvalue problem > > selected portion of the spectrum: smallest real parts > > number of eigenvalues (nev): 1 > > number of column vectors (ncv): 16 > > maximum dimension of projected problem (mpd): 16 > > maximum number of iterations: 291700777 > > tolerance: 1e-09 > > convergence test: relative to the eigenvalue > > BV Object: 1024 MPI processes > > type: svec > > 17 columns of global length 2333606220 > > vector orthogonalization method: classical Gram-Schmidt > > orthogonalization refinement: if needed (eta: 0.7071) > > block orthogonalization method: GS > > doing matmult as a single matrix-matrix product > > DS Object: 1024 MPI processes > > type: nhep > > parallel operation mode: REDUNDANT > > ST Object: 1024 MPI processes > > type: shift > > shift: 0. > > number of matrices: 1 > > > > k ||Ax-kx||/||kx|| > > ----------------- ------------------ > > -15.048025 1.85112e-10 > > -15.047159 3.13104e-10 > > > > Iterations performed 18 > > > > Why does treating it as non-Hermitian help the convergence? Why doesn't this happen with Lanczos? I'm lost :/ > > > > Are you sure your matrix is Hermitian? Lanczos will work for non-Hermitian things (I believe). > > > > Thanks, > > > > Matt > > > > > > Ale > > > > El mar., 13 nov. 2018 a las 12:34, Jose E. Roman () escribi?: > > This is really strange. We cannot say what is going on, everything seems fine. 
> > Could you try solving the problem as non-Hermitian to see what happens? Just run with -eps_non_hermitian. Depending on the result, we can suggest other things to try. > > Jose > > > > > > > El 13 nov 2018, a las 10:58, Ale Foggia via petsc-users escribi?: > > > > > > Hello, > > > > > > I'm using SLEPc to get the smallest real eigenvalue (EPS_SMALLEST_REAL) of a Hermitian problem (EPS_HEP). The linear size of the matrices I'm solving is around 10**9 elements and they are sparse. I've asked a few questions before regarding the same problem setting and you suggested me to use Krylov-Schur (because I was using Lanczos). I tried KS and up to a certain matrix size the convergence (relative to the eigenvalue) is good, it's around 10**-9, like with Lanczos, but when I increase the size I start getting the eigenvalue with only 3 correct digits. I've used the options: -eps_tol 1e-9 -eps_mpd 100 (16 was the default), but the only thing I got is one more eigenvalue with the same big error, and the iterations performed were only 2. Why didn't it do more in order to reach the convergence? Should I set other parameters? I don't know how to work out this problem, can you help me with this please? I send the -eps_view output and the eigenvalues with its errors: > > > > > > EPS Object: 2048 MPI processes > > > type: krylovschur > > > 50% of basis vectors kept after restart > > > using the locking variant > > > problem type: symmetric eigenvalue problem > > > selected portion of the spectrum: smallest real parts > > > number of eigenvalues (nev): 1 > > > number of column vectors (ncv): 101 > > > maximum dimension of projected problem (mpd): 100 > > > maximum number of iterations: 46210024 > > > tolerance: 1e-09 > > > convergence test: relative to the eigenvalue > > > BV Object: 2048 MPI processes > > > type: svec > > > 102 columns of global length 2333606220 > > > vector orthogonalization method: classical Gram-Schmidt > > > orthogonalization refinement: if needed (eta: 0.7071) > > > block orthogonalization method: GS > > > doing matmult as a single matrix-matrix product > > > DS Object: 2048 MPI processes > > > type: hep > > > parallel operation mode: REDUNDANT > > > solving the problem with: Implicit QR method (_steqr) > > > ST Object: 2048 MPI processes > > > type: shift > > > shift: 0. > > > number of matrices: 1 > > > > > > k ||Ax-kx||/||kx|| > > > ----------------- ------------------ > > > -15.093051 0.00323917 (with KS) > > > -15.087320 0.00265215 (with KS) > > > -15.048025 8.67204e-09 (with Lanczos) > > > Iterations performed 2 > > > > > > Ale > > > > > > > > -- > > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. 
> -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ From sajidsyed2021 at u.northwestern.edu Tue Nov 13 14:54:38 2018 From: sajidsyed2021 at u.northwestern.edu (Sajid Ali) Date: Tue, 13 Nov 2018 14:54:38 -0600 Subject: [petsc-users] Question about DMDAGetElements In-Reply-To: <87in11inru.fsf@jedbrown.org> References: <87in11inru.fsf@jedbrown.org> Message-ID: I'm still confused and have the following questions : 1) Suppose as in the case above a DM object (created using DMCreate1D) is used to created a matrix A using DMCreateMatrix, how does one convert the indices obtained from DMDAGetElements to the row and column indices for the matrix A ? Is there a function telling me exactly which sub-matrix each rank is storing ? 2) From the ex6.c example above, it looks like e corresponds to row index and nen is the number of columns and i is the column index (which runs from 0 to nel) when addressing the matrix elements via MatSetValuesLocal. Is this correct? 3) How would the indices obtained from DMDAGetElements correspond to the indices for a vector created using DMCreateGlobalVector ? Thank You, Sajid Ali Applied Physics Northwestern University -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Tue Nov 13 15:03:29 2018 From: jed at jedbrown.org (Jed Brown) Date: Tue, 13 Nov 2018 14:03:29 -0700 Subject: [petsc-users] Question about DMDAGetElements In-Reply-To: References: <87in11inru.fsf@jedbrown.org> Message-ID: <87sh04heqm.fsf@jedbrown.org> Sajid Ali writes: > I'm still confused and have the following questions : > > 1) Suppose as in the case above a DM object (created using DMCreate1D) is > used to created a matrix A using DMCreateMatrix, how does one convert the > indices obtained from DMDAGetElements to the row and column indices for the > matrix A ? Is there a function telling me exactly which sub-matrix each > rank is storing ? Submatrix? > 2) From the ex6.c example above, it looks like e corresponds to row index > and nen is the number of columns and i is the column index (which runs from > 0 to nel) when addressing the matrix elements via MatSetValuesLocal. Is > this correct? Yes, that's how local indices are used. It's also referenced in the man page. > 3) How would the indices obtained from DMDAGetElements correspond to the > indices for a vector created using DMCreateGlobalVector ? They don't; you want a local vector. From heeho.park at gmail.com Tue Nov 13 17:46:43 2018 From: heeho.park at gmail.com (HeeHo Park) Date: Tue, 13 Nov 2018 17:46:43 -0600 Subject: [petsc-users] Can't figure out why I get inf solutions. In-Reply-To: References: Message-ID: I figured out the issue for this problem. Please close the thread. Thank you. On Mon, Nov 12, 2018 at 5:27 PM HeeHo Park wrote: > Since I can't send binary files, I hardcoded the values on main.py. You > only need main.py file to run the code. > > Thanks, > > On Mon, Nov 12, 2018 at 5:17 PM HeeHo Park wrote: > >> Hi PETSc, >> >> These Python looks complicated but it really isn't. please run main.py >> >> I am writing up a script that would read petsc binary files extracted >> from one of PFLOTRAN newton iteration step and solve with petsc4py to >> investigate linear solvers. >> >> But I am having trouble solving a matrix that converges PFLOTRAN, let >> alone running a trimmed size (15x15) matrix of A. You'll see in the >> script. 
From bsmith at mcs.anl.gov  Tue Nov 13 18:02:47 2018
From: bsmith at mcs.anl.gov (Smith, Barry F.)
Date: Wed, 14 Nov 2018 00:02:47 +0000
Subject: [petsc-users] Question about DMDAGetElements
In-Reply-To: <87sh04heqm.fsf@jedbrown.org>
References: <87in11inru.fsf@jedbrown.org> <87sh04heqm.fsf@jedbrown.org>
Message-ID: <8D3D22BC-EBBC-4BFD-BA74-EB5A78D35027@anl.gov>

> On Nov 13, 2018, at 3:03 PM, Jed Brown via petsc-users wrote:
>
> Sajid Ali writes:
>
>> I'm still confused and have the following questions:
>>
>> 1) Suppose, as in the case above, a DM object (created using DMDACreate1d) is
>> used to create a matrix A using DMCreateMatrix. How does one convert the
>> indices obtained from DMDAGetElements to the row and column indices of the
>> matrix A? Is there a function telling me exactly which sub-matrix each
>> rank is storing?
>
> Submatrix?
>
>> 2) From the ex6.c example above, it looks like e corresponds to the row index,
>> nen is the number of columns, and i is the column index (which runs from
>> 0 to nel) when addressing the matrix elements via MatSetValuesLocal. Is
>> this correct?
>
> Yes, that's how local indices are used.  It's also referenced in the man
> page.
>
>> 3) How would the indices obtained from DMDAGetElements correspond to the
>> indices of a vector created using DMCreateGlobalVector?
>
> They don't; you want a local vector.

   Similar to MatSetValuesLocal(), there is a VecSetValuesLocal() that can be used with a vector obtained from DMCreateGlobalVector() to assemble the load.

   Barry
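Putting Jed's and Barry's answers together: a hedged petsc4py sketch of the whole local-assembly loop. It assumes the petsc4py binding da.getElements() for DMDAGetElements(); the 2-node element matrix and load are stand-ins for real physics, and boundary conditions are ignored.

    import numpy as np
    from petsc4py import PETSc

    da = PETSc.DMDA().create(sizes=[16], dof=1, stencil_width=1)
    A = da.createMatrix()      # preallocated for this DMDA's stencil
    b = da.createGlobalVec()

    # Element connectivity in *local* numbering (assumed binding of DMDAGetElements).
    elems = da.getElements()

    Ke = np.array([[ 1.0, -1.0],
                   [-1.0,  1.0]])  # stand-in 2-node element stiffness
    Fe = np.array([0.5, 0.5])      # stand-in element load
    for e in elems:
        # The local indices go straight into the *Local variants; no manual
        # translation to global rows/columns of A is required.
        A.setValuesLocal(e, e, Ke, addv=PETSc.InsertMode.ADD_VALUES)
        b.setValuesLocal(e, Fe, addv=PETSc.InsertMode.ADD_VALUES)
    A.assemble()
    b.assemble()

The same local indices thus address both the matrix from DMCreateMatrix() and the global vector from DMCreateGlobalVector(), since the DMDA attaches a local-to-global mapping to both objects.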
From ivan.voznyuk.work at gmail.com  Thu Nov 15 03:52:37 2018
From: ivan.voznyuk.work at gmail.com (Ivan Voznyuk)
Date: Thu, 15 Nov 2018 10:52:37 +0100
Subject: [petsc-users] petsc4py help with parallel execution
Message-ID: 

Dear PETSc community,

I have a question regarding the parallel execution of petsc4py.

I have a simple code (attached here as simple_code.py) which solves a system of linear equations Ax=b using petsc4py. To execute it, I use the command python3 simple_code.py, which yields sequential performance. With a colleague of mine, we launched this code on his computer, and this time the execution was in parallel. However, he used the same command, python3 simple_code.py (without mpirun or mpiexec).

My configuration: x86_64, Ubuntu 16.04, Intel Core i7, PETSc 3.10.2, PETSC_ARCH=arch-linux2-c-debug, petsc4py 3.10.0 in a virtualenv

In order to parallelize it, I have already tried:
- using 2 different PCs
- using Ubuntu 16.04 and 18.04
- using different architectures (arch-linux2-c-debug, linux-gnu-c-debug, etc.)
- of course, using different configurations (my present config can be found in the make.log that I attached here)
- MPI from MPICH and Open MPI

Nothing worked.

Do you have any ideas?

Thanks and have a good day,
Ivan

-- 
Ivan VOZNYUK
PhD in Computational Electromagnetics
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
-------------- next part --------------
A non-text attachment was scrubbed...
Name: simple_code.py
Type: text/x-python
Size: 1080 bytes
Desc: not available
URL: 
-------------- next part --------------
A non-text attachment was scrubbed...
Name: make.log
Type: text/x-log
Size: 95169 bytes
Desc: not available
URL: 

From t.appel17 at imperial.ac.uk  Thu Nov 15 04:48:04 2018
From: t.appel17 at imperial.ac.uk (Appel, Thibaut)
Date: Thu, 15 Nov 2018 10:48:04 +0000
Subject: [petsc-users] On unknown ordering
Message-ID: <3BDDA21B-2493-4706-8F25-738E83C8FF8C@imperial.ac.uk>

Good morning,

I would like to ask about the importance of the initial choice of ordering of the unknowns when feeding a matrix to PETSc.

I have a regular grid, using high-order finite differences, and I simply divide the rows of the matrix with PetscSplitOwnership, using vertex-major, natural ordering for the parallelism (not using DMDA).

My understanding is that when using LU-MUMPS, this does not matter because either serial or parallel analysis is performed and all the rows are reordered 'optimally' before the LU factorization. The quality of the reordering might suffer from parallel analysis, though.

But if I use the default block Jacobi with ILU, with one block per processor, the initial ordering seems to have an influence because some tightly coupled degrees of freedom might lie on different processes and the ILU becomes less powerful. You can change the ordering on each block, but this won't necessarily make things better.

Are my observations accurate? Is there a recommended ordering type for a block Jacobi approach in my case? Could I expect natural improvements in fill-in or better GMRES robustness opting for the parallelism offered by DMDA?

Thank you,

Thibaut

From knepley at gmail.com  Thu Nov 15 04:54:30 2018
From: knepley at gmail.com (Matthew Knepley)
Date: Thu, 15 Nov 2018 05:54:30 -0500
Subject: [petsc-users] petsc4py help with parallel execution
In-Reply-To: 
References: 
Message-ID: 

On Thu, Nov 15, 2018 at 4:53 AM Ivan Voznyuk via petsc-users <petsc-users at mcs.anl.gov> wrote:

> Dear PETSc community,
>
> I have a question regarding the parallel execution of petsc4py.
>
> I have a simple code (attached here as simple_code.py) which solves a system
> of linear equations Ax=b using petsc4py. To execute it, I use the command
> python3 simple_code.py, which yields sequential performance. With a
> colleague of mine, we launched this code on his computer, and this time the
> execution was in parallel. However, he used the same command, python3
> simple_code.py (without mpirun or mpiexec).
>
I am not sure what you mean. To run MPI programs in parallel, you need a launcher like mpiexec or mpirun. There are Python programs (like nemesis) that use the launcher API directly (called PMI), but that is not part of petsc4py.

  Thanks,

     Matt

> My configuration: x86_64, Ubuntu 16.04, Intel Core i7, PETSc 3.10.2,
> PETSC_ARCH=arch-linux2-c-debug, petsc4py 3.10.0 in a virtualenv
>
> In order to parallelize it, I have already tried:
> - using 2 different PCs
> - using Ubuntu 16.04 and 18.04
> - using different architectures (arch-linux2-c-debug, linux-gnu-c-debug, etc.)
> - of course, using different configurations (my present config can be found in
> the make.log that I attached here)
> - MPI from MPICH and Open MPI
>
> Nothing worked.
>
> Do you have any ideas?
>
> Thanks and have a good day,
> Ivan
>
> --
> Ivan VOZNYUK
> PhD in Computational Electromagnetics

-- 
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From knepley at gmail.com  Thu Nov 15 04:56:25 2018
From: knepley at gmail.com (Matthew Knepley)
Date: Thu, 15 Nov 2018 05:56:25 -0500
Subject: [petsc-users] On unknown ordering
In-Reply-To: <3BDDA21B-2493-4706-8F25-738E83C8FF8C@imperial.ac.uk>
References: <3BDDA21B-2493-4706-8F25-738E83C8FF8C@imperial.ac.uk>
Message-ID: 

On Thu, Nov 15, 2018 at 5:49 AM Appel, Thibaut via petsc-users <petsc-users at mcs.anl.gov> wrote:

> Good morning,
>
> I would like to ask about the importance of the initial choice of ordering
> of the unknowns when feeding a matrix to PETSc.
>
> I have a regular grid, using high-order finite differences, and I simply
> divide the rows of the matrix with PetscSplitOwnership, using vertex-major,
> natural ordering for the parallelism (not using DMDA).
>
> My understanding is that when using LU-MUMPS, this does not matter because
> either serial or parallel analysis is performed and all the rows are
> reordered 'optimally' before the LU factorization. The quality of the
> reordering might suffer from parallel analysis, though.
>
> But if I use the default block Jacobi with ILU, with one block per
> processor, the initial ordering seems to have an influence because some
> tightly coupled degrees of freedom might lie on different processes and the
> ILU becomes less powerful. You can change the ordering on each block, but
> this won't necessarily make things better.
>
Yes.

> Are my observations accurate? Is there a recommended ordering type for a
> block Jacobi approach in my case? Could I expect natural improvements in
> fill-in or better GMRES robustness opting for the parallelism offered by DMDA?
>
For fill-in, I think yes. The impact of ordering on Krylov methods is complicated, so I have no idea here.

  Thanks,

    Matt

> Thank you,
>
> Thibaut

-- 
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
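To experiment with Thibaut's point about changing the ordering on each block: the factorization ordering of the block Jacobi sub-solves can be switched through the options database under the sub_ prefix. A hedged sketch driven from petsc4py, with a small stand-in matrix in place of the real finite-difference operator:

    from petsc4py import PETSc

    # Standard PETSc options; RCM is one per-block ordering worth trying
    # against the default (others include nd, 1wd, qmd, natural).
    opts = PETSc.Options()
    opts['pc_type'] = 'bjacobi'
    opts['sub_pc_type'] = 'ilu'
    opts['sub_pc_factor_mat_ordering_type'] = 'rcm'

    n = 100
    A = PETSc.Mat().createAIJ([n, n], nnz=3)
    rstart, rend = A.getOwnershipRange()
    for i in range(rstart, rend):
        if i > 0:
            A.setValue(i, i - 1, -1.0)
        A.setValue(i, i, 2.1)
        if i < n - 1:
            A.setValue(i, i + 1, -1.0)
    A.assemble()

    x, b = A.createVecs()
    b.set(1.0)
    ksp = PETSc.KSP().create()
    ksp.setOperators(A)
    ksp.setFromOptions()
    ksp.solve(b, x)
    print(ksp.getIterationNumber())  # compare iteration counts across orderings

Comparing iteration counts across the orderings on the real matrix is a cheap way to see whether the per-block ordering matters for a given problem.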
From ivan.voznyuk.work at gmail.com  Thu Nov 15 07:07:05 2018
From: ivan.voznyuk.work at gmail.com (Ivan Voznyuk)
Date: Thu, 15 Nov 2018 14:07:05 +0100
Subject: [petsc-users] petsc4py help with parallel execution
In-Reply-To: 
References: 
Message-ID: 

Hi Matthew,

Thanks for your reply!

Let me clarify what I mean by asking a few questions:

1. In order to obtain a parallel execution of simple_code.py, do I need to go with mpiexec python3 simple_code.py, or can I just launch python3 simple_code.py?

2. This simple_code.py consists of 2 parts: a) preparation of the matrix, b) solving the system of linear equations with PETSc. If I launch mpirun (or mpiexec) -np 8 python3 simple_code.py, I suppose that I will basically obtain 8 matrices and 8 systems to solve. However, I need to prepare only one matrix, but launch this code in parallel on 8 processors.

In fact, attached here you will find a similar code (scipy_code.py) with only one difference: the system of linear equations is solved with scipy. So when I solve it, I can clearly see that the solution is obtained in a parallel way. However, I do not use the command mpirun (or mpiexec); I just go with python3 scipy_code.py. In this case, the first part (creation of the sparse matrix) is not parallel, whereas the solution of the system is found in a parallel way.

So my question is: do you think that it is possible to have the same behavior with PETSc? And what do I need for this?

I am asking this because for my colleague it worked! It means that he launches simple_code.py on his computer using the command python3 simple_code.py (and not mpirun/mpiexec python3 simple_code.py) and he obtains a parallel execution of the same code.

Thanks for your help!
Ivan

On Thu, Nov 15, 2018 at 11:54 AM Matthew Knepley wrote:

> On Thu, Nov 15, 2018 at 4:53 AM Ivan Voznyuk via petsc-users <
> petsc-users at mcs.anl.gov> wrote:
>
>> Dear PETSc community,
>>
>> I have a question regarding the parallel execution of petsc4py.
>>
>> I have a simple code (attached here as simple_code.py) which solves a
>> system of linear equations Ax=b using petsc4py. To execute it, I use the
>> command python3 simple_code.py, which yields sequential performance. With
>> a colleague of mine, we launched this code on his computer, and this time the
>> execution was in parallel. However, he used the same command, python3
>> simple_code.py (without mpirun or mpiexec).
>>
>> I am not sure what you mean. To run MPI programs in parallel, you need a
> launcher like mpiexec or mpirun. There are Python programs (like nemesis)
> that use the launcher API directly (called PMI), but that is not part of
> petsc4py.
>
>  Thanks,
>
>    Matt
>
>> My configuration: x86_64, Ubuntu 16.04, Intel Core i7, PETSc
>> 3.10.2, PETSC_ARCH=arch-linux2-c-debug, petsc4py 3.10.0 in a virtualenv
>>
>> In order to parallelize it, I have already tried:
>> - using 2 different PCs
>> - using Ubuntu 16.04 and 18.04
>> - using different architectures (arch-linux2-c-debug, linux-gnu-c-debug, etc.)
>> - of course, using different configurations (my present config can be found in
>> the make.log that I attached here)
>> - MPI from MPICH and Open MPI
>>
>> Nothing worked.
>>
>> Do you have any ideas?
>>
>> Thanks and have a good day,
>> Ivan
>>
>> --
>> Ivan VOZNYUK
>> PhD in Computational Electromagnetics
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/

-- 
Ivan VOZNYUK
PhD in Computational Electromagnetics
+33 (0)6.95.87.04.55
My webpage
My LinkedIn
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
-------------- next part --------------
A non-text attachment was scrubbed...
Name: scipy_code.py
Type: text/x-python
Size: 601 bytes
Desc: not available
URL: 

From knepley at gmail.com  Thu Nov 15 07:39:30 2018
From: knepley at gmail.com (Matthew Knepley)
Date: Thu, 15 Nov 2018 08:39:30 -0500
Subject: [petsc-users] petsc4py help with parallel execution
In-Reply-To: 
References: 
Message-ID: 

On Thu, Nov 15, 2018 at 8:07 AM Ivan Voznyuk wrote:

> Hi Matthew,
> Thanks for your reply!
>
> Let me clarify what I mean by asking a few questions:
>
> 1. In order to obtain a parallel execution of simple_code.py, do I need to
> go with mpiexec python3 simple_code.py, or can I just launch python3
> simple_code.py?
>
mpiexec -n 2 python3 simple_code.py

> 2. This simple_code.py consists of 2 parts: a) preparation of the matrix, b)
> solving the system of linear equations with PETSc. If I launch mpirun (or
> mpiexec) -np 8 python3 simple_code.py, I suppose that I will basically
> obtain 8 matrices and 8 systems to solve. However, I need to prepare only
> one matrix, but launch this code in parallel on 8 processors.
>
When you create the Mat object, you give it a communicator (here PETSC_COMM_WORLD). That allows us to distribute the data. This is all covered extensively in the manual and the online tutorials, as well as the example code.
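A hedged illustration of that paragraph (not Ivan's simple_code.py, which is not reproduced in the archive): one Mat lives on PETSC_COMM_WORLD and each rank fills only the rows it owns, so running under the launcher gives one distributed system rather than eight copies.

    from petsc4py import PETSc

    comm = PETSc.COMM_WORLD
    n = 1000

    # ONE matrix on the shared communicator, not one per process.
    A = PETSc.Mat().createAIJ([n, n], nnz=3, comm=comm)
    rstart, rend = A.getOwnershipRange()   # this rank's slice of rows
    for i in range(rstart, rend):
        if i > 0:
            A.setValue(i, i - 1, -1.0)
        A.setValue(i, i, 2.0)
        if i < n - 1:
            A.setValue(i, i + 1, -1.0)
    A.assemble()

    x, b = A.createVecs()
    b.set(1.0)

    ksp = PETSc.KSP().create(comm=comm)
    ksp.setOperators(A)
    ksp.setFromOptions()
    ksp.solve(b, x)
    if comm.getRank() == 0:
        print('solved once, distributed over', comm.getSize(), 'ranks')

Run it as, e.g., mpiexec -n 8 python3 script.py. Without the launcher there is only one MPI rank; any parallel activity observed then (as with SciPy's spsolve) is likely thread-level parallelism inside a multithreaded BLAS, not MPI.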
> In fact, attached here you will find a similar code (scipy_code.py) with
> only one difference: the system of linear equations is solved with scipy.
> So when I solve it, I can clearly see that the solution is obtained in a
> parallel way.
>
Why do you think it's running in parallel?

  Thanks,

     Matt

> In this case, the first part (creation of the sparse matrix) is not
> parallel, whereas the solution of the system is found in a parallel way.
> So my question is: do you think that it is possible to have the same
> behavior with PETSc? And what do I need for this?
>
> I am asking this because for my colleague it worked! It means that he
> launches simple_code.py on his computer using the command python3
> simple_code.py (and not mpirun/mpiexec python3 simple_code.py) and he
> obtains a parallel execution of the same code.
>
> Thanks for your help!
> Ivan
>
> On Thu, Nov 15, 2018 at 11:54 AM Matthew Knepley wrote:
>
>> On Thu, Nov 15, 2018 at 4:53 AM Ivan Voznyuk via petsc-users <
>> petsc-users at mcs.anl.gov> wrote:
>>
>>> Dear PETSc community,
>>>
>>> I have a question regarding the parallel execution of petsc4py.
>>>
>>> I have a simple code (attached here as simple_code.py) which solves a
>>> system of linear equations Ax=b using petsc4py. To execute it, I use the
>>> command python3 simple_code.py, which yields sequential performance. With
>>> a colleague of mine, we launched this code on his computer, and this time the
>>> execution was in parallel. However, he used the same command, python3
>>> simple_code.py (without mpirun or mpiexec).
>>>
>> I am not sure what you mean. To run MPI programs in parallel, you need a
>> launcher like mpiexec or mpirun. There are Python programs (like nemesis)
>> that use the launcher API directly (called PMI), but that is not part of
>> petsc4py.
>>
>>  Thanks,
>>
>>    Matt
>>
>>> My configuration: x86_64, Ubuntu 16.04, Intel Core i7, PETSc
>>> 3.10.2, PETSC_ARCH=arch-linux2-c-debug, petsc4py 3.10.0 in a virtualenv
>>>
>>> In order to parallelize it, I have already tried:
>>> - using 2 different PCs
>>> - using Ubuntu 16.04 and 18.04
>>> - using different architectures (arch-linux2-c-debug, linux-gnu-c-debug, etc.)
>>> - of course, using different configurations (my present config can be found in
>>> the make.log that I attached here)
>>> - MPI from MPICH and Open MPI
>>>
>>> Nothing worked.
>>>
>>> Do you have any ideas?
>>>
>>> Thanks and have a good day,
>>> Ivan
>>>
>>> --
>>> Ivan VOZNYUK
>>> PhD in Computational Electromagnetics
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>
> --
> Ivan VOZNYUK
> PhD in Computational Electromagnetics
> +33 (0)6.95.87.04.55
> My webpage
> My LinkedIn

-- 
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From niko.karin at gmail.com  Thu Nov 15 10:50:46 2018
From: niko.karin at gmail.com (Karin&NiKo)
Date: Thu, 15 Nov 2018 17:50:46 +0100
Subject: [petsc-users] GAMG Parallel Performance
Message-ID: 

Dear PETSc team,

I am solving a linear transient dynamic problem, based on a discretization with finite elements. To do that, I am using FGMRES with GAMG as a preconditioner. I consider here 10 time steps.

The problem has around 118e6 dof and I am running on 1000, 1500 and 2000 procs, so I have something like 100e3, 78e3 and 50e3 dof/proc. I notice that the performance deteriorates when I increase the number of processes. You can find attached the log_view output of the runs and the detailed definition of the KSP.

Is the problem too small to run on that number of processes, or is there something wrong with my use of GAMG?

I thank you in advance for your help,
Nicolas
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
-------------- next part --------------
---------------------------------------------- PETSc Performance Summary: ----------------------------------------------

Unknown Name on a arch-linux2-c-opt-mpi-ml-hypre named eocn0117 with 1000 processors, by B07947 Thu Nov 15 16:14:46 2018
Using Petsc Release Version 3.8.2, Nov, 09, 2017

                         Max       Max/Min        Avg      Total
Time (sec):           1.661e+02      1.00034   1.661e+02
Objects:              1.401e+03      1.00143   1.399e+03
Flop:                 7.695e+10      1.13672   7.354e+10  7.354e+13
Flop/sec:             4.633e+08      1.13672   4.428e+08  4.428e+11
MPI Messages:         3.697e+05     12.46258   1.179e+05  1.179e+08
MPI Message Lengths:  8.786e+08      3.98485   4.086e+03  4.817e+11
MPI Reductions:       2.635e+03      1.00000

Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract)
                            e.g., VecAXPY() for real vectors of length N --> 2N flop
                            and VecAXPY() for complex vectors of length N --> 8N flop

Summary of Stages:   ----- Time ------  ----- Flop -----  --- Messages ---  -- Message Lengths --  -- Reductions --
                        Avg     %Total     Avg     %Total   counts   %Total     Avg         %Total   counts   %Total
 0:      Main Stage: 1.6608e+02 100.0%  7.3541e+13 100.0%  1.178e+08  99.9%  4.081e+03       99.9%  2.603e+03  98.8%

------------------------------------------------------------------------------------------------------------------------
See the 'Profiling' chapter of the users' manual for details on interpreting output.
Phase summary info:
   Count: number of times phase was executed
   Time and Flop: Max - maximum over all processors
                  Ratio - ratio of maximum to minimum over all processors
   Mess: number of messages sent
   Avg. len: average message length (bytes)
   Reduct: number of global reductions
   Global: entire computation
   Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop().
%T - percent time in this phase %F - percent flop in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flop over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ Event Count Time (sec) Flop --- Global --- --- Stage --- Total Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage MatMult 7342 1.0 4.4956e+01 1.4 4.09e+10 1.2 9.6e+07 4.3e+03 0.0e+00 23 53 81 86 0 23 53 81 86 0 859939 MatMultAdd 1130 1.0 3.4048e+00 2.3 1.55e+09 1.1 8.4e+06 8.2e+02 0.0e+00 2 2 7 1 0 2 2 7 1 0 434274 MatMultTranspose 1130 1.0 4.7555e+00 3.8 1.55e+09 1.1 8.4e+06 8.2e+02 0.0e+00 1 2 7 1 0 1 2 7 1 0 310924 MatSolve 226 0.0 6.8927e-04 0.0 6.24e+04 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 90 MatSOR 6835 1.0 3.6061e+01 1.4 2.85e+10 1.1 0.0e+00 0.0e+00 0.0e+00 20 37 0 0 0 20 37 0 0 0 760198 MatLUFactorSym 1 1.0 1.0800e-0390.6 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatLUFactorNum 1 1.0 8.0395e-04421.5 1.09e+03 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1 MatScale 15 1.0 1.7925e-02 1.8 9.12e+06 1.1 6.6e+04 1.1e+03 0.0e+00 0 0 0 0 0 0 0 0 0 0 485856 MatResidual 1130 1.0 6.3576e+00 1.5 5.31e+09 1.2 1.5e+07 3.7e+03 0.0e+00 3 7 13 11 0 3 7 13 11 0 781728 MatAssemblyBegin 112 1.0 9.9765e-01 3.0 0.00e+00 0.0 2.1e+05 7.8e+04 7.4e+01 0 0 0 3 3 0 0 0 3 3 0 MatAssemblyEnd 112 1.0 6.8845e-01 1.1 0.00e+00 0.0 8.3e+05 3.4e+02 2.6e+02 0 0 1 0 10 0 0 1 0 10 0 MatGetRow 582170 1.0 8.5022e-02 1.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetRowIJ 1 0.0 2.0885e-04 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCreateSubMat 6 1.0 3.7804e-02 1.0 0.00e+00 0.0 5.6e+04 2.8e+03 1.0e+02 0 0 0 0 4 0 0 0 0 4 0 MatGetOrdering 1 0.0 4.4608e-04 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCoarsen 5 1.0 3.2871e-02 1.1 0.00e+00 0.0 1.2e+06 4.9e+02 5.2e+01 0 0 1 0 2 0 0 1 0 2 0 MatZeroEntries 5 1.0 6.6769e-03 1.9 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatView 90 1.3 8.9249e-0216.6 0.00e+00 0.0 0.0e+00 0.0e+00 7.0e+01 0 0 0 0 3 0 0 0 0 3 0 MatAXPY 5 1.0 6.4984e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 1.0e+01 0 0 0 0 0 0 0 0 0 0 0 MatMatMult 5 1.0 6.8333e-01 1.0 1.41e+08 1.2 3.7e+05 1.0e+04 8.2e+01 0 0 0 1 3 0 0 0 1 3 193093 MatMatMultSym 5 1.0 4.8541e-01 1.0 0.00e+00 0.0 3.0e+05 7.8e+03 7.0e+01 0 0 0 0 3 0 0 0 0 3 0 MatMatMultNum 5 1.0 1.9432e-01 1.0 1.41e+08 1.2 6.6e+04 2.2e+04 1.0e+01 0 0 0 0 0 0 0 0 0 0 679018 MatPtAP 5 1.0 4.2329e+00 1.0 1.54e+09 1.5 8.3e+05 4.3e+04 8.7e+01 3 2 1 7 3 3 2 1 7 3 292103 MatPtAPSymbolic 5 1.0 2.7832e+00 1.0 0.00e+00 0.0 3.5e+05 5.6e+04 3.7e+01 2 0 0 4 1 2 0 0 4 1 0 MatPtAPNumeric 5 1.0 1.4511e+00 1.0 1.54e+09 1.5 4.8e+05 3.3e+04 5.0e+01 1 2 0 3 2 1 2 0 3 2 852080 MatTrnMatMult 1 1.0 1.5337e+00 1.0 5.87e+07 1.3 6.9e+04 8.1e+04 1.9e+01 1 0 0 1 1 1 0 0 1 1 36505 MatTrnMatMultSym 1 1.0 9.2151e-01 1.0 0.00e+00 0.0 5.7e+04 3.4e+04 1.7e+01 1 0 0 0 1 1 0 0 0 1 0 MatTrnMatMultNum 1 1.0 6.1297e-01 1.0 5.87e+07 1.3 1.1e+04 3.2e+05 2.0e+00 0 0 0 1 0 0 0 0 1 0 91341 MatGetLocalMat 17 1.0 5.4432e-02 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetBrAoCol 15 1.0 7.0758e-02 2.1 0.00e+00 0.0 4.6e+05 4.2e+04 
0.0e+00 0 0 0 4 0 0 0 0 4 0 0 VecMDot 329 1.0 6.2030e+0013.7 6.68e+08 1.0 0.0e+00 0.0e+00 3.3e+02 1 1 0 0 12 1 1 0 0 13 106230 VecNorm 595 1.0 1.1655e+00 8.5 1.21e+08 1.0 0.0e+00 0.0e+00 6.0e+02 0 0 0 0 23 0 0 0 0 23 102761 VecScale 349 1.0 6.6033e-02 4.6 3.13e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 467735 VecCopy 1386 1.0 1.0624e-01 1.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSet 4392 1.0 8.6035e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 246 1.0 4.8357e-02 1.4 5.69e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1160750 VecAYPX 9276 1.0 4.4571e-01 1.4 3.11e+08 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 687917 VecAXPBYCZ 4520 1.0 2.8744e-01 1.4 5.66e+08 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 1939847 VecMAXPY 575 1.0 8.4132e-01 1.5 1.36e+09 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 2 0 0 0 1600021 VecAssemblyBegin 185 1.0 6.6342e-02 1.3 0.00e+00 0.0 2.3e+04 2.2e+04 5.5e+02 0 0 0 0 21 0 0 0 0 21 0 VecAssemblyEnd 185 1.0 4.2391e-04 4.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecPointwiseMult 55 1.0 3.7224e-03 1.5 1.38e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 364534 VecScatterBegin 9786 1.0 8.6765e-01 5.5 0.00e+00 0.0 1.1e+08 3.8e+03 0.0e+00 0 0 97 90 0 0 0 97 90 0 0 VecScatterEnd 9786 1.0 1.9699e+01 9.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 5 0 0 0 0 5 0 0 0 0 0 VecSetRandom 5 1.0 4.3778e-03 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecNormalize 113 1.0 9.7592e-02 3.3 9.34e+06 1.0 0.0e+00 0.0e+00 1.1e+02 0 0 0 0 4 0 0 0 0 4 94297 KSPGMRESOrthog 326 1.0 6.4559e+00 9.1 1.33e+09 1.0 0.0e+00 0.0e+00 3.3e+02 1 2 0 0 12 1 2 0 0 13 203262 KSPSetUp 18 1.0 1.4065e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 1.4e+01 0 0 0 0 1 0 0 0 0 1 0 KSPSolve 10 1.0 7.9545e+01 1.0 7.50e+10 1.1 1.1e+08 3.8e+03 8.1e+02 48 98 95 89 31 48 98 95 89 31 903224 PCGAMGGraph_AGG 5 1.0 1.2315e+00 1.0 2.25e+06 1.2 3.3e+05 4.2e+02 1.3e+02 1 0 0 0 5 1 0 0 0 5 1759 PCGAMGCoarse_AGG 5 1.0 1.5847e+00 1.0 5.87e+07 1.3 1.3e+06 5.2e+03 8.7e+01 1 0 1 1 3 1 0 1 1 3 35331 PCGAMGProl_AGG 5 1.0 3.5152e-01 1.0 0.00e+00 0.0 2.3e+06 1.5e+03 9.0e+02 0 0 2 1 34 0 0 2 1 35 0 PCGAMGPOpt_AGG 5 1.0 1.0543e+00 1.0 4.17e+08 1.2 1.0e+06 6.1e+03 2.4e+02 1 1 1 1 9 1 1 1 1 9 372220 GAMG: createProl 5 1.0 4.2217e+00 1.0 4.78e+08 1.2 5.0e+06 3.3e+03 1.4e+03 3 1 4 3 52 3 1 4 3 52 106734 Graph 10 1.0 1.2300e+00 1.0 2.25e+06 1.2 3.3e+05 4.2e+02 1.3e+02 1 0 0 0 5 1 0 0 0 5 1761 MIS/Agg 5 1.0 3.2935e-02 1.1 0.00e+00 0.0 1.2e+06 4.9e+02 5.2e+01 0 0 1 0 2 0 0 1 0 2 0 SA: col data 5 1.0 1.3732e-01 1.0 0.00e+00 0.0 2.2e+06 1.2e+03 8.4e+02 0 0 2 1 32 0 0 2 1 32 0 SA: frmProl0 5 1.0 2.0841e-01 1.0 0.00e+00 0.0 9.5e+04 7.1e+03 5.0e+01 0 0 0 0 2 0 0 0 0 2 0 SA: smooth 5 1.0 7.6907e-01 1.0 1.48e+08 1.2 3.7e+05 1.0e+04 1.0e+02 0 0 0 1 4 0 0 0 1 4 180072 GAMG: partLevel 5 1.0 4.2824e+00 1.0 1.54e+09 1.5 8.9e+05 4.0e+04 2.5e+02 3 2 1 7 9 3 2 1 7 9 288729 repartition 3 1.0 2.1951e-03 1.4 0.00e+00 0.0 0.0e+00 0.0e+00 1.8e+01 0 0 0 0 1 0 0 0 0 1 0 Invert-Sort 3 1.0 2.9290e-03 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 1.2e+01 0 0 0 0 0 0 0 0 0 0 0 Move A 3 1.0 2.8378e-02 1.1 0.00e+00 0.0 3.0e+04 5.2e+03 5.4e+01 0 0 0 0 2 0 0 0 0 2 0 Move P 3 1.0 1.5999e-02 1.2 0.00e+00 0.0 2.6e+04 4.0e+01 5.4e+01 0 0 0 0 2 0 0 0 0 2 0 PCSetUp 2 1.0 8.5208e+00 1.0 2.01e+09 1.4 5.8e+06 8.9e+03 1.6e+03 5 2 5 11 62 5 2 5 11 63 197991 PCSetUpOnBlocks 226 1.0 1.7779e-0310.4 1.09e+03 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1 PCApply 226 1.0 6.9594e+01 1.1 6.40e+10 1.1 1.1e+08 
3.3e+03 1.0e+02 41 83 90 72 4 41 83 90 72 4 878121 SFSetGraph 5 1.0 5.3883e-0556.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 SFBcastBegin 62 1.0 9.9101e-03 1.7 0.00e+00 0.0 1.2e+06 4.9e+02 0.0e+00 0 0 1 0 0 0 0 1 0 0 0 SFBcastEnd 62 1.0 6.8467e-03 6.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 BuildTwoSided 5 1.0 7.4060e-03 2.8 0.00e+00 0.0 3.3e+04 4.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 ------------------------------------------------------------------------------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory Descendants' Mem. Reports information only for process 0. --- Event Stage 0: Main Stage Matrix 154 154 347782424 0. Matrix Coarsen 5 5 3140 0. Matrix Null Space 1 1 688 0. Vector 1035 1035 582369520 0. Vector Scatter 36 36 39744 0. Index Set 112 112 484084 0. Krylov Solver 18 18 330336 0. Preconditioner 13 13 12868 0. PetscRandom 10 10 6380 0. Star Forest Graph 5 5 4280 0. Viewer 12 11 9152 0. ======================================================================================================================== 0 KSP unpreconditioned resid norm 3.738834777485e+08 true resid norm 3.738834777485e+08 ||r(i)||/||b|| 1.000000000000e+00 1 KSP unpreconditioned resid norm 1.256707592053e+08 true resid norm 1.256707592053e+08 ||r(i)||/||b|| 3.361227940911e-01 2 KSP unpreconditioned resid norm 1.824938621520e+07 true resid norm 1.824938621520e+07 ||r(i)||/||b|| 4.881035750789e-02 3 KSP unpreconditioned resid norm 6.102002718084e+06 true resid norm 6.102002718084e+06 ||r(i)||/||b|| 1.632060008329e-02 4 KSP unpreconditioned resid norm 2.562432902883e+06 true resid norm 2.562432902883e+06 ||r(i)||/||b|| 6.853560147439e-03 5 KSP unpreconditioned resid norm 1.188336046012e+06 true resid norm 1.188336046012e+06 ||r(i)||/||b|| 3.178359346520e-03 6 KSP unpreconditioned resid norm 5.326022866065e+05 true resid norm 5.326022866065e+05 ||r(i)||/||b|| 1.424514102131e-03 7 KSP unpreconditioned resid norm 2.433972087119e+05 true resid norm 2.433972087119e+05 ||r(i)||/||b|| 6.509974984122e-04 8 KSP unpreconditioned resid norm 1.095996827533e+05 true resid norm 1.095996827533e+05 ||r(i)||/||b|| 2.931386094225e-04 9 KSP unpreconditioned resid norm 4.986951871355e+04 true resid norm 4.986951871355e+04 ||r(i)||/||b|| 1.333825153597e-04 10 KSP unpreconditioned resid norm 2.330078182947e+04 true resid norm 2.330078182946e+04 ||r(i)||/||b|| 6.232097221779e-05 11 KSP unpreconditioned resid norm 1.084965391397e+04 true resid norm 1.084965391396e+04 ||r(i)||/||b|| 2.901881083191e-05 12 KSP unpreconditioned resid norm 5.108480961660e+03 true resid norm 5.108480961647e+03 ||r(i)||/||b|| 1.366329689776e-05 13 KSP unpreconditioned resid norm 2.450752492671e+03 true resid norm 2.450752492670e+03 ||r(i)||/||b|| 6.554856361741e-06 14 KSP unpreconditioned resid norm 1.181086403619e+03 true resid norm 1.181086403614e+03 ||r(i)||/||b|| 3.158969234817e-06 15 KSP unpreconditioned resid norm 5.606721134498e+02 true resid norm 5.606721134433e+02 ||r(i)||/||b|| 1.499590505629e-06 16 KSP unpreconditioned resid norm 2.700319247455e+02 true resid norm 2.700319247344e+02 ||r(i)||/||b|| 7.222355113430e-07 17 KSP unpreconditioned resid norm 1.314293551958e+02 true resid norm 1.314293551859e+02 ||r(i)||/||b|| 3.515249081809e-07 18 KSP unpreconditioned resid norm 6.357572858020e+01 true resid norm 6.357572858253e+01 ||r(i)||/||b|| 1.700415567047e-07 19 KSP unpreconditioned resid norm 3.077536939056e+01 true resid norm 3.077536939188e+01 
||r(i)||/||b|| 8.231272902779e-08 20 KSP unpreconditioned resid norm 1.504910881547e+01 true resid norm 1.504910882709e+01 ||r(i)||/||b|| 4.025079930707e-08 21 KSP unpreconditioned resid norm 7.400345249992e+00 true resid norm 7.400345259132e+00 ||r(i)||/||b|| 1.979318611161e-08 22 KSP unpreconditioned resid norm 3.607811417234e+00 true resid norm 3.607811420482e+00 ||r(i)||/||b|| 9.649560986776e-09 Linear solve converged due to CONVERGED_RTOL iterations 22 KSP Object: 1000 MPI processes type: fgmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-08, absolute=1e-50, divergence=10000. right preconditioning using UNPRECONDITIONED norm type for convergence test PC Object: 1000 MPI processes type: gamg type is MULTIPLICATIVE, levels=6 cycles=v Cycles per PCApply=1 Using externally compute Galerkin coarse grid matrices GAMG specific options Threshold for dropping small values in graph on each level = 0. 0. 0. 0. Threshold scaling factor for each level not specified = 1. AGG specific options Symmetric graph false Number of levels to square graph 1 Number smoothing steps 1 Coarse grid solver -- level ------------------------------- KSP Object: (mg_coarse_) 1000 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_) 1000 MPI processes type: bjacobi number of blocks = 1000 Local solve is same for all blocks, in the following KSP and PC objects: KSP Object: (mg_coarse_sub_) 1 MPI processes type: preonly maximum iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_sub_) 1 MPI processes type: lu out-of-place factorization tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot [INBLOCKS] matrix ordering: nd factor fill ratio given 5., needed 1. Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=12, cols=12, bs=6 package used to perform factorization: petsc total: nonzeros=144, allocated nonzeros=144 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 3 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=12, cols=12, bs=6 total: nonzeros=144, allocated nonzeros=144 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 3 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: 1000 MPI processes type: mpiaij rows=12, cols=12, bs=6 total: nonzeros=144, allocated nonzeros=144 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 3 nodes, limit used is 5 Down solver (pre-smoother) on level 1 ------------------------------- KSP Object: (mg_levels_1_) 1000 MPI processes type: chebyshev eigenvalue estimates used: min = 0.0999997, max = 1.1 eigenvalues estimate via gmres min 0.0078618, max 0.999997 eigenvalues estimated using gmres with translations [0. 0.1; 0. 
1.1] KSP Object: (mg_levels_1_esteig_) 1000 MPI processes type: gmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_1_) 1000 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 1000 MPI processes type: mpiaij rows=288, cols=288, bs=6 total: nonzeros=78408, allocated nonzeros=78408 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 86 nodes, limit used is 5 Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 2 ------------------------------- KSP Object: (mg_levels_2_) 1000 MPI processes type: chebyshev eigenvalue estimates used: min = 0.139457, max = 1.53403 eigenvalues estimate via gmres min 0.077969, max 1.39457 eigenvalues estimated using gmres with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_2_esteig_) 1000 MPI processes type: gmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_2_) 1000 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 1000 MPI processes type: mpiaij rows=10254, cols=10254, bs=6 total: nonzeros=12883716, allocated nonzeros=12883716 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 8 nodes, limit used is 5 Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 3 ------------------------------- KSP Object: (mg_levels_3_) 1000 MPI processes type: chebyshev eigenvalue estimates used: min = 0.14493, max = 1.59423 eigenvalues estimate via gmres min 0.356008, max 1.4493 eigenvalues estimated using gmres with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_3_esteig_) 1000 MPI processes type: gmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_3_) 1000 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. 
linear system matrix = precond matrix: Mat Object: 1000 MPI processes type: mpiaij rows=332466, cols=332466, bs=6 total: nonzeros=286141284, allocated nonzeros=286141284 total number of mallocs used during MatSetValues calls =0 using nonscalable MatPtAP() implementation using I-node (on process 0) routines: found 88 nodes, limit used is 5 Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 4 ------------------------------- KSP Object: (mg_levels_4_) 1000 MPI processes type: chebyshev eigenvalue estimates used: min = 0.175972, max = 1.93569 eigenvalues estimate via gmres min 0.145536, max 1.75972 eigenvalues estimated using gmres with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_4_esteig_) 1000 MPI processes type: gmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_4_) 1000 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 1000 MPI processes type: mpiaij rows=5142126, cols=5142126, bs=6 total: nonzeros=1363101804, allocated nonzeros=1363101804 total number of mallocs used during MatSetValues calls =0 using nonscalable MatPtAP() implementation using I-node (on process 0) routines: found 1522 nodes, limit used is 5 Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 5 ------------------------------- KSP Object: (mg_levels_5_) 1000 MPI processes type: chebyshev eigenvalue estimates used: min = 0.234733, max = 2.58207 eigenvalues estimate via gmres min 0.061528, max 2.34733 eigenvalues estimated using gmres with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_5_esteig_) 1000 MPI processes type: gmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_5_) 1000 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. 
linear system matrix = precond matrix: Mat Object: 1000 MPI processes type: mpiaij rows=117874305, cols=117874305, bs=3 total: nonzeros=9333251991, allocated nonzeros=9333251991 total number of mallocs used during MatSetValues calls =0 has attached near null space using I-node (on process 0) routines: found 39198 nodes, limit used is 5 Up solver (post-smoother) same as down solver (pre-smoother) linear system matrix = precond matrix: Mat Object: 1000 MPI processes type: mpiaij rows=117874305, cols=117874305, bs=3 total: nonzeros=9333251991, allocated nonzeros=9333251991 total number of mallocs used during MatSetValues calls =0 has attached near null space using I-node (on process 0) routines: found 39198 nodes, limit used is 5 -------------- next part -------------- ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- Unknown Name on a arch-linux2-c-opt-mpi-ml-hypre named eobm0011 with 2000 processors, by B07947 Thu Nov 15 15:47:29 2018 Using Petsc Release Version 3.8.2, Nov, 09, 2017 Max Max/Min Avg Total Time (sec): 2.837e+02 1.00021 2.836e+02 Objects: 1.409e+03 1.00142 1.407e+03 Flop: 3.920e+10 1.16752 3.710e+10 7.420e+13 Flop/sec: 1.382e+08 1.16751 1.308e+08 2.616e+11 MPI Messages: 4.031e+05 10.62284 1.243e+05 2.486e+08 MPI Message Lengths: 6.348e+08 4.13328 2.721e+03 6.762e+11 MPI Reductions: 2.654e+03 1.00000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flop and VecAXPY() for complex vectors of length N --> 8N flop Summary of Stages: ----- Time ------ ----- Flop ----- --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total counts %Total Avg %Total counts %Total 0: Main Stage: 2.8364e+02 100.0% 7.4202e+13 100.0% 2.484e+08 99.9% 2.718e+03 99.9% 2.622e+03 98.8% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flop: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent Avg. len: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). 
%T - percent time in this phase %F - percent flop in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flop over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ Event Count Time (sec) Flop --- Global --- --- Stage --- Total Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage MatMult 7470 1.0 4.7611e+01 1.9 2.11e+10 1.2 2.0e+08 2.9e+03 0.0e+00 11 53 81 86 0 11 53 81 86 0 827107 MatMultAdd 1150 1.0 3.8834e+00 3.5 8.06e+08 1.2 1.7e+07 5.7e+02 0.0e+00 1 2 7 1 0 1 2 7 1 0 388724 MatMultTranspose 1150 1.0 6.7493e+00 7.4 8.06e+08 1.2 1.7e+07 5.7e+02 0.0e+00 1 2 7 1 0 1 2 7 1 0 223663 MatSolve 230 0.0 8.3327e-04 0.0 6.35e+04 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 76 MatSOR 6955 1.0 2.9793e+01 2.8 1.41e+10 1.1 0.0e+00 0.0e+00 0.0e+00 9 37 0 0 0 9 37 0 0 0 912909 MatLUFactorSym 1 1.0 4.5509e-03561.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatLUFactorNum 1 1.0 3.5341e-031852.9 1.09e+03 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatScale 15 1.0 1.7186e-02 3.3 4.62e+06 1.2 1.4e+05 7.0e+02 0.0e+00 0 0 0 0 0 0 0 0 0 0 508009 MatResidual 1150 1.0 7.0952e+00 2.3 2.75e+09 1.2 3.1e+07 2.5e+03 0.0e+00 2 7 13 12 0 2 7 13 12 0 713964 MatAssemblyBegin 112 1.0 1.0418e+00 4.7 0.00e+00 0.0 4.3e+05 5.3e+04 7.4e+01 0 0 0 3 3 0 0 0 3 3 0 MatAssemblyEnd 112 1.0 5.9064e-01 1.1 0.00e+00 0.0 1.6e+06 2.4e+02 2.6e+02 0 0 1 0 10 0 0 1 0 10 0 MatGetRow 291670 1.0 3.9900e-02 1.7 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetRowIJ 1 0.0 4.3106e-04 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCreateSubMat 6 1.0 4.7464e-02 1.0 0.00e+00 0.0 7.5e+04 2.0e+03 1.0e+02 0 0 0 0 4 0 0 0 0 4 0 MatGetOrdering 1 0.0 1.0009e-03 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCoarsen 5 1.0 3.4372e-02 1.1 0.00e+00 0.0 3.0e+06 3.0e+02 5.9e+01 0 0 1 0 2 0 0 1 0 2 0 MatZeroEntries 5 1.0 5.3163e-03 3.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatView 90 1.3 6.1949e-01 5.2 0.00e+00 0.0 0.0e+00 0.0e+00 7.0e+01 0 0 0 0 3 0 0 0 0 3 0 MatAXPY 5 1.0 4.1116e-02 1.4 0.00e+00 0.0 0.0e+00 0.0e+00 1.0e+01 0 0 0 0 0 0 0 0 0 0 0 MatMatMult 5 1.0 4.7434e-01 1.2 7.18e+07 1.2 7.5e+05 7.0e+03 8.2e+01 0 0 0 1 3 0 0 0 1 3 278596 MatMatMultSym 5 1.0 2.8504e-01 1.0 0.00e+00 0.0 6.2e+05 5.3e+03 7.0e+01 0 0 0 0 3 0 0 0 0 3 0 MatMatMultNum 5 1.0 1.1035e-01 1.0 7.18e+07 1.2 1.4e+05 1.5e+04 1.0e+01 0 0 0 0 0 0 0 0 0 0 1197494 MatPtAP 5 1.0 2.6336e+00 1.0 8.35e+08 1.7 1.7e+06 3.0e+04 8.7e+01 1 2 1 7 3 1 2 1 7 3 472910 MatPtAPSymbolic 5 1.0 1.6345e+00 1.0 0.00e+00 0.0 7.2e+05 3.8e+04 3.7e+01 1 0 0 4 1 1 0 0 4 1 0 MatPtAPNumeric 5 1.0 1.0015e+00 1.0 8.35e+08 1.7 9.3e+05 2.3e+04 5.0e+01 0 2 0 3 2 0 2 0 3 2 1243604 MatTrnMatMult 1 1.0 8.1209e-01 1.0 2.97e+07 1.3 1.5e+05 5.0e+04 1.9e+01 0 0 0 1 1 0 0 0 1 1 69321 MatTrnMatMultSym 1 1.0 4.7897e-01 1.0 0.00e+00 0.0 1.3e+05 2.1e+04 1.7e+01 0 0 0 0 1 0 0 0 0 1 0 MatTrnMatMultNum 1 1.0 3.3517e-01 1.0 2.97e+07 1.3 2.4e+04 2.0e+05 2.0e+00 0 0 0 1 0 0 0 0 1 0 167958 MatGetLocalMat 17 1.0 3.8855e-02 2.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetBrAoCol 15 1.0 6.2124e-02 2.5 0.00e+00 0.0 9.5e+05 
2.9e+04 0.0e+00 0 0 0 4 0 0 0 0 4 0 0 VecMDot 333 1.0 1.2028e+0113.7 3.44e+08 1.0 0.0e+00 0.0e+00 3.3e+02 1 1 0 0 13 1 1 0 0 13 56587 VecNorm 603 1.0 2.7685e+00 4.5 6.16e+07 1.0 0.0e+00 0.0e+00 6.0e+02 0 0 0 0 23 0 0 0 0 23 43942 VecScale 353 1.0 9.8841e-03 1.8 1.59e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 3172544 VecCopy 1410 1.0 7.7031e-02 1.9 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSet 4468 1.0 5.7269e-02 1.6 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 250 1.0 3.3906e-02 2.1 2.89e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1683301 VecAYPX 9440 1.0 3.6537e-01 2.9 1.59e+08 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 854103 VecAXPBYCZ 4600 1.0 2.7121e-01 3.2 2.89e+08 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 2092658 VecMAXPY 583 1.0 7.1103e-01 2.7 7.03e+08 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 2 0 0 0 1955563 VecAssemblyBegin 185 1.0 1.1164e-01 1.4 0.00e+00 0.0 4.9e+04 1.4e+04 5.5e+02 0 0 0 0 21 0 0 0 0 21 0 VecAssemblyEnd 185 1.0 3.3379e-04 3.9 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecPointwiseMult 55 1.0 2.9728e-03 2.9 6.90e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 456523 VecScatterBegin 9954 1.0 1.0825e+00 7.3 0.00e+00 0.0 2.4e+08 2.5e+03 0.0e+00 0 0 97 90 0 0 0 97 90 0 0 VecScatterEnd 9954 1.0 3.8453e+0111.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 5 0 0 0 0 5 0 0 0 0 0 VecSetRandom 5 1.0 2.1403e-03 1.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecNormalize 113 1.0 7.4105e-02 6.0 4.68e+06 1.0 0.0e+00 0.0e+00 1.1e+02 0 0 0 0 4 0 0 0 0 4 124201 KSPGMRESOrthog 330 1.0 1.2168e+0110.7 6.86e+08 1.0 0.0e+00 0.0e+00 3.3e+02 1 2 0 0 12 1 2 0 0 13 111406 KSPSetUp 18 1.0 1.2172e-02 1.6 0.00e+00 0.0 0.0e+00 0.0e+00 1.4e+01 0 0 0 0 1 0 0 0 0 1 0 KSPSolve 10 1.0 6.5991e+01 1.0 3.82e+10 1.2 2.4e+08 2.6e+03 8.2e+02 23 98 95 89 31 23 98 95 89 31 1098603 PCGAMGGraph_AGG 5 1.0 6.7798e-01 1.0 1.13e+06 1.2 6.8e+05 2.8e+02 1.3e+02 0 0 0 0 5 0 0 0 0 5 3197 PCGAMGCoarse_AGG 5 1.0 8.5740e-01 1.0 2.97e+07 1.3 3.3e+06 2.7e+03 9.4e+01 0 0 1 1 4 0 0 1 1 4 65658 PCGAMGProl_AGG 5 1.0 2.4710e-01 1.0 0.00e+00 0.0 4.8e+06 9.8e+02 9.0e+02 0 0 2 1 34 0 0 2 1 35 0 PCGAMGPOpt_AGG 5 1.0 7.5785e-01 1.0 2.12e+08 1.2 2.1e+06 4.1e+03 2.4e+02 0 1 1 1 9 0 1 1 1 9 518589 GAMG: createProl 5 1.0 2.5407e+00 1.0 2.43e+08 1.2 1.1e+07 2.1e+03 1.4e+03 1 1 4 3 51 1 1 4 3 52 177698 Graph 10 1.0 6.7570e-01 1.0 1.13e+06 1.2 6.8e+05 2.8e+02 1.3e+02 0 0 0 0 5 0 0 0 0 5 3208 MIS/Agg 5 1.0 3.4434e-02 1.1 0.00e+00 0.0 3.0e+06 3.0e+02 5.9e+01 0 0 1 0 2 0 0 1 0 2 0 SA: col data 5 1.0 1.3094e-01 1.0 0.00e+00 0.0 4.6e+06 8.2e+02 8.4e+02 0 0 2 1 31 0 0 2 1 32 0 SA: frmProl0 5 1.0 1.1028e-01 1.0 0.00e+00 0.0 1.9e+05 4.7e+03 5.0e+01 0 0 0 0 2 0 0 0 0 2 0 SA: smooth 5 1.0 5.2676e-01 1.2 7.53e+07 1.2 7.5e+05 7.0e+03 1.0e+02 0 0 0 1 4 0 0 0 1 4 263330 GAMG: partLevel 5 1.0 2.7087e+00 1.0 8.35e+08 1.7 1.7e+06 2.9e+04 2.5e+02 1 2 1 7 9 1 2 1 7 9 459805 repartition 3 1.0 5.6183e-03 1.4 0.00e+00 0.0 0.0e+00 0.0e+00 1.8e+01 0 0 0 0 1 0 0 0 0 1 0 Invert-Sort 3 1.0 7.8020e-03 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 1.2e+01 0 0 0 0 0 0 0 0 0 0 0 Move A 3 1.0 4.1104e-02 1.1 0.00e+00 0.0 3.2e+04 4.5e+03 5.4e+01 0 0 0 0 2 0 0 0 0 2 0 Move P 3 1.0 1.8200e-02 1.3 0.00e+00 0.0 4.3e+04 3.6e+01 5.4e+01 0 0 0 0 2 0 0 0 0 2 0 PCSetUp 2 1.0 5.2812e+00 1.0 1.08e+09 1.5 1.3e+07 5.7e+03 1.6e+03 2 2 5 11 62 2 2 5 11 63 321316 PCSetUpOnBlocks 230 1.0 6.2256e-0319.5 1.09e+03 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 PCApply 230 1.0 5.7271e+01 1.3 3.26e+10 1.2 
2.2e+08 2.2e+03 1.0e+02 20 83 90 73 4 20 83 90 73 4 1074640 SFSetGraph 5 1.0 5.3167e-0555.8 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 SFBcastBegin 69 1.0 1.1146e-02 1.9 0.00e+00 0.0 3.0e+06 3.0e+02 0.0e+00 0 0 1 0 0 0 0 1 0 0 0 SFBcastEnd 69 1.0 7.9596e-03 7.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 BuildTwoSided 5 1.0 7.5631e-03 2.8 0.00e+00 0.0 6.9e+04 4.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 ------------------------------------------------------------------------------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory Descendants' Mem. Reports information only for process 0. --- Event Stage 0: Main Stage Matrix 154 154 176666644 0. Matrix Coarsen 5 5 3140 0. Matrix Null Space 1 1 688 0. Vector 1043 1043 297405224 0. Vector Scatter 36 36 39456 0. Index Set 112 112 395240 0. Krylov Solver 18 18 330336 0. Preconditioner 13 13 12868 0. PetscRandom 10 10 6380 0. Star Forest Graph 5 5 4280 0. Viewer 12 11 9152 0. ======================================================================================================================== 0 KSP unpreconditioned resid norm 3.738834778097e+08 true resid norm 3.738834778097e+08 ||r(i)||/||b|| 1.000000000000e+00 1 KSP unpreconditioned resid norm 1.256561415764e+08 true resid norm 1.256561415764e+08 ||r(i)||/||b|| 3.360836972859e-01 2 KSP unpreconditioned resid norm 1.843932942229e+07 true resid norm 1.843932942229e+07 ||r(i)||/||b|| 4.931838531703e-02 3 KSP unpreconditioned resid norm 6.189553415818e+06 true resid norm 6.189553415818e+06 ||r(i)||/||b|| 1.655476581120e-02 4 KSP unpreconditioned resid norm 2.614928212473e+06 true resid norm 2.614928212473e+06 ||r(i)||/||b|| 6.993965680944e-03 5 KSP unpreconditioned resid norm 1.208975553355e+06 true resid norm 1.208975553355e+06 ||r(i)||/||b|| 3.233562393388e-03 6 KSP unpreconditioned resid norm 5.481792905733e+05 true resid norm 5.481792905733e+05 ||r(i)||/||b|| 1.466176825423e-03 7 KSP unpreconditioned resid norm 2.526854282559e+05 true resid norm 2.526854282559e+05 ||r(i)||/||b|| 6.758400497828e-04 8 KSP unpreconditioned resid norm 1.150052500229e+05 true resid norm 1.150052500229e+05 ||r(i)||/||b|| 3.075965022488e-04 9 KSP unpreconditioned resid norm 5.289416146528e+04 true resid norm 5.289416146528e+04 ||r(i)||/||b|| 1.414723158540e-04 10 KSP unpreconditioned resid norm 2.495584369428e+04 true resid norm 2.495584369427e+04 ||r(i)||/||b|| 6.674765047246e-05 11 KSP unpreconditioned resid norm 1.184780633606e+04 true resid norm 1.184780633605e+04 ||r(i)||/||b|| 3.168849932994e-05 12 KSP unpreconditioned resid norm 5.709557885707e+03 true resid norm 5.709557885717e+03 ||r(i)||/||b|| 1.527095532321e-05 13 KSP unpreconditioned resid norm 2.811037623050e+03 true resid norm 2.811037623058e+03 ||r(i)||/||b|| 7.518485811476e-06 14 KSP unpreconditioned resid norm 1.399589249024e+03 true resid norm 1.399589249031e+03 ||r(i)||/||b|| 3.743383519460e-06 15 KSP unpreconditioned resid norm 6.919705622362e+02 true resid norm 6.919705622376e+02 ||r(i)||/||b|| 1.850765287333e-06 16 KSP unpreconditioned resid norm 3.469221128804e+02 true resid norm 3.469221128823e+02 ||r(i)||/||b|| 9.278883220907e-07 17 KSP unpreconditioned resid norm 1.747835577077e+02 true resid norm 1.747835577094e+02 ||r(i)||/||b|| 4.674813627318e-07 18 KSP unpreconditioned resid norm 8.648881836541e+01 true resid norm 8.648881835829e+01 ||r(i)||/||b|| 2.313255960519e-07 19 KSP unpreconditioned resid norm 4.247581916935e+01 true resid norm 
4.247581916507e+01 ||r(i)||/||b|| 1.136071040472e-07 20 KSP unpreconditioned resid norm 2.086023330347e+01 true resid norm 2.086023330200e+01 ||r(i)||/||b|| 5.579340767933e-08 21 KSP unpreconditioned resid norm 1.023525173739e+01 true resid norm 1.023525174086e+01 ||r(i)||/||b|| 2.737551228746e-08 22 KSP unpreconditioned resid norm 4.963414450847e+00 true resid norm 4.963414447514e+00 ||r(i)||/||b|| 1.327529763174e-08 23 KSP unpreconditioned resid norm 2.415620601642e+00 true resid norm 2.415620604831e+00 ||r(i)||/||b|| 6.460891556327e-09 Linear solve converged due to CONVERGED_RTOL iterations 23 KSP Object: 2000 MPI processes type: fgmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-08, absolute=1e-50, divergence=10000. right preconditioning using UNPRECONDITIONED norm type for convergence test PC Object: 2000 MPI processes type: gamg type is MULTIPLICATIVE, levels=6 cycles=v Cycles per PCApply=1 Using externally compute Galerkin coarse grid matrices GAMG specific options Threshold for dropping small values in graph on each level = 0. 0. 0. 0. Threshold scaling factor for each level not specified = 1. AGG specific options Symmetric graph false Number of levels to square graph 1 Number smoothing steps 1 Coarse grid solver -- level ------------------------------- KSP Object: (mg_coarse_) 2000 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_) 2000 MPI processes type: bjacobi number of blocks = 2000 Local solve is same for all blocks, in the following KSP and PC objects: KSP Object: (mg_coarse_sub_) 1 MPI processes type: preonly maximum iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_sub_) 1 MPI processes type: lu out-of-place factorization tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot [INBLOCKS] matrix ordering: nd factor fill ratio given 5., needed 1. Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=12, cols=12, bs=6 package used to perform factorization: petsc total: nonzeros=144, allocated nonzeros=144 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 3 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=12, cols=12, bs=6 total: nonzeros=144, allocated nonzeros=144 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 3 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: 2000 MPI processes type: mpiaij rows=12, cols=12, bs=6 total: nonzeros=144, allocated nonzeros=144 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 3 nodes, limit used is 5 Down solver (pre-smoother) on level 1 ------------------------------- KSP Object: (mg_levels_1_) 2000 MPI processes type: chebyshev eigenvalue estimates used: min = 0.0999937, max = 1.09993 eigenvalues estimate via gmres min 0.075342, max 0.999937 eigenvalues estimated using gmres with translations [0. 0.1; 0. 
1.1] KSP Object: (mg_levels_1_esteig_) 2000 MPI processes type: gmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_1_) 2000 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 2000 MPI processes type: mpiaij rows=318, cols=318, bs=6 total: nonzeros=90828, allocated nonzeros=90828 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 87 nodes, limit used is 5 Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 2 ------------------------------- KSP Object: (mg_levels_2_) 2000 MPI processes type: chebyshev eigenvalue estimates used: min = 0.130639, max = 1.43703 eigenvalues estimate via gmres min 0.077106, max 1.30639 eigenvalues estimated using gmres with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_2_esteig_) 2000 MPI processes type: gmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_2_) 2000 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 2000 MPI processes type: mpiaij rows=9870, cols=9870, bs=6 total: nonzeros=11941884, allocated nonzeros=11941884 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 9 nodes, limit used is 5 Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 3 ------------------------------- KSP Object: (mg_levels_3_) 2000 MPI processes type: chebyshev eigenvalue estimates used: min = 0.151779, max = 1.66957 eigenvalues estimate via gmres min 0.352485, max 1.51779 eigenvalues estimated using gmres with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_3_esteig_) 2000 MPI processes type: gmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_3_) 2000 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. 
linear system matrix = precond matrix: Mat Object: 2000 MPI processes type: mpiaij rows=334476, cols=334476, bs=6 total: nonzeros=292009536, allocated nonzeros=292009536 total number of mallocs used during MatSetValues calls =0 using nonscalable MatPtAP() implementation using I-node (on process 0) routines: found 50 nodes, limit used is 5 Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 4 ------------------------------- KSP Object: (mg_levels_4_) 2000 MPI processes type: chebyshev eigenvalue estimates used: min = 0.181248, max = 1.99372 eigenvalues estimate via gmres min 0.141976, max 1.81248 eigenvalues estimated using gmres with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_4_esteig_) 2000 MPI processes type: gmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_4_) 2000 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 2000 MPI processes type: mpiaij rows=5160228, cols=5160228, bs=6 total: nonzeros=1375082208, allocated nonzeros=1375082208 total number of mallocs used during MatSetValues calls =0 using nonscalable MatPtAP() implementation using I-node (on process 0) routines: found 792 nodes, limit used is 5 Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 5 ------------------------------- KSP Object: (mg_levels_5_) 2000 MPI processes type: chebyshev eigenvalue estimates used: min = 0.23761, max = 2.61371 eigenvalues estimate via gmres min 0.0632228, max 2.3761 eigenvalues estimated using gmres with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_5_esteig_) 2000 MPI processes type: gmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_5_) 2000 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. 
linear system matrix = precond matrix: Mat Object: 2000 MPI processes type: mpiaij rows=117874305, cols=117874305, bs=3 total: nonzeros=9333251991, allocated nonzeros=9333251991 total number of mallocs used during MatSetValues calls =0 has attached near null space using I-node (on process 0) routines: found 19690 nodes, limit used is 5 Up solver (post-smoother) same as down solver (pre-smoother) linear system matrix = precond matrix: Mat Object: 2000 MPI processes type: mpiaij rows=117874305, cols=117874305, bs=3 total: nonzeros=9333251991, allocated nonzeros=9333251991 total number of mallocs used during MatSetValues calls =0 has attached near null space using I-node (on process 0) routines: found 19690 nodes, limit used is 5 -------------- next part -------------- ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- Unknown Name on a arch-linux2-c-opt-mpi-ml-hypre named eocn0055 with 1500 processors, by B07947 Thu Nov 15 15:55:02 2018 Using Petsc Release Version 3.8.2, Nov, 09, 2017 Max Max/Min Avg Total Time (sec): 2.296e+02 1.00007 2.296e+02 Objects: 1.409e+03 1.00142 1.407e+03 Flop: 5.219e+10 1.14806 4.965e+10 7.447e+13 Flop/sec: 2.273e+08 1.14806 2.162e+08 3.243e+11 MPI Messages: 4.774e+05 14.16274 1.262e+05 1.893e+08 MPI Message Lengths: 7.718e+08 4.12637 3.102e+03 5.872e+11 MPI Reductions: 2.667e+03 1.00000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flop and VecAXPY() for complex vectors of length N --> 8N flop Summary of Stages: ----- Time ------ ----- Flop ----- --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total counts %Total Avg %Total counts %Total 0: Main Stage: 2.2961e+02 100.0% 7.4472e+13 100.0% 1.892e+08 99.9% 3.099e+03 99.9% 2.635e+03 98.8% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flop: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent Avg. len: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). 
%T - percent time in this phase %F - percent flop in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flop over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ Event Count Time (sec) Flop --- Global --- --- Stage --- Total Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage MatMult 7470 1.0 6.1871e+01 1.8 2.79e+10 1.2 1.5e+08 3.3e+03 0.0e+00 18 53 81 86 0 18 53 81 86 0 636062 MatMultAdd 1150 1.0 4.4228e+00 3.1 1.07e+09 1.2 1.3e+07 6.4e+02 0.0e+00 1 2 7 1 0 1 2 7 1 0 340660 MatMultTranspose 1150 1.0 5.8074e+00 4.5 1.07e+09 1.2 1.3e+07 6.4e+02 0.0e+00 1 2 7 1 0 1 2 7 1 0 259436 MatSolve 230 0.0 7.8106e-04 0.0 6.35e+04 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 81 MatSOR 6955 1.0 4.0051e+01 2.6 1.90e+10 1.1 0.0e+00 0.0e+00 0.0e+00 15 37 0 0 0 15 37 0 0 0 686820 MatLUFactorSym 1 1.0 1.9209e-03175.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatLUFactorNum 1 1.0 1.7691e-03927.5 1.09e+03 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1 MatScale 15 1.0 2.3391e-02 4.6 6.13e+06 1.1 1.0e+05 8.0e+02 0.0e+00 0 0 0 0 0 0 0 0 0 0 372687 MatResidual 1150 1.0 9.1807e+00 2.2 3.63e+09 1.2 2.4e+07 2.8e+03 0.0e+00 2 7 13 12 0 2 7 13 12 0 551315 MatAssemblyBegin 112 1.0 9.0080e-01 2.5 0.00e+00 0.0 3.2e+05 6.2e+04 7.4e+01 0 0 0 3 3 0 0 0 3 3 0 MatAssemblyEnd 112 1.0 6.8422e-01 1.1 0.00e+00 0.0 1.2e+06 2.7e+02 2.6e+02 0 0 1 0 10 0 0 1 0 10 0 MatGetRow 388852 1.0 5.5644e-02 1.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetRowIJ 1 0.0 1.6968e-03 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCreateSubMat 6 1.0 6.8178e-02 1.0 0.00e+00 0.0 8.2e+04 1.8e+03 1.0e+02 0 0 0 0 4 0 0 0 0 4 0 MatGetOrdering 1 0.0 1.8709e-03 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCoarsen 5 1.0 3.8623e-02 1.1 0.00e+00 0.0 2.7e+06 2.9e+02 7.2e+01 0 0 1 0 3 0 0 1 0 3 0 MatZeroEntries 5 1.0 6.3353e-03 2.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatView 90 1.3 6.1051e-01 5.3 0.00e+00 0.0 0.0e+00 0.0e+00 7.0e+01 0 0 0 0 3 0 0 0 0 3 0 MatAXPY 5 1.0 6.4298e-02 1.4 0.00e+00 0.0 0.0e+00 0.0e+00 1.0e+01 0 0 0 0 0 0 0 0 0 0 0 MatMatMult 5 1.0 5.4698e-01 1.0 9.46e+07 1.2 5.7e+05 8.1e+03 8.2e+01 0 0 0 1 3 0 0 0 1 3 241395 MatMatMultSym 5 1.0 3.6737e-01 1.0 0.00e+00 0.0 4.6e+05 6.1e+03 7.0e+01 0 0 0 0 3 0 0 0 0 3 0 MatMatMultNum 5 1.0 1.7525e-01 1.0 9.46e+07 1.2 1.0e+05 1.7e+04 1.0e+01 0 0 0 0 0 0 0 0 0 0 753412 MatPtAP 5 1.0 3.4278e+00 1.0 1.10e+09 1.6 1.2e+06 3.4e+04 8.7e+01 1 2 1 7 3 1 2 1 7 3 361157 MatPtAPSymbolic 5 1.0 2.2084e+00 1.0 0.00e+00 0.0 5.4e+05 4.4e+04 3.7e+01 1 0 0 4 1 1 0 0 4 1 0 MatPtAPNumeric 5 1.0 1.2233e+00 1.0 1.10e+09 1.6 7.0e+05 2.7e+04 5.0e+01 1 2 0 3 2 1 2 0 3 2 1011960 MatTrnMatMult 1 1.0 1.0668e+00 1.0 3.95e+07 1.3 1.1e+05 6.0e+04 1.9e+01 0 0 0 1 1 0 0 0 1 1 52637 MatTrnMatMultSym 1 1.0 6.2306e-01 1.0 0.00e+00 0.0 9.0e+04 2.5e+04 1.7e+01 0 0 0 0 1 0 0 0 0 1 0 MatTrnMatMultNum 1 1.0 4.4524e-01 1.0 3.95e+07 1.3 1.8e+04 2.4e+05 2.0e+00 0 0 0 1 0 0 0 0 1 0 126116 MatGetLocalMat 17 1.0 5.2980e-02 1.8 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetBrAoCol 15 1.0 7.0928e-02 2.7 0.00e+00 0.0 7.2e+05 
3.2e+04 0.0e+00 0 0 0 4 0 0 0 0 4 0 0 VecMDot 333 1.0 6.9139e+00 5.8 4.59e+08 1.0 0.0e+00 0.0e+00 3.3e+02 1 1 0 0 12 1 1 0 0 13 98444 VecNorm 603 1.0 3.4630e+00 7.2 8.20e+07 1.0 0.0e+00 0.0e+00 6.0e+02 0 0 0 0 23 0 0 0 0 23 35129 VecScale 353 1.0 5.6857e-02 5.8 2.11e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 551520 VecCopy 1410 1.0 1.0825e-01 2.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSet 4468 1.0 8.2095e-02 1.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 250 1.0 5.0242e-02 2.3 3.85e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1135966 VecAYPX 9440 1.0 5.1576e-01 2.6 2.11e+08 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 605004 VecAXPBYCZ 4600 1.0 3.6595e-01 2.7 3.85e+08 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 1550752 VecMAXPY 583 1.0 1.0786e+00 3.4 9.38e+08 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 2 0 0 0 1289187 VecAssemblyBegin 185 1.0 7.6495e-02 1.2 0.00e+00 0.0 3.5e+04 1.6e+04 5.5e+02 0 0 0 0 20 0 0 0 0 21 0 VecAssemblyEnd 185 1.0 3.8767e-04 3.6 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecPointwiseMult 55 1.0 4.6344e-03 2.8 9.20e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 292821 VecScatterBegin 9954 1.0 1.1589e+00 7.6 0.00e+00 0.0 1.8e+08 2.9e+03 0.0e+00 0 0 97 90 0 0 0 97 90 0 0 VecScatterEnd 9954 1.0 4.8668e+0111.8 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 6 0 0 0 0 6 0 0 0 0 0 VecSetRandom 5 1.0 2.8229e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecNormalize 113 1.0 1.3515e-01 3.0 6.23e+06 1.0 0.0e+00 0.0e+00 1.1e+02 0 0 0 0 4 0 0 0 0 4 68095 KSPGMRESOrthog 330 1.0 7.1848e+00 4.6 9.14e+08 1.0 0.0e+00 0.0e+00 3.3e+02 1 2 0 0 12 1 2 0 0 13 188679 KSPSetUp 18 1.0 1.2331e-02 1.3 0.00e+00 0.0 0.0e+00 0.0e+00 1.4e+01 0 0 0 0 1 0 0 0 0 1 0 KSPSolve 10 1.0 8.7217e+01 1.0 5.09e+10 1.1 1.8e+08 2.9e+03 8.2e+02 38 98 94 89 31 38 98 94 89 31 834427 PCGAMGGraph_AGG 5 1.0 8.6509e-01 1.0 1.50e+06 1.2 5.2e+05 3.2e+02 1.3e+02 0 0 0 0 5 0 0 0 0 5 2505 PCGAMGCoarse_AGG 5 1.0 1.1177e+00 1.0 3.95e+07 1.3 2.9e+06 2.7e+03 1.1e+02 0 0 2 1 4 0 0 2 1 4 50240 PCGAMGProl_AGG 5 1.0 3.2632e-01 1.0 0.00e+00 0.0 3.7e+06 1.1e+03 9.0e+02 0 0 2 1 34 0 0 2 1 34 0 PCGAMGPOpt_AGG 5 1.0 9.1948e-01 1.0 2.80e+08 1.2 1.6e+06 4.7e+03 2.4e+02 0 1 1 1 9 0 1 1 1 9 427090 GAMG: createProl 5 1.0 3.2296e+00 1.0 3.20e+08 1.2 8.7e+06 2.3e+03 1.4e+03 1 1 5 3 52 1 1 5 3 52 139652 Graph 10 1.0 8.6263e-01 1.0 1.50e+06 1.2 5.2e+05 3.2e+02 1.3e+02 0 0 0 0 5 0 0 0 0 5 2512 MIS/Agg 5 1.0 3.8683e-02 1.1 0.00e+00 0.0 2.7e+06 2.9e+02 7.2e+01 0 0 1 0 3 0 0 1 0 3 0 SA: col data 5 1.0 1.6884e-01 1.0 0.00e+00 0.0 3.5e+06 9.4e+02 8.4e+02 0 0 2 1 31 0 0 2 1 32 0 SA: frmProl0 5 1.0 1.5309e-01 1.0 0.00e+00 0.0 1.4e+05 5.5e+03 5.0e+01 0 0 0 0 2 0 0 0 0 2 0 SA: smooth 5 1.0 6.2292e-01 1.0 9.92e+07 1.2 5.7e+05 8.1e+03 1.0e+02 0 0 0 1 4 0 0 0 1 4 222484 GAMG: partLevel 5 1.0 3.5234e+00 1.0 1.10e+09 1.6 1.3e+06 3.2e+04 2.5e+02 2 2 1 7 9 2 2 1 7 9 351357 repartition 3 1.0 4.2057e-03 1.3 0.00e+00 0.0 0.0e+00 0.0e+00 1.8e+01 0 0 0 0 1 0 0 0 0 1 0 Invert-Sort 3 1.0 3.6008e-03 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 1.2e+01 0 0 0 0 0 0 0 0 0 0 0 Move A 3 1.0 5.2288e-02 1.1 0.00e+00 0.0 4.3e+04 3.4e+03 5.4e+01 0 0 0 0 2 0 0 0 0 2 0 Move P 3 1.0 2.8588e-02 1.2 0.00e+00 0.0 3.8e+04 3.3e+01 5.4e+01 0 0 0 0 2 0 0 0 0 2 0 PCSetUp 2 1.0 6.7734e+00 1.0 1.42e+09 1.5 1.0e+07 6.2e+03 1.7e+03 3 2 5 11 62 3 2 5 11 63 249358 PCSetUpOnBlocks 230 1.0 3.0496e-03 6.5 1.09e+03 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 PCApply 230 1.0 7.5882e+01 1.1 4.35e+10 1.2 
1.7e+08 2.5e+03 1.0e+02 32 83 90 73 4 32 83 90 73 4 814742 SFSetGraph 5 1.0 2.3127e-0524.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 SFBcastBegin 82 1.0 1.2156e-02 1.9 0.00e+00 0.0 2.7e+06 2.9e+02 0.0e+00 0 0 1 0 0 0 0 1 0 0 0 SFBcastEnd 82 1.0 9.9132e-03 9.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 BuildTwoSided 5 1.0 7.6580e-03 2.6 0.00e+00 0.0 5.2e+04 4.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 ------------------------------------------------------------------------------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory Descendants' Mem. Reports information only for process 0. --- Event Stage 0: Main Stage Matrix 154 154 245066064 0. Matrix Coarsen 5 5 3140 0. Matrix Null Space 1 1 688 0. Vector 1043 1043 395730096 0. Vector Scatter 36 36 39456 0. Index Set 112 112 566408 0. Krylov Solver 18 18 330336 0. Preconditioner 13 13 12868 0. PetscRandom 10 10 6380 0. Star Forest Graph 5 5 4280 0. Viewer 12 11 9152 0. ======================================================================================================================== 0 KSP unpreconditioned resid norm 3.738834778994e+08 true resid norm 3.738834778994e+08 ||r(i)||/||b|| 1.000000000000e+00 1 KSP unpreconditioned resid norm 1.279113569974e+08 true resid norm 1.279113569974e+08 ||r(i)||/||b|| 3.421155642289e-01 2 KSP unpreconditioned resid norm 1.874944207644e+07 true resid norm 1.874944207644e+07 ||r(i)||/||b|| 5.014782194118e-02 3 KSP unpreconditioned resid norm 6.305464086727e+06 true resid norm 6.305464086727e+06 ||r(i)||/||b|| 1.686478397536e-02 4 KSP unpreconditioned resid norm 2.648974672476e+06 true resid norm 2.648974672476e+06 ||r(i)||/||b|| 7.085027365634e-03 5 KSP unpreconditioned resid norm 1.239886218685e+06 true resid norm 1.239886218685e+06 ||r(i)||/||b|| 3.316236988195e-03 6 KSP unpreconditioned resid norm 5.641563718944e+05 true resid norm 5.641563718944e+05 ||r(i)||/||b|| 1.508909607517e-03 7 KSP unpreconditioned resid norm 2.606746938444e+05 true resid norm 2.606746938444e+05 ||r(i)||/||b|| 6.972083797577e-04 8 KSP unpreconditioned resid norm 1.184535518381e+05 true resid norm 1.184535518381e+05 ||r(i)||/||b|| 3.168194339682e-04 9 KSP unpreconditioned resid norm 5.392667623794e+04 true resid norm 5.392667623794e+04 ||r(i)||/||b|| 1.442339108990e-04 10 KSP unpreconditioned resid norm 2.520203694105e+04 true resid norm 2.520203694106e+04 ||r(i)||/||b|| 6.740612632217e-05 11 KSP unpreconditioned resid norm 1.185967319435e+04 true resid norm 1.185967319434e+04 ||r(i)||/||b|| 3.172023877859e-05 12 KSP unpreconditioned resid norm 5.627359926956e+03 true resid norm 5.627359926969e+03 ||r(i)||/||b|| 1.505110618577e-05 13 KSP unpreconditioned resid norm 2.702021069922e+03 true resid norm 2.702021069923e+03 ||r(i)||/||b|| 7.226906856392e-06 14 KSP unpreconditioned resid norm 1.307500233445e+03 true resid norm 1.307500233448e+03 ||r(i)||/||b|| 3.497079466561e-06 15 KSP unpreconditioned resid norm 6.250158790292e+02 true resid norm 6.250158790312e+02 ||r(i)||/||b|| 1.671686276545e-06 16 KSP unpreconditioned resid norm 3.038680168367e+02 true resid norm 3.038680168345e+02 ||r(i)||/||b|| 8.127345410977e-07 17 KSP unpreconditioned resid norm 1.504350436399e+02 true resid norm 1.504350436436e+02 ||r(i)||/||b|| 4.023580942618e-07 18 KSP unpreconditioned resid norm 7.388944694136e+01 true resid norm 7.388944694645e+01 ||r(i)||/||b|| 1.976269381081e-07 19 KSP unpreconditioned resid norm 3.596911660459e+01 true resid norm 
3.596911660288e+01 ||r(i)||/||b|| 9.620408156298e-08 20 KSP unpreconditioned resid norm 1.769248937152e+01 true resid norm 1.769248936529e+01 ||r(i)||/||b|| 4.732086441662e-08 21 KSP unpreconditioned resid norm 8.746482066795e+00 true resid norm 8.746482056876e+00 ||r(i)||/||b|| 2.339360408761e-08 22 KSP unpreconditioned resid norm 4.283455600167e+00 true resid norm 4.283455596172e+00 ||r(i)||/||b|| 1.145665922506e-08 23 KSP unpreconditioned resid norm 2.096047551274e+00 true resid norm 2.096047547699e+00 ||r(i)||/||b|| 5.606151840341e-09 Linear solve converged due to CONVERGED_RTOL iterations 23 KSP Object: 1500 MPI processes type: fgmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-08, absolute=1e-50, divergence=10000. right preconditioning using UNPRECONDITIONED norm type for convergence test PC Object: 1500 MPI processes type: gamg type is MULTIPLICATIVE, levels=6 cycles=v Cycles per PCApply=1 Using externally compute Galerkin coarse grid matrices GAMG specific options Threshold for dropping small values in graph on each level = 0. 0. 0. 0. Threshold scaling factor for each level not specified = 1. AGG specific options Symmetric graph false Number of levels to square graph 1 Number smoothing steps 1 Coarse grid solver -- level ------------------------------- KSP Object: (mg_coarse_) 1500 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_) 1500 MPI processes type: bjacobi number of blocks = 1500 Local solve is same for all blocks, in the following KSP and PC objects: KSP Object: (mg_coarse_sub_) 1 MPI processes type: preonly maximum iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_sub_) 1 MPI processes type: lu out-of-place factorization tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot [INBLOCKS] matrix ordering: nd factor fill ratio given 5., needed 1. Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=12, cols=12, bs=6 package used to perform factorization: petsc total: nonzeros=144, allocated nonzeros=144 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 3 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=12, cols=12, bs=6 total: nonzeros=144, allocated nonzeros=144 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 3 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: 1500 MPI processes type: mpiaij rows=12, cols=12, bs=6 total: nonzeros=144, allocated nonzeros=144 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 3 nodes, limit used is 5 Down solver (pre-smoother) on level 1 ------------------------------- KSP Object: (mg_levels_1_) 1500 MPI processes type: chebyshev eigenvalue estimates used: min = 0.0999807, max = 1.09979 eigenvalues estimate via gmres min 0.310311, max 0.999807 eigenvalues estimated using gmres with translations [0. 0.1; 0. 
1.1] KSP Object: (mg_levels_1_esteig_) 1500 MPI processes type: gmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_1_) 1500 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 1500 MPI processes type: mpiaij rows=312, cols=312, bs=6 total: nonzeros=90792, allocated nonzeros=90792 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 87 nodes, limit used is 5 Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 2 ------------------------------- KSP Object: (mg_levels_2_) 1500 MPI processes type: chebyshev eigenvalue estimates used: min = 0.128747, max = 1.41622 eigenvalues estimate via gmres min 0.191833, max 1.28747 eigenvalues estimated using gmres with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_2_esteig_) 1500 MPI processes type: gmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_2_) 1500 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 1500 MPI processes type: mpiaij rows=9990, cols=9990, bs=6 total: nonzeros=11862180, allocated nonzeros=11862180 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 6 nodes, limit used is 5 Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 3 ------------------------------- KSP Object: (mg_levels_3_) 1500 MPI processes type: chebyshev eigenvalue estimates used: min = 0.149515, max = 1.64466 eigenvalues estimate via gmres min 0.342896, max 1.49515 eigenvalues estimated using gmres with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_3_esteig_) 1500 MPI processes type: gmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_3_) 1500 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. 
linear system matrix = precond matrix: Mat Object: 1500 MPI processes type: mpiaij rows=333960, cols=333960, bs=6 total: nonzeros=289654416, allocated nonzeros=289654416 total number of mallocs used during MatSetValues calls =0 using nonscalable MatPtAP() implementation using I-node (on process 0) routines: found 39 nodes, limit used is 5 Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 4 ------------------------------- KSP Object: (mg_levels_4_) 1500 MPI processes type: chebyshev eigenvalue estimates used: min = 0.173537, max = 1.90891 eigenvalues estimate via gmres min 0.143849, max 1.73537 eigenvalues estimated using gmres with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_4_esteig_) 1500 MPI processes type: gmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_4_) 1500 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 1500 MPI processes type: mpiaij rows=5149116, cols=5149116, bs=6 total: nonzeros=1368332496, allocated nonzeros=1368332496 total number of mallocs used during MatSetValues calls =0 using nonscalable MatPtAP() implementation using I-node (on process 0) routines: found 976 nodes, limit used is 5 Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 5 ------------------------------- KSP Object: (mg_levels_5_) 1500 MPI processes type: chebyshev eigenvalue estimates used: min = 0.241719, max = 2.65891 eigenvalues estimate via gmres min 0.0638427, max 2.41719 eigenvalues estimated using gmres with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_5_esteig_) 1500 MPI processes type: gmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_5_) 1500 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. 
linear system matrix = precond matrix: Mat Object: 1500 MPI processes type: mpiaij rows=117874305, cols=117874305, bs=3 total: nonzeros=9333251991, allocated nonzeros=9333251991 total number of mallocs used during MatSetValues calls =0 has attached near null space using I-node (on process 0) routines: found 26223 nodes, limit used is 5 Up solver (post-smoother) same as down solver (pre-smoother) linear system matrix = precond matrix: Mat Object: 1500 MPI processes type: mpiaij rows=117874305, cols=117874305, bs=3 total: nonzeros=9333251991, allocated nonzeros=9333251991 total number of mallocs used during MatSetValues calls =0 has attached near null space using I-node (on process 0) routines: found 26223 nodes, limit used is 5 From ivan.voznyuk.work at gmail.com Thu Nov 15 10:59:29 2018 From: ivan.voznyuk.work at gmail.com (Ivan Voznyuk) Date: Thu, 15 Nov 2018 17:59:29 +0100 Subject: [petsc-users] petsc4py help with parallel execution In-Reply-To: References: Message-ID: Hi Matthew, Does it mean that by using just command python3 simple_code.py (without mpiexec) you *cannot* obtain a parallel execution? It s been 5 days we are trying to understand with my colleague how he managed to do so. It means that by using simply python3 simple_code.py he gets 8 processors workiing. By the way, we wrote in his code few lines: rank = PETSc.COMM_WORLD.Get_rank() size = PETSc.COMM_WORLD.Get_size() and we got rank = 0, size = 1 However, we compilator arrives to KSP.solve(), somehow it turns on 8 processors. This problem is solved on his PC in 5-8 sec (in parallel, using *python3 simple_code.py*), on mine it takes 70-90 secs (in sequantial, but with the same command *python3 simple_code.py*) So, conclusion is that on his computer this code works in the same way as scipy: all the code is executed in sequantial mode, but when it comes to solution of system of linear equations, it runs on all available processors. All this with just running python3 my_code.py (without any mpi-smth) Is it an exception / abnormal behavior? I mean, is it something irregular that you, developers, have never seen? Thanks and have a good evening! Ivan P.S. I don't think I know the answer regarding Scipy... On Thu, Nov 15, 2018 at 2:39 PM Matthew Knepley wrote: > On Thu, Nov 15, 2018 at 8:07 AM Ivan Voznyuk > wrote: > >> Hi Matthew, >> Thanks for your reply! >> >> Let me precise what I mean by defining few questions: >> >> 1. In order to obtain a parallel execution of simple_code.py, do I need >> to go with mpiexec python3 simple_code.py, or I can just launch python3 >> simple_code.py? >> > > mpiexec -n 2 python3 simple_code.py > > >> 2. This simple_code.py consists of 2 parts: a) preparation of matrix b) >> solving the system of linear equations with PETSc. If I launch mpirun (or >> mpiexec) -np 8 python3 simple_code.py, I suppose that I will basically >> obtain 8 matrices and 8 systems to solve. However, I need to prepare only >> one matrix, but launch this code in parallel on 8 processors. >> > > When you create the Mat object, you give it a communicator (here > PETSC_COMM_WORLD). That allows us to distribute the data. This is all > covered extensively in the manual and the online tutorials, as well as the > example code. > > >> In fact, here attached you will find a similar code (scipy_code.py) with >> only one difference: the system of linear equations is solved with scipy. >> So when I solve it, I can clearly see that the solution is obtained in a >> parallel way. However, I do not use the command mpirun (or mpiexec). 
I just >> go with python3 scipy_code.py. >> > > Why do you think its running in parallel? > > Thanks, > > Matt > > >> In this case, the first part (creation of the sparse matrix) is not >> parallel, whereas the solution of system is found in a parallel way. >> So my question is, Do you think that it s possible to have the same >> behavior with PETSC? And what do I need for this? >> >> I am asking this because for my colleague it worked! It means that he >> launches the simple_code.py on his computer using the command python3 >> simple_code.py (and not mpi-smth python3 simple_code.py) and he obtains a >> parallel execution of the same code. >> >> Thanks for your help! >> Ivan >> >> >> On Thu, Nov 15, 2018 at 11:54 AM Matthew Knepley >> wrote: >> >>> On Thu, Nov 15, 2018 at 4:53 AM Ivan Voznyuk via petsc-users < >>> petsc-users at mcs.anl.gov> wrote: >>> >>>> Dear PETSC community, >>>> >>>> I have a question regarding the parallel execution of petsc4py. >>>> >>>> I have a simple code (here attached simple_code.py) which solves a >>>> system of linear equations Ax=b using petsc4py. To execute it, I use the >>>> command python3 simple_code.py which yields a sequential performance. With >>>> a colleague of my, we launched this code on his computer, and this time the >>>> execution was in parallel. Although, he used the same command python3 >>>> simple_code.py (without mpirun, neither mpiexec). >>>> >>>> I am not sure what you mean. To run MPI programs in parallel, you need >>> a launcher like mpiexec or mpirun. There are Python programs (like nemesis) >>> that use the launcher API directly (called PMI), but that is not part of >>> petsc4py. >>> >>> Thanks, >>> >>> Matt >>> >>>> My configuration: Ubuntu x86_64 Ubuntu 16.04, Intel Core i7, PETSc >>>> 3.10.2, PETSC_ARCH=arch-linux2-c-debug, petsc4py 3.10.0 in virtualenv >>>> >>>> In order to parallelize it, I have already tried: >>>> - use 2 different PCs >>>> - use Ubuntu 16.04, 18.04 >>>> - use different architectures (arch-linux2-c-debug, linux-gnu-c-debug, >>>> etc) >>>> - ofc use different configurations (my present config can be found in >>>> make.log that I attached here) >>>> - mpi from mpich, openmpi >>>> >>>> Nothing worked. >>>> >>>> Do you have any ideas? >>>> >>>> Thanks and have a good day, >>>> Ivan >>>> >>>> -- >>>> Ivan VOZNYUK >>>> PhD in Computational Electromagnetics >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >>> https://www.cse.buffalo.edu/~knepley/ >>> >>> >> >> >> -- >> Ivan VOZNYUK >> PhD in Computational Electromagnetics >> +33 (0)6.95.87.04.55 >> My webpage >> My LinkedIn >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -- Ivan VOZNYUK PhD in Computational Electromagnetics +33 (0)6.95.87.04.55 My webpage My LinkedIn -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From knepley at gmail.com Thu Nov 15 11:07:49 2018 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 15 Nov 2018 12:07:49 -0500 Subject: [petsc-users] petsc4py help with parallel execution In-Reply-To: References: Message-ID: On Thu, Nov 15, 2018 at 11:59 AM Ivan Voznyuk wrote: > Hi Matthew, > > Does it mean that by using just command python3 simple_code.py (without > mpiexec) you *cannot* obtain a parallel execution? > As I wrote before, its not impossible. You could be directly calling PMI, but I do not think you are doing that. > It s been 5 days we are trying to understand with my colleague how he > managed to do so. > It means that by using simply python3 simple_code.py he gets 8 processors > workiing. > By the way, we wrote in his code few lines: > rank = PETSc.COMM_WORLD.Get_rank() > size = PETSc.COMM_WORLD.Get_size() > and we got rank = 0, size = 1 > This is MPI telling you that you are only running on 1 processes. > However, we compilator arrives to KSP.solve(), somehow it turns on 8 > processors. > Why do you think its running on 8 processes? > This problem is solved on his PC in 5-8 sec (in parallel, using *python3 > simple_code.py*), on mine it takes 70-90 secs (in sequantial, but with > the same command *python3 simple_code.py*) > I think its much more likely that there are differences in the solver (use -ksp_view to see exactly what solver was used), then to think it is parallelism. Moreover, you would never ever ever see that much speedup on a laptop since all these computations are bandwidth limited. Thanks, Matt > So, conclusion is that on his computer this code works in the same way as > scipy: all the code is executed in sequantial mode, but when it comes to > solution of system of linear equations, it runs on all available > processors. All this with just running python3 my_code.py (without any > mpi-smth) > > Is it an exception / abnormal behavior? I mean, is it something irregular > that you, developers, have never seen? > > Thanks and have a good evening! > Ivan > > P.S. I don't think I know the answer regarding Scipy... > > > On Thu, Nov 15, 2018 at 2:39 PM Matthew Knepley wrote: > >> On Thu, Nov 15, 2018 at 8:07 AM Ivan Voznyuk >> wrote: >> >>> Hi Matthew, >>> Thanks for your reply! >>> >>> Let me precise what I mean by defining few questions: >>> >>> 1. In order to obtain a parallel execution of simple_code.py, do I need >>> to go with mpiexec python3 simple_code.py, or I can just launch python3 >>> simple_code.py? >>> >> >> mpiexec -n 2 python3 simple_code.py >> >> >>> 2. This simple_code.py consists of 2 parts: a) preparation of matrix b) >>> solving the system of linear equations with PETSc. If I launch mpirun (or >>> mpiexec) -np 8 python3 simple_code.py, I suppose that I will basically >>> obtain 8 matrices and 8 systems to solve. However, I need to prepare only >>> one matrix, but launch this code in parallel on 8 processors. >>> >> >> When you create the Mat object, you give it a communicator (here >> PETSC_COMM_WORLD). That allows us to distribute the data. This is all >> covered extensively in the manual and the online tutorials, as well as the >> example code. >> >> >>> In fact, here attached you will find a similar code (scipy_code.py) with >>> only one difference: the system of linear equations is solved with scipy. >>> So when I solve it, I can clearly see that the solution is obtained in a >>> parallel way. However, I do not use the command mpirun (or mpiexec). I just >>> go with python3 scipy_code.py. 
>>> >> >> Why do you think its running in parallel? >> >> Thanks, >> >> Matt >> >> >>> In this case, the first part (creation of the sparse matrix) is not >>> parallel, whereas the solution of system is found in a parallel way. >>> So my question is, Do you think that it s possible to have the same >>> behavior with PETSC? And what do I need for this? >>> >>> I am asking this because for my colleague it worked! It means that he >>> launches the simple_code.py on his computer using the command python3 >>> simple_code.py (and not mpi-smth python3 simple_code.py) and he obtains a >>> parallel execution of the same code. >>> >>> Thanks for your help! >>> Ivan >>> >>> >>> On Thu, Nov 15, 2018 at 11:54 AM Matthew Knepley >>> wrote: >>> >>>> On Thu, Nov 15, 2018 at 4:53 AM Ivan Voznyuk via petsc-users < >>>> petsc-users at mcs.anl.gov> wrote: >>>> >>>>> Dear PETSC community, >>>>> >>>>> I have a question regarding the parallel execution of petsc4py. >>>>> >>>>> I have a simple code (here attached simple_code.py) which solves a >>>>> system of linear equations Ax=b using petsc4py. To execute it, I use the >>>>> command python3 simple_code.py which yields a sequential performance. With >>>>> a colleague of my, we launched this code on his computer, and this time the >>>>> execution was in parallel. Although, he used the same command python3 >>>>> simple_code.py (without mpirun, neither mpiexec). >>>>> >>>>> I am not sure what you mean. To run MPI programs in parallel, you need >>>> a launcher like mpiexec or mpirun. There are Python programs (like nemesis) >>>> that use the launcher API directly (called PMI), but that is not part of >>>> petsc4py. >>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>>> My configuration: Ubuntu x86_64 Ubuntu 16.04, Intel Core i7, PETSc >>>>> 3.10.2, PETSC_ARCH=arch-linux2-c-debug, petsc4py 3.10.0 in virtualenv >>>>> >>>>> In order to parallelize it, I have already tried: >>>>> - use 2 different PCs >>>>> - use Ubuntu 16.04, 18.04 >>>>> - use different architectures (arch-linux2-c-debug, linux-gnu-c-debug, >>>>> etc) >>>>> - ofc use different configurations (my present config can be found in >>>>> make.log that I attached here) >>>>> - mpi from mpich, openmpi >>>>> >>>>> Nothing worked. >>>>> >>>>> Do you have any ideas? >>>>> >>>>> Thanks and have a good day, >>>>> Ivan >>>>> >>>>> -- >>>>> Ivan VOZNYUK >>>>> PhD in Computational Electromagnetics >>>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>>> https://www.cse.buffalo.edu/~knepley/ >>>> >>>> >>> >>> >>> -- >>> Ivan VOZNYUK >>> PhD in Computational Electromagnetics >>> +33 (0)6.95.87.04.55 >>> My webpage >>> My LinkedIn >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> > > > -- > Ivan VOZNYUK > PhD in Computational Electromagnetics > +33 (0)6.95.87.04.55 > My webpage > My LinkedIn > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From knepley at gmail.com Thu Nov 15 11:19:38 2018 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 15 Nov 2018 12:19:38 -0500 Subject: [petsc-users] GAMG Parallel Performance In-Reply-To: References: Message-ID: On Thu, Nov 15, 2018 at 11:52 AM Karin&NiKo via petsc-users < petsc-users at mcs.anl.gov> wrote: > Dear PETSc team, > > I am solving a linear transient dynamic problem, based on a discretization > with finite elements. To do that, I am using FGMRES with GAMG as a > preconditioner. I consider here 10 time steps. > The problem has round to 118e6 dof and I am running on 1000, 1500 and 2000 > procs. So I have something like 100e3, 78e3 and 50e3 dof/proc. > I notice that the performance deteriorates when I increase the number of > processes. > You can find as attached file the log_view of the execution and the > detailled definition of the KSP. > > Is the problem too small to run on that number of processes or is there > something wrong with my use of GAMG? > I am having a hard time understanding the data. Just to be clear, I understand you to be running the exact same problem on 1000, 1500, and 2000 processes, so looking for strong speedup. The PCSetUp time actually sped up a little, which is great, and its still a small percentage (notice that your whole solve is only half the runtime). Lets just look at a big time component, MatMult, P = 1000 MatMult 7342 1.0 4.4956e+01 1.4 4.09e+10 1.2 9.6e+07 4.3e+03 0.0e+00 23 53 81 86 0 23 53 81 86 0 859939 P = 2000 MatMult 7470 1.0 4.7611e+01 1.9 2.11e+10 1.2 2.0e+08 2.9e+03 0.0e+00 11 53 81 86 0 11 53 81 86 0 827107 So there was no speedup at all. It is doing 1/2 the flops per process, but taking almost exactly the same time. This looks like your 2000 process run is on exactly the same number of nodes as your 1000 process run, but you just use more processes. Your 1000 process run was maxing out the bandwidth of those nodes, and thus 2000 runs no faster. Is this true? Otherwise, I am misunderstanding the run. Thanks, Matt > I thank you in advance for your help, > Nicolas > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From stefano.zampini at gmail.com Thu Nov 15 11:24:18 2018 From: stefano.zampini at gmail.com (Stefano Zampini) Date: Thu, 15 Nov 2018 20:24:18 +0300 Subject: [petsc-users] petsc4py help with parallel execution In-Reply-To: References: Message-ID: If you say your program is parallel by just looking at the output from the top command, you are probably linking against a multithreaded blas library Il giorno Gio 15 Nov 2018, 20:09 Matthew Knepley via petsc-users < petsc-users at mcs.anl.gov> ha scritto: > On Thu, Nov 15, 2018 at 11:59 AM Ivan Voznyuk > wrote: > >> Hi Matthew, >> >> Does it mean that by using just command python3 simple_code.py (without >> mpiexec) you *cannot* obtain a parallel execution? >> > > As I wrote before, its not impossible. You could be directly calling PMI, > but I do not think you are doing that. > > >> It s been 5 days we are trying to understand with my colleague how he >> managed to do so. >> It means that by using simply python3 simple_code.py he gets 8 processors >> workiing. 
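
The MatMult comparison in Matt's GAMG reply above reduces to a couple of ratios. A minimal sketch in Python, with every figure copied verbatim from the quoted -log_view excerpts (the variable names are purely illustrative; nothing here is newly measured):

    # MatMult event from the two -log_view runs quoted above.
    # "t" is the Max time column (seconds), "flop" the Max flop per process.
    t_1000, flop_1000 = 4.4956e+01, 4.09e+10   # P = 1000
    t_2000, flop_2000 = 4.7611e+01, 2.11e+10   # P = 2000

    print("speedup 1000 -> 2000 procs:", t_1000 / t_2000)    # ~0.94, i.e. none
    print("work per process ratio:   ", flop_2000 / flop_1000)  # ~0.52, halved

Half the work per process at essentially unchanged wall time is exactly what a memory-bandwidth-bound kernel looks like once the nodes are saturated, which is Matt's point about the 2000-process run sitting on the same nodes as the 1000-process run.
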
>> By the way, we wrote in his code few lines: >> rank = PETSc.COMM_WORLD.Get_rank() >> size = PETSc.COMM_WORLD.Get_size() >> and we got rank = 0, size = 1 >> > > This is MPI telling you that you are only running on 1 processes. > > >> However, we compilator arrives to KSP.solve(), somehow it turns on 8 >> processors. >> > > Why do you think its running on 8 processes? > > >> This problem is solved on his PC in 5-8 sec (in parallel, using *python3 >> simple_code.py*), on mine it takes 70-90 secs (in sequantial, but with >> the same command *python3 simple_code.py*) >> > > I think its much more likely that there are differences in the solver (use > -ksp_view to see exactly what solver was used), then > to think it is parallelism. Moreover, you would never ever ever see that > much speedup on a laptop since all these computations > are bandwidth limited. > > Thanks, > > Matt > > >> So, conclusion is that on his computer this code works in the same way as >> scipy: all the code is executed in sequantial mode, but when it comes to >> solution of system of linear equations, it runs on all available >> processors. All this with just running python3 my_code.py (without any >> mpi-smth) >> >> Is it an exception / abnormal behavior? I mean, is it something irregular >> that you, developers, have never seen? >> >> Thanks and have a good evening! >> Ivan >> >> P.S. I don't think I know the answer regarding Scipy... >> >> >> On Thu, Nov 15, 2018 at 2:39 PM Matthew Knepley >> wrote: >> >>> On Thu, Nov 15, 2018 at 8:07 AM Ivan Voznyuk < >>> ivan.voznyuk.work at gmail.com> wrote: >>> >>>> Hi Matthew, >>>> Thanks for your reply! >>>> >>>> Let me precise what I mean by defining few questions: >>>> >>>> 1. In order to obtain a parallel execution of simple_code.py, do I need >>>> to go with mpiexec python3 simple_code.py, or I can just launch python3 >>>> simple_code.py? >>>> >>> >>> mpiexec -n 2 python3 simple_code.py >>> >>> >>>> 2. This simple_code.py consists of 2 parts: a) preparation of matrix b) >>>> solving the system of linear equations with PETSc. If I launch mpirun (or >>>> mpiexec) -np 8 python3 simple_code.py, I suppose that I will basically >>>> obtain 8 matrices and 8 systems to solve. However, I need to prepare only >>>> one matrix, but launch this code in parallel on 8 processors. >>>> >>> >>> When you create the Mat object, you give it a communicator (here >>> PETSC_COMM_WORLD). That allows us to distribute the data. This is all >>> covered extensively in the manual and the online tutorials, as well as the >>> example code. >>> >>> >>>> In fact, here attached you will find a similar code (scipy_code.py) >>>> with only one difference: the system of linear equations is solved with >>>> scipy. So when I solve it, I can clearly see that the solution is obtained >>>> in a parallel way. However, I do not use the command mpirun (or mpiexec). I >>>> just go with python3 scipy_code.py. >>>> >>> >>> Why do you think its running in parallel? >>> >>> Thanks, >>> >>> Matt >>> >>> >>>> In this case, the first part (creation of the sparse matrix) is not >>>> parallel, whereas the solution of system is found in a parallel way. >>>> So my question is, Do you think that it s possible to have the same >>>> behavior with PETSC? And what do I need for this? >>>> >>>> I am asking this because for my colleague it worked! 
It means that he >>>> launches the simple_code.py on his computer using the command python3 >>>> simple_code.py (and not mpi-smth python3 simple_code.py) and he obtains a >>>> parallel execution of the same code. >>>> >>>> Thanks for your help! >>>> Ivan >>>> >>>> >>>> On Thu, Nov 15, 2018 at 11:54 AM Matthew Knepley >>>> wrote: >>>> >>>>> On Thu, Nov 15, 2018 at 4:53 AM Ivan Voznyuk via petsc-users < >>>>> petsc-users at mcs.anl.gov> wrote: >>>>> >>>>>> Dear PETSC community, >>>>>> >>>>>> I have a question regarding the parallel execution of petsc4py. >>>>>> >>>>>> I have a simple code (here attached simple_code.py) which solves a >>>>>> system of linear equations Ax=b using petsc4py. To execute it, I use the >>>>>> command python3 simple_code.py which yields a sequential performance. With >>>>>> a colleague of my, we launched this code on his computer, and this time the >>>>>> execution was in parallel. Although, he used the same command python3 >>>>>> simple_code.py (without mpirun, neither mpiexec). >>>>>> >>>>>> I am not sure what you mean. To run MPI programs in parallel, you >>>>> need a launcher like mpiexec or mpirun. There are Python programs (like >>>>> nemesis) that use the launcher API directly (called PMI), but that is not >>>>> part of petsc4py. >>>>> >>>>> Thanks, >>>>> >>>>> Matt >>>>> >>>>>> My configuration: Ubuntu x86_64 Ubuntu 16.04, Intel Core i7, PETSc >>>>>> 3.10.2, PETSC_ARCH=arch-linux2-c-debug, petsc4py 3.10.0 in virtualenv >>>>>> >>>>>> In order to parallelize it, I have already tried: >>>>>> - use 2 different PCs >>>>>> - use Ubuntu 16.04, 18.04 >>>>>> - use different architectures (arch-linux2-c-debug, >>>>>> linux-gnu-c-debug, etc) >>>>>> - ofc use different configurations (my present config can be found in >>>>>> make.log that I attached here) >>>>>> - mpi from mpich, openmpi >>>>>> >>>>>> Nothing worked. >>>>>> >>>>>> Do you have any ideas? >>>>>> >>>>>> Thanks and have a good day, >>>>>> Ivan >>>>>> >>>>>> -- >>>>>> Ivan VOZNYUK >>>>>> PhD in Computational Electromagnetics >>>>>> >>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>>> https://www.cse.buffalo.edu/~knepley/ >>>>> >>>>> >>>> >>>> >>>> -- >>>> Ivan VOZNYUK >>>> PhD in Computational Electromagnetics >>>> +33 (0)6.95.87.04.55 >>>> My webpage >>>> My LinkedIn >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >>> https://www.cse.buffalo.edu/~knepley/ >>> >>> >> >> >> -- >> Ivan VOZNYUK >> PhD in Computational Electromagnetics >> +33 (0)6.95.87.04.55 >> My webpage >> My LinkedIn >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ivan.voznyuk.work at gmail.com Thu Nov 15 11:33:45 2018 From: ivan.voznyuk.work at gmail.com (Ivan) Date: Thu, 15 Nov 2018 18:33:45 +0100 Subject: [petsc-users] petsc4py help with parallel execution In-Reply-To: References: Message-ID: <51ee86a3-2848-27d8-e93a-4fae372e3a32@gmail.com> Matthew, */As I wrote before, its not impossible. 
You could be directly calling PMI, but I do not think you are doing that.

Could you clarify what PMI is, and how we can use it directly? It might be the key to this mystery!

> Why do you think its running on 8 processes?

Well, we base our opinion on 3 points:
1) htop shows a total load of 8 processors
2) the system monitor shows the same behavior
3) time: 8 seconds vs 70 seconds, although we have very similar PC configs

> I think its much more likely that there are differences in the solver (use -ksp_view to see exactly what solver was used), then to think it is parallelism.

We actually use identical code. Or do you think that, independently of this fact, and of the fact that we specify "ksp.getPC().setFactorSolverType('mumps')" in the code, KSP may solve the system of equations using a different solver?

> Moreover, you would never ever ever see that much speedup on a laptop since all these computations are bandwidth limited.

I agree with this point. But I would think that, given that his computer is a bit more powerful and his code is executed in parallel, we might see an acceleration. We have, for example, tested other, more physics-heavy codes, and noted accelerations of x4 - x6.

Thank you for your contribution,

Ivan

On 15/11/2018 18:07, Matthew Knepley wrote:
> [...]

From stefano.zampini at gmail.com Thu Nov 15 11:37:04 2018
From: stefano.zampini at gmail.com (Stefano Zampini)
Date: Thu, 15 Nov 2018 20:37:04 +0300
Subject: [petsc-users] petsc4py help with parallel execution
In-Reply-To: <51ee86a3-2848-27d8-e93a-4fae372e3a32@gmail.com>
References: <51ee86a3-2848-27d8-e93a-4fae372e3a32@gmail.com>
Message-ID: 

On Thu, 15 Nov 2018 at 20:35, Ivan via petsc-users wrote:
> Matthew,
> Could you clarify what PMI is, and how we can use it directly? It might be the key to this mystery!

There are no mysteries. The BLAS is multithreaded.

> [...]

From ivan.voznyuk.work at gmail.com Thu Nov 15 11:43:33 2018
From: ivan.voznyuk.work at gmail.com (Ivan)
Date: Thu, 15 Nov 2018 18:43:33 +0100
Subject: [petsc-users] petsc4py help with parallel execution
In-Reply-To: 
References: 
Message-ID: 

Hi Stefano,

In fact, yes, we look at the htop output (and the resulting computational time, of course).

In our code we use MUMPS, which indeed depends on BLAS / LAPACK. So I think this might be it!

I will definitely check it (I mean the difference between our MUMPS, BLAS, and LAPACK builds).

If you have an idea of how we can verify on his PC that the source of his parallelization does come from BLAS, please do not hesitate to tell me!

Thanks!

Ivan

On 15/11/2018 18:24, Stefano Zampini wrote:
> If you say your program is parallel by just looking at the output from the top command, you are probably linking against a multithreaded BLAS library
> [...]
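A quick way to separate MPI parallelism from BLAS threading is to print the communicator size from a minimal PETSc program. A sketch (the program is illustrative, not from the thread; it assumes a standard PETSc build):

#include <petscsys.h>

int main(int argc, char **argv)
{
  PetscMPIInt    rank, size;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank); CHKERRQ(ierr);
  ierr = MPI_Comm_size(PETSC_COMM_WORLD, &size); CHKERRQ(ierr);
  /* Every rank reports itself; launched without mpiexec this prints "rank 0 of 1" once */
  ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD, "rank %d of %d\n", rank, size); CHKERRQ(ierr);
  ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT); CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

If this reports a size of 1 while htop shows all cores busy during the factorization, the extra activity comes from threads (for example a threaded BLAS inside MUMPS), not from MPI ranks.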
From bsmith at mcs.anl.gov Thu Nov 15 12:16:15 2018
From: bsmith at mcs.anl.gov (Smith, Barry F.)
Date: Thu, 15 Nov 2018 18:16:15 +0000
Subject: [petsc-users] On unknown ordering
In-Reply-To: <3BDDA21B-2493-4706-8F25-738E83C8FF8C@imperial.ac.uk>
References: <3BDDA21B-2493-4706-8F25-738E83C8FF8C@imperial.ac.uk>
Message-ID: 

> On Nov 15, 2018, at 4:48 AM, Appel, Thibaut via petsc-users wrote:
> 
> Good morning,
> 
> I would like to ask about the importance of the initial choice of ordering the unknowns when feeding a matrix to PETSc.
> 
> I have a regular grid, using high-order finite differences, and I simply divide the rows of the matrix with PetscSplitOwnership, using vertex-major, natural ordering for the parallelism (not using DMDA).

   So each process is getting a slice of the domain? To minimize communication it is best to use "square-ish" subdomains instead of slices; this is why the DMDA tries to use "square-ish" subdomains. I don't know the relationship between convergence rate and the shapes of the subdomains; it will depend on the operator and possibly "flow direction", etc.

> 
> My understanding is that when using LU-MUMPS, this does not matter because either serial or parallel analysis is performed and all the rows are reordered "optimally" before the LU factorization.

   The quality of the reordering might suffer from parallel analysis.

> 
> But if I use the default block Jacobi with ILU with one block per processor, the initial ordering seems to have an influence because some tightly coupled degrees of freedom might lie on different processes and the ILU becomes less powerful. You can change the ordering on each block but this won't necessarily make things better.
> 
> Are my observations accurate? Is there a recommended ordering type for a block Jacobi approach in my case? Could I expect natural improvements in fill-in or better GMRES robustness opting for the parallelism offered by DMDA?

    You might consider using -pc_type asm (additive Schwarz method) instead of block Jacobi. This "reintroduces" some of the tight coupling that is discarded when slicing up the domain for block Jacobi.

   Barry

> 
> Thank you,
> 
> Thibaut
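As a concrete starting point for that suggestion, switching from the default block Jacobi to overlapping additive Schwarz is purely a run-time option change; something like the following, where the executable name, the overlap of 1 and the single ILU fill level are placeholders to be tuned rather than values taken from this thread:

  mpiexec -n 8 ./mycode -ksp_type gmres -pc_type asm -pc_asm_overlap 1 -sub_pc_type ilu -sub_pc_factor_levels 1

Increasing -pc_asm_overlap recovers more of the coupling across subdomain boundaries, at the cost of more communication and larger local subproblems.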
From bsmith at mcs.anl.gov Thu Nov 15 12:56:05 2018
From: bsmith at mcs.anl.gov (Smith, Barry F.)
Date: Thu, 15 Nov 2018 18:56:05 +0000
Subject: [petsc-users] GAMG Parallel Performance
In-Reply-To: 
References: 
Message-ID: <62EA5AF0-B8AC-4A46-AC1E-3CEC8C4A1386@anl.gov>

   Something is odd about your configuration. Just consider the time for VecMAXPY, which is an embarrassingly parallel operation. On 1000 MPI processes it produces

                               Time                                              flop rate
VecMAXPY       575 1.0 8.4132e-01 1.5 1.36e+09 1.0 0.0e+00 0.0e+00 0.0e+00  0  2  0  0  0   0  2  0  0  0 1,600,021

   on 1500 processes it produces

VecMAXPY       583 1.0 1.0786e+00 3.4 9.38e+08 1.0 0.0e+00 0.0e+00 0.0e+00  0  2  0  0  0   0  2  0  0  0 1,289,187

   that is, it actually takes longer (the time goes from .84 seconds to 1.08 seconds, and the flop rate drops from 1,600,021 to 1,289,187). You would never expect this kind of behavior,

   and on 2000 processes it produces

VecMAXPY       583 1.0 7.1103e-01 2.7 7.03e+08 1.0 0.0e+00 0.0e+00 0.0e+00  0  2  0  0  0   0  2  0  0  0 1,955,563

   so it speeds up again, but not by very much. This is very mysterious and not what you would expect.

    I'm inclined to believe something is out of whack on your computer. Are you sure all nodes on the computer are equivalent? Same processors, same clock speeds? What happens if you run the 1000 process case several times? Do you get very similar numbers for VecMAXPY()? You should, but I am guessing you may not.

   Barry

   Note that this performance issue doesn't really have anything to do with the preconditioner you are using.

> On Nov 15, 2018, at 10:50 AM, Karin&NiKo via petsc-users wrote:
> 
> Dear PETSc team,
> 
> I am solving a linear transient dynamic problem, based on a discretization with finite elements. To do that, I am using FGMRES with GAMG as a preconditioner. I consider here 10 time steps.
> The problem has around 118e6 dof and I am running on 1000, 1500 and 2000 procs. So I have something like 100e3, 78e3 and 50e3 dof/proc.
> I notice that the performance deteriorates when I increase the number of processes.
> You can find as attached file the log_view of the execution and the detailed definition of the KSP.
> 
> Is the problem too small to run on that number of processes, or is there something wrong with my use of GAMG?
> 
> I thank you in advance for your help,
> Nicolas
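For readers decoding the -log_view lines above: the first number after the event name is the call count, the time column (8.4132e-01 seconds here) is the maximum over all processes, and the number immediately after it (1.5) is the ratio of the maximum to the minimum time across processes. So the 3.4 on 1500 processes means the slowest rank spends 3.4 times as long in this embarrassingly parallel operation as the fastest one, which points at either a lopsided partition or inhomogeneous (or oversubscribed) hardware.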
From mfadams at lbl.gov Thu Nov 15 13:02:40 2018
From: mfadams at lbl.gov (Mark Adams)
Date: Thu, 15 Nov 2018 14:02:40 -0500
Subject: [petsc-users] GAMG Parallel Performance
In-Reply-To: <62EA5AF0-B8AC-4A46-AC1E-3CEC8C4A1386@anl.gov>
References: <62EA5AF0-B8AC-4A46-AC1E-3CEC8C4A1386@anl.gov>
Message-ID: 

There is a lot of load imbalance in VecMAXPY also. The partitioning could be bad, and if not, it's the machine.

On Thu, Nov 15, 2018 at 1:56 PM Smith, Barry F. via petsc-users wrote:
> Something is odd about your configuration. Just consider the time for VecMAXPY which is an embarrassingly parallel operation. [...]

From bsmith at mcs.anl.gov Thu Nov 15 13:50:25 2018
From: bsmith at mcs.anl.gov (Smith, Barry F.)
Date: Thu, 15 Nov 2018 19:50:25 +0000
Subject: [petsc-users] GAMG Parallel Performance
In-Reply-To: 
References: <62EA5AF0-B8AC-4A46-AC1E-3CEC8C4A1386@anl.gov>
Message-ID: 

> On Nov 15, 2018, at 1:02 PM, Mark Adams wrote:
> 
> There is a lot of load imbalance in VecMAXPY also. The partitioning could be bad and if not its the machine.
> [...]

From niko.karin at gmail.com Thu Nov 15 17:24:42 2018
From: niko.karin at gmail.com (Karin&NiKo)
Date: Fri, 16 Nov 2018 00:24:42 +0100
Subject: [petsc-users] GAMG Parallel Performance
In-Reply-To: 
References: <62EA5AF0-B8AC-4A46-AC1E-3CEC8C4A1386@anl.gov>
Message-ID: 

Ok. I will do that soon and I will let you know.
Thanks again,
Nicolas

On Thu, 15 Nov 2018 at 20:50, Smith, Barry F. wrote:
> [...]
From dave.mayhem23 at gmail.com Thu Nov 15 18:25:09 2018
From: dave.mayhem23 at gmail.com (Dave May)
Date: Fri, 16 Nov 2018 00:25:09 +0000
Subject: [petsc-users] petsc4py help with parallel execution
In-Reply-To: 
References: 
Message-ID: 

On Thu, 15 Nov 2018 at 17:44, Ivan via petsc-users wrote:
> If you have an idea of how we can verify on his PC that the source of his parallelization does come from BLAS, please do not hesitate to tell me!

Option 1/
* Set this environment variable:
  export OMP_NUM_THREADS=1
* Re-run your "parallel" test.
* If the performance differs (job runs slower) compared with your previous run where you inferred parallelism was being employed, you can safely assume that the parallelism observed comes from threads.

Option 2/
* Re-configure PETSc to use a known BLAS implementation which does not support threads.
* Re-compile PETSc.
* Re-run your parallel test.
* If the performance differs (job runs slower) compared with your previous run where you inferred parallelism was being employed, you can safely assume that the parallelism observed comes from threads.

Option 3/
* Use a PC which does not depend on BLAS at all, e.g. -pc_type jacobi or -pc_type bjacobi.
* If the performance differs (job runs slower) compared with your previous run where you inferred parallelism was being employed, you can safely assume that the parallelism observed comes from BLAS + threads.

> Thanks!
> Ivan
> [...]
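Putting Option 1 together for the case in this thread, the whole check is a pair of shell commands (assuming a bash-like shell; Intel MKL also honors MKL_NUM_THREADS and OpenBLAS honors OPENBLAS_NUM_THREADS, so setting those as well is a safe variant):

  export OMP_NUM_THREADS=1
  python3 simple_code.py

If the solve time jumps back to the sequential figure, the earlier speedup came from BLAS threads inside MUMPS rather than from MPI.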
From amfoggia at gmail.com Fri Nov 16 02:38:41 2018
From: amfoggia at gmail.com (Ale Foggia)
Date: Fri, 16 Nov 2018 09:38:41 +0100
Subject: [petsc-users] [SLEPc] Krylov-Schur convergence
In-Reply-To: <44298504-2FB3-4CC1-94D5-9021915E295E@dsic.upv.es>
References: <470A7599-766D-4477-A298-473A25B6B71D@dsic.upv.es> <44298504-2FB3-4CC1-94D5-9021915E295E@dsic.upv.es>
Message-ID: 

> One thing you can do is use a symmetric matrix format: -mat_type sbaij
> In this way, your matrix will always be symmetric because only the upper triangular part is stored. The drawback is that efficiency will likely decrease.
>
> To check symmetry, one (possibly bad) way is to take a bunch of random vectors X and check that X'*A*X is symmetric. This can be easily done with SLEPc's BVMatProject, see test9.c under $SLEPC_DIR/src/sys/classes/bv/examples/tests
>
> Jose

Is it possible that the asymmetry arises because of a numerical issue? It starts happening when my numbers are bigger than 2**32. I've checked that I'm always using PetscInt, and I compiled the library with "64 bit integers". It seems strange to me that it only happens after some point, and that the asymmetry is bigger than 10**-2.

> Looking at the failure in the debugger would really help us, for example with a stack trace.
>
> Thanks,
>
> Matt

I'm trying to use the debugger, but I'm having problems with missing symbols and libraries. I send you the output at the end of the run; maybe it helps. Meanwhile I'll keep trying to get the debugger stack trace.

The error that stops the program only occurs when I run it on separate nodes; when I run the code on the login node it does not give any problem.

[296]PETSC ERROR: Caught signal number 7 BUS: Bus Error, possibly illegal memory access
slurmstepd: error: Detected 14 oom-kill event(s) in step 2596337.0 cgroup. Some of your processes may have been killed by the cgroup out-of-memory handler.
[296]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[296]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[296]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[296]PETSC ERROR: likely location of problem given in stack below
[296]PETSC ERROR: --------------------- Stack Frames ------------------------------------
Fatal error in PMPI_Waitall: Other MPI error, error stack:
PMPI_Waitall(405)...............: MPI_Waitall(count=319, req_array=0x1f44940, status_array=0x1f3a9c0) failed
MPIR_Waitall_impl(221)..........: fail failed
PMPIDI_CH3I_Progress(623).......: fail failed
pkt_RTS_handler(317)............: fail failed
do_cts(662).....................: fail failed
MPID_nem_lmt_dcp_start_recv(302): fail failed
dcp_recv(165)...................: Internal MPI error! Cannot read from remote process
Two workarounds have been identified for this issue:
1) Enable ptrace for non-root users with:
   echo 0 | sudo tee /proc/sys/kernel/yama/ptrace_scope
2) Or, use: I_MPI_SHM_LMT=shm
[296]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[296]PETSC ERROR: INSTEAD the line number of the start of the function
[296]PETSC ERROR: is given.
[296]PETSC ERROR: [296] MatCreateSubMatrices_MPIAIJ_Local line 2100 /opt/lib/petsc-3.9.3/src/mat/impls/aij/mpi/mpiov.c
[296]PETSC ERROR: [296] MatCreateSubMatrices_MPIAIJ line 1977 /opt/lib/petsc-3.9.3/src/mat/impls/aij/mpi/mpiov.c
[296]PETSC ERROR: [296] MatCreateSubMatrices line 6693 /opt/lib/petsc-3.9.3/src/mat/interface/matrix.c
[296]PETSC ERROR: [296] MatIsTranspose_MPIAIJ line 1019 /opt/lib/petsc-3.9.3/src/mat/impls/aij/mpi/mpiaij.c
[296]PETSC ERROR: [296] MatIsSymmetric_MPIAIJ line 1053 /opt/lib/petsc-3.9.3/src/mat/impls/aij/mpi/mpiaij.c
[296]PETSC ERROR: [296] MatIsSymmetric line 8461 /opt/lib/petsc-3.9.3/src/mat/interface/matrix.c
[296]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
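As a lighter-weight variant of the X'*A*X test Jose mentions, one can probe symmetry with a pair of random vectors: for a symmetric A, y'(Ax) = x'(Ay) for any x and y. A minimal sketch of such a probe (the function name is illustrative; the dot products below are for the real case, and with complex scalars VecDot conjugates its second argument, so a Hermitian check needs care):

#include <petscmat.h>

/* Estimates |y'Ax - x'Ay| for random x, y; this is zero (up to roundoff)
   when the assembled square matrix A is symmetric. */
static PetscErrorCode CheckSymmetrySample(Mat A, PetscReal *discrepancy)
{
  Vec            x, y, Ax, Ay;
  PetscScalar    yAx, xAy;
  PetscRandom    rnd;
  PetscErrorCode ierr;

  ierr = MatCreateVecs(A, &x, &Ax); CHKERRQ(ierr);
  ierr = MatCreateVecs(A, &y, &Ay); CHKERRQ(ierr);
  ierr = PetscRandomCreate(PetscObjectComm((PetscObject)A), &rnd); CHKERRQ(ierr);
  ierr = VecSetRandom(x, rnd); CHKERRQ(ierr);
  ierr = VecSetRandom(y, rnd); CHKERRQ(ierr);
  ierr = MatMult(A, x, Ax); CHKERRQ(ierr);
  ierr = MatMult(A, y, Ay); CHKERRQ(ierr);
  ierr = VecDot(Ax, y, &yAx); CHKERRQ(ierr);  /* y' (A x) */
  ierr = VecDot(Ay, x, &xAy); CHKERRQ(ierr);  /* x' (A y) */
  *discrepancy = PetscAbsScalar(yAx - xAy);
  ierr = PetscRandomDestroy(&rnd); CHKERRQ(ierr);
  ierr = VecDestroy(&x); CHKERRQ(ierr);
  ierr = VecDestroy(&y); CHKERRQ(ierr);
  ierr = VecDestroy(&Ax); CHKERRQ(ierr);
  ierr = VecDestroy(&Ay); CHKERRQ(ierr);
  return 0;
}

Repeating this for several random pairs and watching where the discrepancy first grows past roundoff gives a cheap screen for where the reported 10**-2 asymmetry appears.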
From tempohoper at gmail.com Fri Nov 16 03:22:09 2018
From: tempohoper at gmail.com (Sal Am)
Date: Fri, 16 Nov 2018 09:22:09 +0000
Subject: [petsc-users] Solving complex linear sparse matrix in parallel + external library
Message-ID: 

Hi,

I have a few questions:

1. The following issue/misunderstanding:
My code reads in two files, one PETSc vector and one PETSc matrix (b and A from Ax=b, size ~65000x65000), and then calls the KSP solver to solve it by running the following in the terminal:

mpiexec -n 2 ./SolveSys -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type mumps

Now MUMPS is supposed to work in parallel and complex, but it seems the code is not solved in parallel. It just prints the result twice. Adding -log_view gives me

"./SolveSys on a linux-opt named F8434 with 1 processor..." printed twice.

2. Using iterative solvers, I am having difficulty getting convergence. I found that there is a way to set the maximum number of iterations, but is there a minimum I can increase?

3. The residual is not computed when using direct external solvers. What is the proper PETSc way of doing this?
-----------------------------------------------------The code---------------------------------------
static char help[] = "Reads A and b from binary files and solves Ax = b.\n";

#include <petscksp.h>

int main(int argc,char **args)
{
  Vec            x,b;        /* approx solution, RHS */
  Mat            A;          /* linear system matrix */
  KSP            ksp;        /* linear solver context */
  PC             pc;
  PetscMPIInt    rank, size;
  PetscViewer    viewer;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc,&args,(char*)0,help);if (ierr) return ierr;
  MPI_Comm_rank(PETSC_COMM_WORLD,&rank);
  MPI_Comm_size(PETSC_COMM_WORLD,&size);

#if !defined(PETSC_USE_COMPLEX)
  SETERRQ(PETSC_COMM_WORLD,1,"This example requires complex numbers");
#endif
  /*
     - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
     Compute the matrix and right-hand-side vector that define
     the linear system, Ax = b.
     - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
  */
  ierr = PetscPrintf(PETSC_COMM_WORLD,"reading vector in binary from Vector_b.dat ...\n");CHKERRQ(ierr);
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"../../python/petscpy/Vector_b.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr);
  ierr = VecCreate(PETSC_COMM_WORLD,&b);CHKERRQ(ierr);
  ierr = VecLoad(b,viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

  ierr = PetscPrintf(PETSC_COMM_WORLD,"reading matrix in binary from Matrix_A.dat ...\n");CHKERRQ(ierr);
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"../../python/petscpy/Matrix_A.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr);
  ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
  ierr = MatLoad(A,viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

  ierr = VecDuplicate(b,&x);CHKERRQ(ierr);

  /* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
        Create the linear solver and set various options
     - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
  */
  PetscPrintf(PETSC_COMM_WORLD, "Creating KSP\n");
  ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);

  PetscPrintf(PETSC_COMM_WORLD, "KSP Operators\n");
  ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);

  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCLU);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);

  /* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
                      Solve the linear system
     - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - */
  ierr = KSPSetUp(ksp);CHKERRQ(ierr);
  ierr = KSPSetUpOnBlocks(ksp);CHKERRQ(ierr);
  ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
  PetscPrintf(PETSC_COMM_WORLD, "Solved\n");

  /*
     Free work space. All PETSc objects should be destroyed when they
     are no longer needed.
  */
  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&b);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

Kind regards,
Sal
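On question 3, a solver-independent way to get a true residual norm is to compute r = b - Ax explicitly after KSPSolve; this works for the MUMPS-backed direct solve just as well as for iterative solvers. A minimal sketch, reusing the A, b, x of the code above:

  Vec       r;
  PetscReal rnorm;

  ierr = VecDuplicate(b,&r);CHKERRQ(ierr);
  ierr = MatMult(A,x,r);CHKERRQ(ierr);      /* r = A x     */
  ierr = VecAYPX(r,-1.0,b);CHKERRQ(ierr);   /* r = b - A x */
  ierr = VecNorm(r,NORM_2,&rnorm);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD,"||b - Ax|| = %g\n",(double)rnorm);CHKERRQ(ierr);
  ierr = VecDestroy(&r);CHKERRQ(ierr);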
From manuel.jung at hs.uni-hamburg.de Fri Nov 16 03:43:43 2018
From: manuel.jung at hs.uni-hamburg.de (Manuel Jung)
Date: Fri, 16 Nov 2018 10:43:43 +0100
Subject: [petsc-users] Solving complex linear sparse matrix in parallel + external library
In-Reply-To: 
References: 
Message-ID: 

Hi,

I can comment on 2): PETSc does not directly allow you to set the minimum number of iterations, but you can define your own convergence routine (see the manual). You can define the routine similar to this (Fortran) code:

SUBROUTINE MyConvergenceTest(ksp, it, rnorm, reason, ctx, ierr)
  use PetscInterfaces
  use Cool_petsc
  use Cool_data
  implicit none
  type(tKSP) :: ksp
  PetscInt it
  PetscFortranAddr ctx
  PetscErrorCode ierr
  PetscReal rnorm
  KSPConvergedReason :: reason

  call KSPConvergedDefault(ksp,it,rnorm,reason,ctx,ierr)
  ! Below the minimum iteration count, report "still iterating"
  IF(it.lt.cool_minits) THEN
    ierr = 0
    reason = 0
  END IF
END SUBROUTINE MyConvergenceTest

You might need to translate that to C in your case, and don't forget to tell PETSc about your custom convergence test (KSPSetConvergenceTest).

Cheers,
Manuel

On Fri, 16 Nov 2018 at 10:23, Sal Am via petsc-users wrote:
> [...]

-- 
Manuel Jung, Dr. rer. nat.
Hamburger Sternwarte
Universität Hamburg
Gojenbergsweg 112
21029 Hamburg
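Since Sal's code is in C, a sketch of the same idea in C may help; it uses KSPSetConvergenceTest() with the default test's context object, and min_its is a placeholder for whatever minimum one wants to enforce:

#include <petscksp.h>

static PetscInt min_its = 10; /* hypothetical minimum iteration count */

static PetscErrorCode MyConvergenceTest(KSP ksp, PetscInt it, PetscReal rnorm,
                                        KSPConvergedReason *reason, void *ctx)
{
  PetscErrorCode ierr;
  ierr = KSPConvergedDefault(ksp, it, rnorm, reason, ctx); CHKERRQ(ierr);
  /* Below the minimum iteration count, discard the default test's verdict
     and keep iterating, mirroring the Fortran version above */
  if (it < min_its) *reason = KSP_CONVERGED_ITERATING;
  return 0;
}

and, before calling KSPSolve():

  void *cctx;
  ierr = KSPConvergedDefaultCreate(&cctx); CHKERRQ(ierr);
  ierr = KSPSetConvergenceTest(ksp, MyConvergenceTest, cctx,
                               KSPConvergedDefaultDestroy); CHKERRQ(ierr);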
From niko.karin at gmail.com Fri Nov 16 04:33:50 2018
From: niko.karin at gmail.com (Karin&NiKo)
Date: Fri, 16 Nov 2018 11:33:50 +0100
Subject: [petsc-users] GAMG Parallel Performance
In-Reply-To: 
References: <62EA5AF0-B8AC-4A46-AC1E-3CEC8C4A1386@anl.gov>
Message-ID: 

Dear PETSc team,

I have run the same test on the same number of processes as before (1000, 1500 and 2000), but with an increased number of nodes. The results are much better!
If I focus on the KSPSolve event, I have the following timings:
1000 => 1.2681e+02
1500 => 8.7030e+01
2000 => 7.8904e+01
The parallel efficiency between 1000 and 1500 is around 96%, but it decreases drastically when using 2000 processes. I think my problem is too small and the communications begin to be important.

I have an extra question: in the profiling section, what exactly is measured in "Time (sec):"? I wonder if it is the time between PetscInitialize and PetscFinalize?

Thanks again for your help,
Nicolas

On Fri, 16 Nov 2018 at 00:24, Karin&NiKo wrote:
> Ok. I will do that soon and I will let you know.
> [...]
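Taking the KSPSolve timings above at face value, the strong-scaling efficiency relative to the 1000-process run, E(n) = (1000 * T(1000)) / (n * T(n)), works out to

  E(1500) = (1000 * 126.81) / (1500 * 87.030) = 126810 / 130545 = 0.97
  E(2000) = (1000 * 126.81) / (2000 * 78.904) = 126810 / 157808 = 0.80

so roughly 97% at 1500 processes but only about 80% at 2000, consistent with the observation that at 50e3 dof per process the problem is becoming too small to hide the communication cost.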
The partitioning >> could be bad and if not its the machine. >> >> >> > >> > On Thu, Nov 15, 2018 at 1:56 PM Smith, Barry F. via petsc-users < >> petsc-users at mcs.anl.gov> wrote: >> > >> > Something is odd about your configuration. Just consider the time >> for VecMAXPY which is an embarrassingly parallel operation. On 1000 MPI >> processes it produces >> > >> > Time >> >> flop rate >> > VecMAXPY 575 1.0 8.4132e-01 1.5 1.36e+09 1.0 0.0e+00 >> 0.0e+00 0.0e+00 0 2 0 0 0 0 2 0 0 0 1,600,021 >> > >> > on 1500 processes it produces >> > >> > VecMAXPY 583 1.0 1.0786e+00 3.4 9.38e+08 1.0 0.0e+00 >> 0.0e+00 0.0e+00 0 2 0 0 0 0 2 0 0 0 1,289,187 >> > >> > that is it actually takes longer (the time goes from .84 seconds to >> 1.08 seconds and the flop rate from 1,600,021 down to 1,289,187) You would >> never expect this kind of behavior >> > >> > and on 2000 processes it produces >> > >> > VecMAXPY 583 1.0 7.1103e-01 2.7 7.03e+08 1.0 0.0e+00 >> 0.0e+00 0.0e+00 0 2 0 0 0 0 2 0 0 0 1,955,563 >> > >> > so it speeds up again but not by very much. This is very mysterious and >> not what you would expect. >> > >> > I'm inclined to believe something is out of whack on your computer, >> are you sure all nodes on the computer are equivalent? Same processors, >> same clock speeds? What happens if you run the 1000 process case several >> times, do you get very similar numbers for VecMAXPY()? You should but I am >> guessing you may not. >> > >> > Barry >> > >> > Note that this performance issue doesn't really have anything to do >> with the preconditioner you are using. >> > >> > >> > >> > >> > >> > > On Nov 15, 2018, at 10:50 AM, Karin&NiKo via petsc-users < >> petsc-users at mcs.anl.gov> wrote: >> > > >> > > Dear PETSc team, >> > > >> > > I am solving a linear transient dynamic problem, based on a >> discretization with finite elements. To do that, I am using FGMRES with >> GAMG as a preconditioner. I consider here 10 time steps. >> > > The problem has round to 118e6 dof and I am running on 1000, 1500 and >> 2000 procs. So I have something like 100e3, 78e3 and 50e3 dof/proc. >> > > I notice that the performance deteriorates when I increase the number >> of processes. >> > > You can find as attached file the log_view of the execution and the >> detailled definition of the KSP. >> > > >> > > Is the problem too small to run on that number of processes or is >> there something wrong with my use of GAMG? >> > > >> > > I thank you in advance for your help, >> > > Nicolas >> > > >> >> > >> >> -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Fri Nov 16 05:51:29 2018 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 16 Nov 2018 06:51:29 -0500 Subject: [petsc-users] GAMG Parallel Performance In-Reply-To: References: <62EA5AF0-B8AC-4A46-AC1E-3CEC8C4A1386@anl.gov> Message-ID: On Fri, Nov 16, 2018 at 5:35 AM Karin&NiKo via petsc-users < petsc-users at mcs.anl.gov> wrote: > Dear PETSc team, > > I have run the same test on the same number of processes as before (1000, > 1500 and 2000) but by increasing the number of nodes. The results are much > better! > If I focus on the KSPSolve event, I have the following timings: > 1000 => 1.2681e+02 > 1500 => 8.7030e+01 > 2000 => 7.8904e+01 > The parallel efficiency between 1000 and 1500 is around to 96% but it > decreases drastically when using 2000 processes. I think my problem is too > small and the communications begin to be important. 
>
> I have an extra question : in the profiling section, what is exactly
> measured in "Time (sec): " ? I wonder if it is the time between
> PetscInitialize and PetscFinalize?
>

Yep. Also, your communication could get more expensive at the 2000 level by
including another cabinet or something.

  Thanks,

     Matt

> Thanks again for your help,
> Nicolas
> [...]

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: From knepley at gmail.com Fri Nov 16 06:00:26 2018 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 16 Nov 2018 07:00:26 -0500 Subject: [petsc-users] Solving complex linear sparse matrix in parallel + external library In-Reply-To: References: Message-ID: On Fri, Nov 16, 2018 at 4:23 AM Sal Am via petsc-users < petsc-users at mcs.anl.gov> wrote: > Hi, > > I have a few questions: > > 1. The following issue/misunderstanding: > My code reads in two files one PETSc vector and one PETSc matrix (b and A > from Ax=b, size ~65000x65000). > and then calls KSP solver to solve it by running the following in the > terminal: > > mpiexec - n 2 ./SolveSys -ksp_type preonly -pc_type lu > -pc_factor_mat_solver mumps > > Now mumps is supposed to work in parallel and complex, but the code is not > solved in parallel it seems. It just prints the result twice. Adding > -log_view gives me > > "./SolveSys on a linux-opt named F8434 with 1 processor..." printed twice. > This can happen if you use an 'mpiexec' which is from a different MPI than the one you compiled PETSc with. > 2. Using iterative solvers, I am having difficulty getting convergence. I > found that there is a way to set the maximum number of iterations, but is > there a minimum I can increase? > Why would you want a minimum iterations? Why not set a tolerance and a max? What would you achieve with a minimum? > 3. The residual is not computed when using direct external solvers. What > is proper PETSc way of doing this? > Instead of 'preonly', where you do not want a residual, use 'richardson' or 'gmres' with a max of 1 iterate (-ksp_max_it 1) Thanks, Matt > -----------------------------------------------------The > code--------------------------------------- > #include > #include > int main(int argc,char **args) > { > Vec x,b; /* approx solution, RHS */ > Mat A; /* linear system matrix */ > KSP ksp; /* linear solver context */ > PetscReal norm; /* norm of solution error */ > PC pc; > PetscMPIInt rank, size; > PetscViewer viewer; > PetscInt its, i; > PetscErrorCode ierr; > PetscScalar *xa; > PetscBool flg = PETSC_FALSE; > > ierr = PetscInitialize(&argc,&args,(char*)0,help);if (ierr) return ierr; > MPI_Comm_rank(PETSC_COMM_WORLD,&rank); > MPI_Comm_size(PETSC_COMM_WORLD,&size); > > #if !defined(PETSC_USE_COMPLEX) > SETERRQ(PETSC_COMM_WORLD,1,"This example requires complex numbers"); > #endif > /* > - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - > Compute the matrix and right-hand-side vector that define > the linear system, Ax = b. 
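>      (A and b are read below with VecLoad/MatLoad from PETSc binary
>      files; this assumes the files were written with a PETSc binary
>      viewer, e.g. from petsc4py, so that the loads can distribute the
>      data across the processes.)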
> - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - > */ > ierr = PetscPrintf(PETSC_COMM_WORLD,"reading vector in binary from > Vector_b.dat ...\n");CHKERRQ(ierr); > ierr = > PetscViewerBinaryOpen(PETSC_COMM_WORLD,"../../python/petscpy/Vector_b.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr); > ierr = VecCreate(PETSC_COMM_WORLD, &b);CHKERRQ(ierr); > ierr = VecLoad(b,viewer); CHKERRQ(ierr); > > ierr = PetscPrintf(PETSC_COMM_WORLD,"reading matrix in binary from > Matrix_A.dat ...\n");CHKERRQ(ierr); > ierr = > PetscViewerBinaryOpen(PETSC_COMM_WORLD,"../../python/petscpy/Matrix_A.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr); > ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr); > ierr = MatLoad(A,viewer);CHKERRQ(ierr); > ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr); > > ierr = VecDuplicate(b,&x);CHKERRQ(ierr); > > /* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - > Create the linear solver and set various options > - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - > */ > PetscPrintf(PETSC_COMM_WORLD, "Creating KSP\n"); > ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr); > > > PetscPrintf(PETSC_COMM_WORLD, "KSP Operators\n"); > ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr); > > ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr); > ierr = PCSetType(pc, PCLU);CHKERRQ(ierr); > ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr); > > /* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - > Solve the linear system > - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - */ > ierr = KSPSetUp(ksp);CHKERRQ(ierr); > ierr = KSPSetUpOnBlocks(ksp);CHKERRQ(ierr); > ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr); > PetscPrintf(PETSC_COMM_WORLD, "Solved"); > > /* > Free work space. All PETSc objects should be destroyed when they > are no longer needed. > */ > ierr = KSPDestroy(&ksp);CHKERRQ(ierr); > ierr = VecDestroy(&x);CHKERRQ(ierr); > ierr = VecDestroy(&b);CHKERRQ(ierr); > ierr = MatDestroy(&A);CHKERRQ(ierr); > ierr = PetscFinalize(); > return ierr; > } > > > Kind regards, > Sal > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ivan.voznyuk.work at gmail.com Fri Nov 16 09:43:50 2018 From: ivan.voznyuk.work at gmail.com (Ivan Voznyuk) Date: Fri, 16 Nov 2018 16:43:50 +0100 Subject: [petsc-users] petsc4py help with parallel execution In-Reply-To: References: Message-ID: Hi, You were totally right: no miracle, parallelization does come from multithreading. We checked Option 1/: played with OMP_NUM_THREADS=1 it changed computational time. So, I reinstalled everything (starting with Ubuntu ending with petsc) and configured the following things: - installed system's ompenmpi - installed Intel MKL Blas / Lapack - configured PETSC as ./configure --with-cc=mpicc --with-fc=mpif90 --with-cxx=mpicxx --with-blas-lapack-dir=/opt/intel/mkl/lib/intel64 --download-scalapack --download-mumps --with-hwloc --with-shared --with-openmp=1 --with-pthread=1 --with-scalar-type=complex hoping that it would take into account blas multithreading - installed petsc4py However, I do not get any parallelization... 
What I tried to do so far unsuccessfully : - play with OMP_NUM_THREADS - reinstall the system - ldd PETSc.cpython-35m-x86_64-linux-gnu.so yields lld_result.txt (here attached) I noted that libmkl_sequential.so library there. Do you think this is normal? - I found a similar problem reported here: https://lists.mcs.anl.gov/pipermail/petsc-users/2016-March/028803.html To solve this problem, developers recommended to replace -lmkl_sequential to -lmkl_intel_thread options in PETSC_ARCH/lib/conf/petscvariables. However, I did not find something that would be named like this (it might be a change of version) - Anyway, I replaced lmkl_sequential to lmkl_intel_thread in every file of PETSC, but it changed nothing. As a result, in the new make.log (here attached ) I have a parameter #define PETSC_HAVE_LIBMKL_SEQUENTIAL 1 and option -lmkl_sequential Do you have any idea of what I should change in the initial options in order to obtain the blas multithreding parallelization? Thanks a lot for your help! Ivan On Fri, Nov 16, 2018 at 1:25 AM Dave May wrote: > > > On Thu, 15 Nov 2018 at 17:44, Ivan via petsc-users < > petsc-users at mcs.anl.gov> wrote: > >> Hi Stefano, >> >> In fact, yes, we look at the htop output (and the resulting computational >> time ofc). >> >> In our code we use MUMPS, which indeed depends on blas / lapack. So I >> think this might be it! >> >> I will definetely check it (I mean the difference between our MUMPS, >> blas, lapack). >> >> If you have an idea of how we can verify on his PC that the source of his >> parallelization does come from BLAS, please do not hesitate to tell me! >> > > Option 1/ > * Set this environment variable > export OMP_NUM_THREADS=1 > * Re-run your "parallel" test. > * If the performance differs (job runs slower) compared with your previous > run where you inferred parallelism was being employed, you can safely > assume that the parallelism observed comes from threads > > Option 2/ > * Re-configure PETSc to use a known BLAS implementation which does not > support threads > * Re-compile PETSc > * Re-run your parallel test > * If the performance differs (job runs slower) compared with your previous > run where you inferred parallelism was being employed, you can safely > assume that the parallelism observed comes from threads > > Option 3/ > * Use a PC which does not depend on BLAS at all, > e.g. -pc_type jacobi -pc_type bjacobi > * If the performance differs (job runs slower) compared with your previous > run where you inferred parallelism was being employed, you can safely > assume that the parallelism observed comes from BLAS + threads > > > >> Thanks! >> >> Ivan >> On 15/11/2018 18:24, Stefano Zampini wrote: >> >> If you say your program is parallel by just looking at the output from >> the top command, you are probably linking against a multithreaded blas >> library >> >> Il giorno Gio 15 Nov 2018, 20:09 Matthew Knepley via petsc-users < >> petsc-users at mcs.anl.gov> ha scritto: >> >>> On Thu, Nov 15, 2018 at 11:59 AM Ivan Voznyuk < >>> ivan.voznyuk.work at gmail.com> wrote: >>> >>>> Hi Matthew, >>>> >>>> Does it mean that by using just command python3 simple_code.py (without >>>> mpiexec) you *cannot* obtain a parallel execution? >>>> >>> >>> As I wrote before, its not impossible. You could be directly calling >>> PMI, but I do not think you are doing that. >>> >>> >>>> It s been 5 days we are trying to understand with my colleague how he >>>> managed to do so. 
>>>> It means that by using simply python3 simple_code.py he gets 8 >>>> processors workiing. >>>> By the way, we wrote in his code few lines: >>>> rank = PETSc.COMM_WORLD.Get_rank() >>>> size = PETSc.COMM_WORLD.Get_size() >>>> and we got rank = 0, size = 1 >>>> >>> >>> This is MPI telling you that you are only running on 1 processes. >>> >>> >>>> However, we compilator arrives to KSP.solve(), somehow it turns on 8 >>>> processors. >>>> >>> >>> Why do you think its running on 8 processes? >>> >>> >>>> This problem is solved on his PC in 5-8 sec (in parallel, using *python3 >>>> simple_code.py*), on mine it takes 70-90 secs (in sequantial, but with >>>> the same command *python3 simple_code.py*) >>>> >>> >>> I think its much more likely that there are differences in the solver >>> (use -ksp_view to see exactly what solver was used), then >>> to think it is parallelism. Moreover, you would never ever ever see that >>> much speedup on a laptop since all these computations >>> are bandwidth limited. >>> >>> Thanks, >>> >>> Matt >>> >>> >>>> So, conclusion is that on his computer this code works in the same way >>>> as scipy: all the code is executed in sequantial mode, but when it comes to >>>> solution of system of linear equations, it runs on all available >>>> processors. All this with just running python3 my_code.py (without any >>>> mpi-smth) >>>> >>>> Is it an exception / abnormal behavior? I mean, is it something >>>> irregular that you, developers, have never seen? >>>> >>>> Thanks and have a good evening! >>>> Ivan >>>> >>>> P.S. I don't think I know the answer regarding Scipy... >>>> >>>> >>>> On Thu, Nov 15, 2018 at 2:39 PM Matthew Knepley >>>> wrote: >>>> >>>>> On Thu, Nov 15, 2018 at 8:07 AM Ivan Voznyuk < >>>>> ivan.voznyuk.work at gmail.com> wrote: >>>>> >>>>>> Hi Matthew, >>>>>> Thanks for your reply! >>>>>> >>>>>> Let me precise what I mean by defining few questions: >>>>>> >>>>>> 1. In order to obtain a parallel execution of simple_code.py, do I >>>>>> need to go with mpiexec python3 simple_code.py, or I can just launch >>>>>> python3 simple_code.py? >>>>>> >>>>> >>>>> mpiexec -n 2 python3 simple_code.py >>>>> >>>>> >>>>>> 2. This simple_code.py consists of 2 parts: a) preparation of matrix >>>>>> b) solving the system of linear equations with PETSc. If I launch mpirun >>>>>> (or mpiexec) -np 8 python3 simple_code.py, I suppose that I will basically >>>>>> obtain 8 matrices and 8 systems to solve. However, I need to prepare only >>>>>> one matrix, but launch this code in parallel on 8 processors. >>>>>> >>>>> >>>>> When you create the Mat object, you give it a communicator (here >>>>> PETSC_COMM_WORLD). That allows us to distribute the data. This is all >>>>> covered extensively in the manual and the online tutorials, as well as the >>>>> example code. >>>>> >>>>> >>>>>> In fact, here attached you will find a similar code (scipy_code.py) >>>>>> with only one difference: the system of linear equations is solved with >>>>>> scipy. So when I solve it, I can clearly see that the solution is obtained >>>>>> in a parallel way. However, I do not use the command mpirun (or mpiexec). I >>>>>> just go with python3 scipy_code.py. >>>>>> >>>>> >>>>> Why do you think its running in parallel? >>>>> >>>>> Thanks, >>>>> >>>>> Matt >>>>> >>>>> >>>>>> In this case, the first part (creation of the sparse matrix) is not >>>>>> parallel, whereas the solution of system is found in a parallel way. >>>>>> So my question is, Do you think that it s possible to have the same >>>>>> behavior with PETSC? 
And what do I need for this? >>>>>> >>>>>> I am asking this because for my colleague it worked! It means that he >>>>>> launches the simple_code.py on his computer using the command python3 >>>>>> simple_code.py (and not mpi-smth python3 simple_code.py) and he obtains a >>>>>> parallel execution of the same code. >>>>>> >>>>>> Thanks for your help! >>>>>> Ivan >>>>>> >>>>>> >>>>>> On Thu, Nov 15, 2018 at 11:54 AM Matthew Knepley >>>>>> wrote: >>>>>> >>>>>>> On Thu, Nov 15, 2018 at 4:53 AM Ivan Voznyuk via petsc-users < >>>>>>> petsc-users at mcs.anl.gov> wrote: >>>>>>> >>>>>>>> Dear PETSC community, >>>>>>>> >>>>>>>> I have a question regarding the parallel execution of petsc4py. >>>>>>>> >>>>>>>> I have a simple code (here attached simple_code.py) which solves a >>>>>>>> system of linear equations Ax=b using petsc4py. To execute it, I use the >>>>>>>> command python3 simple_code.py which yields a sequential performance. With >>>>>>>> a colleague of my, we launched this code on his computer, and this time the >>>>>>>> execution was in parallel. Although, he used the same command python3 >>>>>>>> simple_code.py (without mpirun, neither mpiexec). >>>>>>>> >>>>>>> I am not sure what you mean. To run MPI programs in parallel, you >>>>>>> need a launcher like mpiexec or mpirun. There are Python programs (like >>>>>>> nemesis) that use the launcher API directly (called PMI), but that is not >>>>>>> part of petsc4py. >>>>>>> >>>>>>> Thanks, >>>>>>> >>>>>>> Matt >>>>>>> >>>>>>>> My configuration: Ubuntu x86_64 Ubuntu 16.04, Intel Core i7, PETSc >>>>>>>> 3.10.2, PETSC_ARCH=arch-linux2-c-debug, petsc4py 3.10.0 in virtualenv >>>>>>>> >>>>>>>> In order to parallelize it, I have already tried: >>>>>>>> - use 2 different PCs >>>>>>>> - use Ubuntu 16.04, 18.04 >>>>>>>> - use different architectures (arch-linux2-c-debug, >>>>>>>> linux-gnu-c-debug, etc) >>>>>>>> - ofc use different configurations (my present config can be found >>>>>>>> in make.log that I attached here) >>>>>>>> - mpi from mpich, openmpi >>>>>>>> >>>>>>>> Nothing worked. >>>>>>>> >>>>>>>> Do you have any ideas? >>>>>>>> >>>>>>>> Thanks and have a good day, >>>>>>>> Ivan >>>>>>>> >>>>>>>> -- >>>>>>>> Ivan VOZNYUK >>>>>>>> PhD in Computational Electromagnetics >>>>>>>> >>>>>>> >>>>>>> >>>>>>> -- >>>>>>> What most experimenters take for granted before they begin their >>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>> experiments lead. >>>>>>> -- Norbert Wiener >>>>>>> >>>>>>> https://www.cse.buffalo.edu/~knepley/ >>>>>>> >>>>>>> >>>>>> >>>>>> >>>>>> -- >>>>>> Ivan VOZNYUK >>>>>> PhD in Computational Electromagnetics >>>>>> +33 (0)6.95.87.04.55 >>>>>> My webpage >>>>>> My LinkedIn >>>>>> >>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>>> https://www.cse.buffalo.edu/~knepley/ >>>>> >>>>> >>>> >>>> >>>> -- >>>> Ivan VOZNYUK >>>> PhD in Computational Electromagnetics >>>> +33 (0)6.95.87.04.55 >>>> My webpage >>>> My LinkedIn >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. 
>>> -- Norbert Wiener >>> >>> https://www.cse.buffalo.edu/~knepley/ >>> >>> >> -- Ivan VOZNYUK PhD in Computational Electromagnetics +33 (0)6.95.87.04.55 My webpage My LinkedIn -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- linux-vdso.so.1 => (0x00007ffd5d7c5000) /opt/intel/mkl/lib/intel64/libmkl_core.so (0x00007fee66886000) /opt/intel/mkl/lib/intel64/libmkl_sequential.so (0x00007fee652da000) libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007fee650bd000) libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007fee64cf3000) libpetsc.so.3.10 => /opt/petsc/petsc-3.10.2/arch-linux2-c-debug/lib/libpetsc.so.3.10 (0x00007fee6292d000) libmpi.so.12 => /usr/lib/libmpi.so.12 (0x00007fee62657000) libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007fee62453000) /lib64/ld-linux-x86-64.so.2 (0x00007fee6af61000) libmkl_intel_lp64.so => /opt/intel/mkl/lib/intel64/libmkl_intel_lp64.so (0x00007fee61905000) libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007fee615fc000) libhwloc.so.5 => /usr/lib/x86_64-linux-gnu/libhwloc.so.5 (0x00007fee613c2000) libmpi_mpifh.so.12 => /usr/lib/libmpi_mpifh.so.12 (0x00007fee61169000) libgfortran.so.3 => /usr/lib/x86_64-linux-gnu/libgfortran.so.3 (0x00007fee60e3e000) libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007fee60c28000) libgomp.so.1 => /usr/lib/x86_64-linux-gnu/libgomp.so.1 (0x00007fee60a06000) libibverbs.so.1 => /usr/lib/libibverbs.so.1 (0x00007fee607f7000) libopen-rte.so.12 => /usr/lib/libopen-rte.so.12 (0x00007fee6057d000) libopen-pal.so.13 => /usr/lib/libopen-pal.so.13 (0x00007fee602e0000) libnuma.so.1 => /usr/lib/x86_64-linux-gnu/libnuma.so.1 (0x00007fee600d5000) libltdl.so.7 => /usr/lib/x86_64-linux-gnu/libltdl.so.7 (0x00007fee5fecb000) libquadmath.so.0 => /usr/lib/x86_64-linux-gnu/libquadmath.so.0 (0x00007fee5fc8c000) librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00007fee5fa84000) libutil.so.1 => /lib/x86_64-linux-gnu/libutil.so.1 (0x00007fee5f881000) -------------- next part -------------- A non-text attachment was scrubbed... Name: make.log Type: text/x-log Size: 97813 bytes Desc: not available URL: From Fabian.Jakub at physik.uni-muenchen.de Fri Nov 16 09:52:20 2018 From: Fabian.Jakub at physik.uni-muenchen.de (Fabian.Jakub) Date: Fri, 16 Nov 2018 16:52:20 +0100 Subject: [petsc-users] [SLEPc] Krylov-Schur convergence In-Reply-To: References: <470A7599-766D-4477-A298-473A25B6B71D@dsic.upv.es> <44298504-2FB3-4CC1-94D5-9021915E295E@dsic.upv.es> Message-ID: Concerning your gdb attaching error/missing symbols etc, you might have the issue with ptrace_scope as it is suggested in your petsc output. If you have the possibility to become root on your system you can try again after disabling the security feature. echo 0 | sudo tee /proc/sys/kernel/yama/ptrace_scope On 11/16/18 9:38 AM, Ale Foggia via petsc-users wrote: >> >> One thing you can do is use a symmetric matrix format: -mat_type sbaij >> In this way, your matrix will always be symmetric because only the upper >> triangular part is stored. The drawback is that efficiency will likely >> decrease. >> >> To check symmetry, one (possibly bad) way is to take a bunch of random >> vectors X and check that X'*A*X is symmetric. This can be easily done with >> SLEPc's BVMatProject, see test9.c under >> $SLEPC_DIR/src/sys/classes/bv/examples/tests >> >> Jose >> >> > Is it possible that the asymmetry arises because of a numerical issue? It > starts happening when my numbers are bigger than 2**32. 
I've checked that > I'm always using PetscInt and I compiled the library with "64 bit > integers". It seems strange to me that it only happens after some point and > that the asymmetry is bigger than 10**-2. > > >>> Looking at the failure in the debugger would really help us, for example >> with a stack trace. >>> >>> Thanks, >>> >>> Matt >> > > I'm trying to use the debugger but I'm having problems with missing symbols > and libraries, I send you the output at the end of the run, maybe it helps, > meanwhile I'll keep trying to get the debugger stack trace. The error that > stops the program occurs when I run it in separate nodes, because when I > run the code in the login node it does not give any problem. > > [296]PETSC ERROR: Caught signal number 7 BUS: Bus Error, possibly illegal > memory access > slurmstepd: error: Detected 14 oom-kill event(s) in step 2596337.0 cgroup. > Some of your processes may have been killed by the cgroup out-of-memory > handler. > [296]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger > [296]PETSC ERROR: or see > http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind > [296]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS > X to find memory corruption errors > [296]PETSC ERROR: likely location of problem given in stack below > [296]PETSC ERROR: --------------------- Stack Frames > ------------------------------------ > Fatal error in PMPI_Waitall: Other MPI error, error stack: > PMPI_Waitall(405)...............: MPI_Waitall(count=319, > req_array=0x1f44940, status_array=0x1f3a9c0) failed > MPIR_Waitall_impl(221)..........: fail failed > PMPIDI_CH3I_Progress(623).......: fail failed > pkt_RTS_handler(317)............: fail failed > do_cts(662).....................: fail failed > MPID_nem_lmt_dcp_start_recv(302): fail failed > dcp_recv(165)...................: Internal MPI error! Cannot read from > remote process > Two workarounds have been identified for this issue: > 1) Enable ptrace for non-root users with: > echo 0 | sudo tee /proc/sys/kernel/yama/ptrace_scope > 2) Or, use: > I_MPI_SHM_LMT=shm > > [296]PETSC ERROR: Note: The EXACT line numbers in the stack are not > available, > [296]PETSC ERROR: INSTEAD the line number of the start of the function > [296]PETSC ERROR: is given. > [296]PETSC ERROR: [296] MatCreateSubMatrices_MPIAIJ_Local line 2100 > /opt/lib/petsc-3.9.3/src/mat/impls/aij/mpi/mpiov.c > [296]PETSC ERROR: [296] MatCreateSubMatrices_MPIAIJ line 1977 > /opt/lib/petsc-3.9.3/src/mat/impls/aij/mpi/mpiov.c > [296]PETSC ERROR: [296] MatCreateSubMatrices line 6693 > /opt/lib/petsc-3.9.3/src/mat/interface/matrix.c > [296]PETSC ERROR: [296] MatIsTranspose_MPIAIJ line 1019 > /opt/lib/petsc-3.9.3/src/mat/impls/aij/mpi/mpiaij.c > [296]PETSC ERROR: [296] MatIsSymmetric_MPIAIJ line 1053 > /opt/lib/petsc-3.9.3/src/mat/impls/aij/mpi/mpiaij.c > [296]PETSC ERROR: [296] MatIsSymmetric line 8461 > /opt/lib/petsc-3.9.3/src/mat/interface/matrix.c > [296]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > From balay at mcs.anl.gov Fri Nov 16 11:11:38 2018 From: balay at mcs.anl.gov (Balay, Satish) Date: Fri, 16 Nov 2018 17:11:38 +0000 Subject: [petsc-users] petsc4py help with parallel execution In-Reply-To: References: Message-ID: Yes PETSc prefers sequential MKL - as MPI handles parallelism. One way to trick petsc configure to use threaded MKL is to enable pardiso. 
i.e:

--with-blaslapack-dir=/opt/intel/mkl --with-mkl_pardiso-dir=/opt/intel/mkl

http://ftp.mcs.anl.gov/pub/petsc/nightlylogs/archive/2018/11/15/configure_master_arch-pardiso_grind.log

BLAS/LAPACK: -Wl,-rpath,/soft/com/packages/intel/16/u3/mkl/lib/intel64
-L/soft/com/packages/intel/16/u3/mkl/lib/intel64 -lmkl_intel_lp64
-lmkl_core -lmkl_intel_thread -liomp5 -ldl -lpthread

Or you can manually specify the correct MKL library list [with threading]
via --with-blaslapack-lib option.

Satish

On Fri, 16 Nov 2018, Ivan Voznyuk via petsc-users wrote:

> Hi,
> You were totally right: no miracle, parallelization does come from
> multithreading. We checked Option 1/: played with OMP_NUM_THREADS=1 it
> changed computational time.
> [...]
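For reference, a complete configure line following the second route might
look like the following - an untested sketch that reuses the MKL path from
earlier in this thread and the threaded link line quoted above. The exact
library list depends on your compiler and MKL version (with GNU compilers
the threading layer is typically -lmkl_gnu_thread with -lgomp instead of
-lmkl_intel_thread with -liomp5):

  ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 \
    --with-scalar-type=complex --download-mumps --download-scalapack \
    --with-blaslapack-lib="-L/opt/intel/mkl/lib/intel64 -lmkl_intel_lp64 \
      -lmkl_intel_thread -lmkl_core -liomp5 -lpthread -ldl"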
From ivan.voznyuk.work at gmail.com  Fri Nov 16 12:02:44 2018
From: ivan.voznyuk.work at gmail.com (Ivan Voznyuk)
Date: Fri, 16 Nov 2018 19:02:44 +0100
Subject: [petsc-users] petsc4py help with parallel execution
In-Reply-To:
References:
Message-ID:

Hi Satish,
Thanks for your reply.

Bad news... I tested 2 solutions that you proposed, none has worked.

1.
> i.e: > > --with-blaslapack-dir=/opt/intel/mkl --with-mkl_pardiso-dir=/opt/intel/mkl > > > http://ftp.mcs.anl.gov/pub/petsc/nightlylogs/archive/2018/11/15/configure_master_arch-pardiso_grind.log > > BLAS/LAPACK: -Wl,-rpath,/soft/com/packages/intel/16/u3/mkl/lib/intel64 > -L/soft/com/packages/intel/16/u3/mkl/lib/intel64 -lmkl_intel_lp64 > -lmkl_core -lmkl_intel_thread -liomp5 -ldl -lpthread > > Or you can manually specify the correct MKL library list [with > threading] via --with-blaslapack-lib option. > > Satish > > On Fri, 16 Nov 2018, Ivan Voznyuk via petsc-users wrote: > > > Hi, > > You were totally right: no miracle, parallelization does come from > > multithreading. We checked Option 1/: played with OMP_NUM_THREADS=1 it > > changed computational time. > > > > So, I reinstalled everything (starting with Ubuntu ending with petsc) and > > configured the following things: > > > > - installed system's ompenmpi > > - installed Intel MKL Blas / Lapack > > - configured PETSC as ./configure --with-cc=mpicc --with-fc=mpif90 > > --with-cxx=mpicxx --with-blas-lapack-dir=/opt/intel/mkl/lib/intel64 > > --download-scalapack --download-mumps --with-hwloc --with-shared > > --with-openmp=1 --with-pthread=1 --with-scalar-type=complex > > hoping that it would take into account blas multithreading > > - installed petsc4py > > > > However, I do not get any parallelization... > > What I tried to do so far unsuccessfully : > > - play with OMP_NUM_THREADS > > - reinstall the system > > - ldd PETSc.cpython-35m-x86_64-linux-gnu.so yields lld_result.txt (here > > attached) > > I noted that libmkl_sequential.so library there. Do you think this is > > normal? > > - I found a similar problem reported here: > > https://lists.mcs.anl.gov/pipermail/petsc-users/2016-March/028803.html > To > > solve this problem, developers recommended to replace -lmkl_sequential to > > -lmkl_intel_thread options in PETSC_ARCH/lib/conf/petscvariables. > However, > > I did not find something that would be named like this (it might be a > > change of version) > > - Anyway, I replaced lmkl_sequential to lmkl_intel_thread in every file > of > > PETSC, but it changed nothing. > > > > As a result, in the new make.log (here attached ) I have a parameter > > #define PETSC_HAVE_LIBMKL_SEQUENTIAL 1 and option -lmkl_sequential > > > > Do you have any idea of what I should change in the initial options in > > order to obtain the blas multithreding parallelization? > > > > Thanks a lot for your help! > > > > Ivan > > > > > > > > > > > > > > On Fri, Nov 16, 2018 at 1:25 AM Dave May > wrote: > > > > > > > > > > > On Thu, 15 Nov 2018 at 17:44, Ivan via petsc-users < > > > petsc-users at mcs.anl.gov> wrote: > > > > > >> Hi Stefano, > > >> > > >> In fact, yes, we look at the htop output (and the resulting > computational > > >> time ofc). > > >> > > >> In our code we use MUMPS, which indeed depends on blas / lapack. So I > > >> think this might be it! > > >> > > >> I will definetely check it (I mean the difference between our MUMPS, > > >> blas, lapack). > > >> > > >> If you have an idea of how we can verify on his PC that the source of > his > > >> parallelization does come from BLAS, please do not hesitate to tell > me! > > >> > > > > > > Option 1/ > > > * Set this environment variable > > > export OMP_NUM_THREADS=1 > > > * Re-run your "parallel" test. 
> > > * If the performance differs (job runs slower) compared with your > previous > > > run where you inferred parallelism was being employed, you can safely > > > assume that the parallelism observed comes from threads > > > > > > Option 2/ > > > * Re-configure PETSc to use a known BLAS implementation which does not > > > support threads > > > * Re-compile PETSc > > > * Re-run your parallel test > > > * If the performance differs (job runs slower) compared with your > previous > > > run where you inferred parallelism was being employed, you can safely > > > assume that the parallelism observed comes from threads > > > > > > Option 3/ > > > * Use a PC which does not depend on BLAS at all, > > > e.g. -pc_type jacobi -pc_type bjacobi > > > * If the performance differs (job runs slower) compared with your > previous > > > run where you inferred parallelism was being employed, you can safely > > > assume that the parallelism observed comes from BLAS + threads > > > > > > > > > > > >> Thanks! > > >> > > >> Ivan > > >> On 15/11/2018 18:24, Stefano Zampini wrote: > > >> > > >> If you say your program is parallel by just looking at the output from > > >> the top command, you are probably linking against a multithreaded blas > > >> library > > >> > > >> Il giorno Gio 15 Nov 2018, 20:09 Matthew Knepley via petsc-users < > > >> petsc-users at mcs.anl.gov> ha scritto: > > >> > > >>> On Thu, Nov 15, 2018 at 11:59 AM Ivan Voznyuk < > > >>> ivan.voznyuk.work at gmail.com> wrote: > > >>> > > >>>> Hi Matthew, > > >>>> > > >>>> Does it mean that by using just command python3 simple_code.py > (without > > >>>> mpiexec) you *cannot* obtain a parallel execution? > > >>>> > > >>> > > >>> As I wrote before, its not impossible. You could be directly calling > > >>> PMI, but I do not think you are doing that. > > >>> > > >>> > > >>>> It s been 5 days we are trying to understand with my colleague how > he > > >>>> managed to do so. > > >>>> It means that by using simply python3 simple_code.py he gets 8 > > >>>> processors workiing. > > >>>> By the way, we wrote in his code few lines: > > >>>> rank = PETSc.COMM_WORLD.Get_rank() > > >>>> size = PETSc.COMM_WORLD.Get_size() > > >>>> and we got rank = 0, size = 1 > > >>>> > > >>> > > >>> This is MPI telling you that you are only running on 1 processes. > > >>> > > >>> > > >>>> However, we compilator arrives to KSP.solve(), somehow it turns on 8 > > >>>> processors. > > >>>> > > >>> > > >>> Why do you think its running on 8 processes? > > >>> > > >>> > > >>>> This problem is solved on his PC in 5-8 sec (in parallel, using > *python3 > > >>>> simple_code.py*), on mine it takes 70-90 secs (in sequantial, but > with > > >>>> the same command *python3 simple_code.py*) > > >>>> > > >>> > > >>> I think its much more likely that there are differences in the solver > > >>> (use -ksp_view to see exactly what solver was used), then > > >>> to think it is parallelism. Moreover, you would never ever ever see > that > > >>> much speedup on a laptop since all these computations > > >>> are bandwidth limited. > > >>> > > >>> Thanks, > > >>> > > >>> Matt > > >>> > > >>> > > >>>> So, conclusion is that on his computer this code works in the same > way > > >>>> as scipy: all the code is executed in sequantial mode, but when it > comes to > > >>>> solution of system of linear equations, it runs on all available > > >>>> processors. All this with just running python3 my_code.py (without > any > > >>>> mpi-smth) > > >>>> > > >>>> Is it an exception / abnormal behavior? 
I mean, is it something > > >>>> irregular that you, developers, have never seen? > > >>>> > > >>>> Thanks and have a good evening! > > >>>> Ivan > > >>>> > > >>>> P.S. I don't think I know the answer regarding Scipy... > > >>>> > > >>>> > > >>>> On Thu, Nov 15, 2018 at 2:39 PM Matthew Knepley > > >>>> wrote: > > >>>> > > >>>>> On Thu, Nov 15, 2018 at 8:07 AM Ivan Voznyuk < > > >>>>> ivan.voznyuk.work at gmail.com> wrote: > > >>>>> > > >>>>>> Hi Matthew, > > >>>>>> Thanks for your reply! > > >>>>>> > > >>>>>> Let me precise what I mean by defining few questions: > > >>>>>> > > >>>>>> 1. In order to obtain a parallel execution of simple_code.py, do I > > >>>>>> need to go with mpiexec python3 simple_code.py, or I can just > launch > > >>>>>> python3 simple_code.py? > > >>>>>> > > >>>>> > > >>>>> mpiexec -n 2 python3 simple_code.py > > >>>>> > > >>>>> > > >>>>>> 2. This simple_code.py consists of 2 parts: a) preparation of > matrix > > >>>>>> b) solving the system of linear equations with PETSc. If I launch > mpirun > > >>>>>> (or mpiexec) -np 8 python3 simple_code.py, I suppose that I will > basically > > >>>>>> obtain 8 matrices and 8 systems to solve. However, I need to > prepare only > > >>>>>> one matrix, but launch this code in parallel on 8 processors. > > >>>>>> > > >>>>> > > >>>>> When you create the Mat object, you give it a communicator (here > > >>>>> PETSC_COMM_WORLD). That allows us to distribute the data. This is > all > > >>>>> covered extensively in the manual and the online tutorials, as > well as the > > >>>>> example code. > > >>>>> > > >>>>> > > >>>>>> In fact, here attached you will find a similar code > (scipy_code.py) > > >>>>>> with only one difference: the system of linear equations is > solved with > > >>>>>> scipy. So when I solve it, I can clearly see that the solution is > obtained > > >>>>>> in a parallel way. However, I do not use the command mpirun (or > mpiexec). I > > >>>>>> just go with python3 scipy_code.py. > > >>>>>> > > >>>>> > > >>>>> Why do you think its running in parallel? > > >>>>> > > >>>>> Thanks, > > >>>>> > > >>>>> Matt > > >>>>> > > >>>>> > > >>>>>> In this case, the first part (creation of the sparse matrix) is > not > > >>>>>> parallel, whereas the solution of system is found in a parallel > way. > > >>>>>> So my question is, Do you think that it s possible to have the > same > > >>>>>> behavior with PETSC? And what do I need for this? > > >>>>>> > > >>>>>> I am asking this because for my colleague it worked! It means > that he > > >>>>>> launches the simple_code.py on his computer using the command > python3 > > >>>>>> simple_code.py (and not mpi-smth python3 simple_code.py) and he > obtains a > > >>>>>> parallel execution of the same code. > > >>>>>> > > >>>>>> Thanks for your help! > > >>>>>> Ivan > > >>>>>> > > >>>>>> > > >>>>>> On Thu, Nov 15, 2018 at 11:54 AM Matthew Knepley < > knepley at gmail.com> > > >>>>>> wrote: > > >>>>>> > > >>>>>>> On Thu, Nov 15, 2018 at 4:53 AM Ivan Voznyuk via petsc-users < > > >>>>>>> petsc-users at mcs.anl.gov> wrote: > > >>>>>>> > > >>>>>>>> Dear PETSC community, > > >>>>>>>> > > >>>>>>>> I have a question regarding the parallel execution of petsc4py. > > >>>>>>>> > > >>>>>>>> I have a simple code (here attached simple_code.py) which > solves a > > >>>>>>>> system of linear equations Ax=b using petsc4py. To execute it, > I use the > > >>>>>>>> command python3 simple_code.py which yields a sequential > performance. 
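Two remarks that may help narrow this down. First, note that the LD_PRELOAD
line above preloads libmkl_sequential.so; if that is really what gets
loaded, it would by itself pin MKL to the sequential threading layer, so
preloading libmkl_intel_thread.so may be what was intended. Second, it is
worth separating the two kinds of parallelism with a C equivalent of the
Get_rank()/Get_size() check from earlier in the thread - a minimal
sanity-check sketch (the binary name ./check below is made up):

  #include <petscsys.h>

  int main(int argc,char **argv)
  {
    PetscErrorCode ierr;
    PetscMPIInt    size;

    ierr = PetscInitialize(&argc,&argv,NULL,NULL);if (ierr) return ierr;
    ierr = MPI_Comm_size(PETSC_COMM_WORLD,&size);CHKERRQ(ierr);
    /* printed once, on rank 0 of the communicator */
    ierr = PetscPrintf(PETSC_COMM_WORLD,"MPI size: %d\n",size);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
  }

If "mpiexec -n 8 ./check" prints MPI size 8, the MPI side of the
installation is fine, and any remaining speedup (or its absence) inside
MUMPS has to come from the BLAS threading.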
From balay at mcs.anl.gov  Fri Nov 16 12:10:08 2018
From: balay at mcs.anl.gov (Balay, Satish)
Date: Fri, 16 Nov 2018 18:10:08 +0000
Subject: [petsc-users] petsc4py help with parallel execution
In-Reply-To:
References:
Message-ID:

On Fri, 16 Nov 2018, Ivan Voznyuk via petsc-users wrote:

> Hi Satish,
> Thanks for your reply.
>
> Bad news... I tested 2 solutions that you proposed, none has worked.
>
> 1.
> --with-blaslapack-dir=/opt/intel/mkl
> --with-mkl_pardiso-dir=/opt/intel/mkl installed well, without any
> problems. However, the code is still running sequentially.

Are you using Intel compilers? please send configure.log for this.

Satish

> 2. When I changed -lmkl_sequential to -lmkl_intel_thread -liomp5, it at
> first did not find libiomp5, so I had to create a symbolic link of
> libiomp5.so to /lib.
> [...]
So when I solve it, I can clearly see that the solution is > > obtained > > > >>>>>> in a parallel way. However, I do not use the command mpirun (or > > mpiexec). I > > > >>>>>> just go with python3 scipy_code.py. > > > >>>>>> > > > >>>>> > > > >>>>> Why do you think its running in parallel? > > > >>>>> > > > >>>>> Thanks, > > > >>>>> > > > >>>>> Matt > > > >>>>> > > > >>>>> > > > >>>>>> In this case, the first part (creation of the sparse matrix) is > > not > > > >>>>>> parallel, whereas the solution of system is found in a parallel > > way. > > > >>>>>> So my question is, Do you think that it s possible to have the > > same > > > >>>>>> behavior with PETSC? And what do I need for this? > > > >>>>>> > > > >>>>>> I am asking this because for my colleague it worked! It means > > that he > > > >>>>>> launches the simple_code.py on his computer using the command > > python3 > > > >>>>>> simple_code.py (and not mpi-smth python3 simple_code.py) and he > > obtains a > > > >>>>>> parallel execution of the same code. > > > >>>>>> > > > >>>>>> Thanks for your help! > > > >>>>>> Ivan > > > >>>>>> > > > >>>>>> > > > >>>>>> On Thu, Nov 15, 2018 at 11:54 AM Matthew Knepley < > > knepley at gmail.com> > > > >>>>>> wrote: > > > >>>>>> > > > >>>>>>> On Thu, Nov 15, 2018 at 4:53 AM Ivan Voznyuk via petsc-users < > > > >>>>>>> petsc-users at mcs.anl.gov> wrote: > > > >>>>>>> > > > >>>>>>>> Dear PETSC community, > > > >>>>>>>> > > > >>>>>>>> I have a question regarding the parallel execution of petsc4py. > > > >>>>>>>> > > > >>>>>>>> I have a simple code (here attached simple_code.py) which > > solves a > > > >>>>>>>> system of linear equations Ax=b using petsc4py. To execute it, > > I use the > > > >>>>>>>> command python3 simple_code.py which yields a sequential > > performance. With > > > >>>>>>>> a colleague of my, we launched this code on his computer, and > > this time the > > > >>>>>>>> execution was in parallel. Although, he used the same command > > python3 > > > >>>>>>>> simple_code.py (without mpirun, neither mpiexec). > > > >>>>>>>> > > > >>>>>>> I am not sure what you mean. To run MPI programs in parallel, you > > > >>>>>>> need a launcher like mpiexec or mpirun. There are Python > > programs (like > > > >>>>>>> nemesis) that use the launcher API directly (called PMI), but > > that is not > > > >>>>>>> part of petsc4py. > > > >>>>>>> > > > >>>>>>> Thanks, > > > >>>>>>> > > > >>>>>>> Matt > > > >>>>>>> > > > >>>>>>>> My configuration: Ubuntu x86_64 Ubuntu 16.04, Intel Core i7, > > PETSc > > > >>>>>>>> 3.10.2, PETSC_ARCH=arch-linux2-c-debug, petsc4py 3.10.0 in > > virtualenv > > > >>>>>>>> > > > >>>>>>>> In order to parallelize it, I have already tried: > > > >>>>>>>> - use 2 different PCs > > > >>>>>>>> - use Ubuntu 16.04, 18.04 > > > >>>>>>>> - use different architectures (arch-linux2-c-debug, > > > >>>>>>>> linux-gnu-c-debug, etc) > > > >>>>>>>> - ofc use different configurations (my present config can be > > found > > > >>>>>>>> in make.log that I attached here) > > > >>>>>>>> - mpi from mpich, openmpi > > > >>>>>>>> > > > >>>>>>>> Nothing worked. > > > >>>>>>>> > > > >>>>>>>> Do you have any ideas? 
> > > >>>>>>>> > > > >>>>>>>> Thanks and have a good day, > > > >>>>>>>> Ivan > > > >>>>>>>> > > > >>>>>>>> -- > > > >>>>>>>> Ivan VOZNYUK > > > >>>>>>>> PhD in Computational Electromagnetics > > > >>>>>>>> > > > >>>>>>> > > > >>>>>>> > > > >>>>>>> -- > > > >>>>>>> What most experimenters take for granted before they begin their > > > >>>>>>> experiments is infinitely more interesting than any results to > > which their > > > >>>>>>> experiments lead. > > > >>>>>>> -- Norbert Wiener > > > >>>>>>> > > > >>>>>>> https://www.cse.buffalo.edu/~knepley/ > > > >>>>>>> > > > >>>>>>> > > > >>>>>> > > > >>>>>> > > > >>>>>> -- > > > >>>>>> Ivan VOZNYUK > > > >>>>>> PhD in Computational Electromagnetics > > > >>>>>> +33 (0)6.95.87.04.55 > > > >>>>>> My webpage > > > >>>>>> My LinkedIn > > > >>>>>> > > > >>>>> > > > >>>>> > > > >>>>> -- > > > >>>>> What most experimenters take for granted before they begin their > > > >>>>> experiments is infinitely more interesting than any results to > > which their > > > >>>>> experiments lead. > > > >>>>> -- Norbert Wiener > > > >>>>> > > > >>>>> https://www.cse.buffalo.edu/~knepley/ > > > >>>>> > > > >>>>> > > > >>>> > > > >>>> > > > >>>> -- > > > >>>> Ivan VOZNYUK > > > >>>> PhD in Computational Electromagnetics > > > >>>> +33 (0)6.95.87.04.55 > > > >>>> My webpage > > > >>>> My LinkedIn > > > >>>> > > > >>> > > > >>> > > > >>> -- > > > >>> What most experimenters take for granted before they begin their > > > >>> experiments is infinitely more interesting than any results to which > > their > > > >>> experiments lead. > > > >>> -- Norbert Wiener > > > >>> > > > >>> https://www.cse.buffalo.edu/~knepley/ > > > >>> > > > >>> > > > >> > > > > > > > > > > > > From balay at mcs.anl.gov Fri Nov 16 12:13:36 2018 From: balay at mcs.anl.gov (Balay, Satish) Date: Fri, 16 Nov 2018 18:13:36 +0000 Subject: [petsc-users] petsc4py help with parallel execution In-Reply-To: References: Message-ID: Likely this build is with gnu compilers. https://software.intel.com/en-us/articles/intel-mkl-link-line-advisor/ The above suggests - the following might work: '--with-blaslapack-lib=-L/opt/intel/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_gnu_thread -lmkl_core -lgomp -lpthread -lm -ldl' Satish On Fri, 16 Nov 2018, Balay, Satish via petsc-users wrote: > On Fri, 16 Nov 2018, Ivan Voznyuk via petsc-users wrote: > > > Hi Satish, > > Thanks for your reply. > > > > Bad news... I tested 2 solutions that you proposed, none has worked. > > > > 1. --with-blaslapack-dir=/opt/intel/mkl > > --with-mkl_pardiso-dir=/opt/intel/mkl installed well, without any problems. > > However, the code is still turning in sequential way. > > Are you using Intel compilers? > > please send configure.log for this. > > Satish > > > 2. When I changed -lmkl_sequential to -lmkl_intel_thread -liomp, he at > > first did not find the liomp, so I had to create a symbolic link of libiomp5.so > > to /lib. > > At the launching of the .py code I had to go with: > > export > > LD_PRELOAD=/opt/intel/mkl/lib/intel64/libmkl_core.so:/opt/intel/mkl/lib/intel64/libmkl_sequential.so > > and > > export LD_LIBRARY_PATH=/opt/petsc/petsc1/arch-linux2-c-debug/lib/ > > > > But still it does not solve the given problem and code is still running > > sequentially... > > > > May be you have some other ideas? > > > > Thanks, > > Ivan > > > > > > > > > > On Fri, Nov 16, 2018 at 6:11 PM Balay, Satish wrote: > > > > > Yes PETSc prefers sequential MKL - as MPI handles parallelism. 
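Satish's point is the crux of the thread: a PETSc program is expected to get its speedup from MPI ranks, not from BLAS threads. For concreteness, here is a minimal petsc4py sketch of that MPI route; the file name, the 1-D Laplacian fill, and the problem size are illustrative assumptions, not taken from the poster's simple_code.py. Launched under mpiexec, each rank assembles only its own rows of the distributed matrix:

    # mpi_sketch.py -- a hedged sketch, not the poster's code.
    # Run with: $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 8 python3 mpi_sketch.py
    import sys
    import petsc4py
    petsc4py.init(sys.argv)
    from petsc4py import PETSc

    n = 1000
    # Tridiagonal 1-D Laplacian distributed over COMM_WORLD, preallocated
    # with 3 nonzeros per row in the diagonal block and 1 in the
    # off-diagonal block.
    A = PETSc.Mat().createAIJ([n, n], nnz=(3, 1), comm=PETSc.COMM_WORLD)
    rstart, rend = A.getOwnershipRange()   # this rank's rows only
    for i in range(rstart, rend):
        A.setValue(i, i, 2.0)
        if i > 0:
            A.setValue(i, i - 1, -1.0)
        if i < n - 1:
            A.setValue(i, i + 1, -1.0)
    A.assemble()

    x, b = A.createVecs()
    b.set(1.0)

    ksp = PETSc.KSP().create(comm=PETSc.COMM_WORLD)
    ksp.setOperators(A)
    ksp.setFromOptions()                   # honors -ksp_view, -ksp_type, ...
    ksp.solve(b, x)

Run it once with -ksp_view, as Matt suggests elsewhere in the thread, to confirm which solver ran and on how many processes.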
> > > > > > One way to trick petsc configure to use threaded MKL is to enable pardiso. > > > i.e: > > > > > > --with-blaslapack-dir=/opt/intel/mkl --with-mkl_pardiso-dir=/opt/intel/mkl > > > > > > > > > http://ftp.mcs.anl.gov/pub/petsc/nightlylogs/archive/2018/11/15/configure_master_arch-pardiso_grind.log > > > > > > BLAS/LAPACK: -Wl,-rpath,/soft/com/packages/intel/16/u3/mkl/lib/intel64 > > > -L/soft/com/packages/intel/16/u3/mkl/lib/intel64 -lmkl_intel_lp64 > > > -lmkl_core -lmkl_intel_thread -liomp5 -ldl -lpthread > > > > > > Or you can manually specify the correct MKL library list [with > > > threading] via --with-blaslapack-lib option. > > > > > > Satish > > > > > > On Fri, 16 Nov 2018, Ivan Voznyuk via petsc-users wrote: > > > > > > > Hi, > > > > You were totally right: no miracle, parallelization does come from > > > > multithreading. We checked Option 1/: played with OMP_NUM_THREADS=1 it > > > > changed computational time. > > > > > > > > So, I reinstalled everything (starting with Ubuntu ending with petsc) and > > > > configured the following things: > > > > > > > > - installed system's ompenmpi > > > > - installed Intel MKL Blas / Lapack > > > > - configured PETSC as ./configure --with-cc=mpicc --with-fc=mpif90 > > > > --with-cxx=mpicxx --with-blas-lapack-dir=/opt/intel/mkl/lib/intel64 > > > > --download-scalapack --download-mumps --with-hwloc --with-shared > > > > --with-openmp=1 --with-pthread=1 --with-scalar-type=complex > > > > hoping that it would take into account blas multithreading > > > > - installed petsc4py > > > > > > > > However, I do not get any parallelization... > > > > What I tried to do so far unsuccessfully : > > > > - play with OMP_NUM_THREADS > > > > - reinstall the system > > > > - ldd PETSc.cpython-35m-x86_64-linux-gnu.so yields lld_result.txt (here > > > > attached) > > > > I noted that libmkl_sequential.so library there. Do you think this is > > > > normal? > > > > - I found a similar problem reported here: > > > > https://lists.mcs.anl.gov/pipermail/petsc-users/2016-March/028803.html > > > To > > > > solve this problem, developers recommended to replace -lmkl_sequential to > > > > -lmkl_intel_thread options in PETSC_ARCH/lib/conf/petscvariables. > > > However, > > > > I did not find something that would be named like this (it might be a > > > > change of version) > > > > - Anyway, I replaced lmkl_sequential to lmkl_intel_thread in every file > > > of > > > > PETSC, but it changed nothing. > > > > > > > > As a result, in the new make.log (here attached ) I have a parameter > > > > #define PETSC_HAVE_LIBMKL_SEQUENTIAL 1 and option -lmkl_sequential > > > > > > > > Do you have any idea of what I should change in the initial options in > > > > order to obtain the blas multithreding parallelization? > > > > > > > > Thanks a lot for your help! > > > > > > > > Ivan > > > > > > > > > > > > > > > > > > > > > > > > > > > > On Fri, Nov 16, 2018 at 1:25 AM Dave May > > > wrote: > > > > > > > > > > > > > > > > > > > On Thu, 15 Nov 2018 at 17:44, Ivan via petsc-users < > > > > > petsc-users at mcs.anl.gov> wrote: > > > > > > > > > >> Hi Stefano, > > > > >> > > > > >> In fact, yes, we look at the htop output (and the resulting > > > computational > > > > >> time ofc). > > > > >> > > > > >> In our code we use MUMPS, which indeed depends on blas / lapack. So I > > > > >> think this might be it! > > > > >> > > > > >> I will definetely check it (I mean the difference between our MUMPS, > > > > >> blas, lapack). 
> > > > >> > > > > >> If you have an idea of how we can verify on his PC that the source of > > > his > > > > >> parallelization does come from BLAS, please do not hesitate to tell > > > me! > > > > >> > > > > > > > > > > Option 1/ > > > > > * Set this environment variable > > > > > export OMP_NUM_THREADS=1 > > > > > * Re-run your "parallel" test. > > > > > * If the performance differs (job runs slower) compared with your > > > previous > > > > > run where you inferred parallelism was being employed, you can safely > > > > > assume that the parallelism observed comes from threads > > > > > > > > > > Option 2/ > > > > > * Re-configure PETSc to use a known BLAS implementation which does not > > > > > support threads > > > > > * Re-compile PETSc > > > > > * Re-run your parallel test > > > > > * If the performance differs (job runs slower) compared with your > > > previous > > > > > run where you inferred parallelism was being employed, you can safely > > > > > assume that the parallelism observed comes from threads > > > > > > > > > > Option 3/ > > > > > * Use a PC which does not depend on BLAS at all, > > > > > e.g. -pc_type jacobi -pc_type bjacobi > > > > > * If the performance differs (job runs slower) compared with your > > > previous > > > > > run where you inferred parallelism was being employed, you can safely > > > > > assume that the parallelism observed comes from BLAS + threads > > > > > > > > > > > > > > > > > > > >> Thanks! > > > > >> > > > > >> Ivan > > > > >> On 15/11/2018 18:24, Stefano Zampini wrote: > > > > >> > > > > >> If you say your program is parallel by just looking at the output from > > > > >> the top command, you are probably linking against a multithreaded blas > > > > >> library > > > > >> > > > > >> Il giorno Gio 15 Nov 2018, 20:09 Matthew Knepley via petsc-users < > > > > >> petsc-users at mcs.anl.gov> ha scritto: > > > > >> > > > > >>> On Thu, Nov 15, 2018 at 11:59 AM Ivan Voznyuk < > > > > >>> ivan.voznyuk.work at gmail.com> wrote: > > > > >>> > > > > >>>> Hi Matthew, > > > > >>>> > > > > >>>> Does it mean that by using just command python3 simple_code.py > > > (without > > > > >>>> mpiexec) you *cannot* obtain a parallel execution? > > > > >>>> > > > > >>> > > > > >>> As I wrote before, its not impossible. You could be directly calling > > > > >>> PMI, but I do not think you are doing that. > > > > >>> > > > > >>> > > > > >>>> It s been 5 days we are trying to understand with my colleague how > > > he > > > > >>>> managed to do so. > > > > >>>> It means that by using simply python3 simple_code.py he gets 8 > > > > >>>> processors workiing. > > > > >>>> By the way, we wrote in his code few lines: > > > > >>>> rank = PETSc.COMM_WORLD.Get_rank() > > > > >>>> size = PETSc.COMM_WORLD.Get_size() > > > > >>>> and we got rank = 0, size = 1 > > > > >>>> > > > > >>> > > > > >>> This is MPI telling you that you are only running on 1 processes. > > > > >>> > > > > >>> > > > > >>>> However, we compilator arrives to KSP.solve(), somehow it turns on 8 > > > > >>>> processors. > > > > >>>> > > > > >>> > > > > >>> Why do you think its running on 8 processes? 
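One direct way to answer that question is to print the MPI layout and the threading hints from inside the script itself, which is also what Dave asks for below. A small diagnostic sketch (illustrative, not from the thread): size == 1 here, combined with 8 busy cores in htop, points at threaded BLAS rather than MPI parallelism.

    import os
    from petsc4py import PETSc   # importing PETSc auto-initializes MPI

    rank = PETSc.COMM_WORLD.Get_rank()
    size = PETSc.COMM_WORLD.Get_size()
    print("MPI rank %d of %d" % (rank, size))
    print("OMP_NUM_THREADS =", os.environ.get("OMP_NUM_THREADS", "<unset>"))
    print("cores visible to this process:", os.cpu_count())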
> > > > >>> > > > > >>> > > > > >>>> This problem is solved on his PC in 5-8 sec (in parallel, using > > > *python3 > > > > >>>> simple_code.py*), on mine it takes 70-90 secs (in sequantial, but > > > with > > > > >>>> the same command *python3 simple_code.py*) > > > > >>>> > > > > >>> > > > > >>> I think its much more likely that there are differences in the solver > > > > >>> (use -ksp_view to see exactly what solver was used), then > > > > >>> to think it is parallelism. Moreover, you would never ever ever see > > > that > > > > >>> much speedup on a laptop since all these computations > > > > >>> are bandwidth limited. > > > > >>> > > > > >>> Thanks, > > > > >>> > > > > >>> Matt > > > > >>> > > > > >>> > > > > >>>> So, conclusion is that on his computer this code works in the same > > > way > > > > >>>> as scipy: all the code is executed in sequantial mode, but when it > > > comes to > > > > >>>> solution of system of linear equations, it runs on all available > > > > >>>> processors. All this with just running python3 my_code.py (without > > > any > > > > >>>> mpi-smth) > > > > >>>> > > > > >>>> Is it an exception / abnormal behavior? I mean, is it something > > > > >>>> irregular that you, developers, have never seen? > > > > >>>> > > > > >>>> Thanks and have a good evening! > > > > >>>> Ivan > > > > >>>> > > > > >>>> P.S. I don't think I know the answer regarding Scipy... > > > > >>>> > > > > >>>> > > > > >>>> On Thu, Nov 15, 2018 at 2:39 PM Matthew Knepley > > > > >>>> wrote: > > > > >>>> > > > > >>>>> On Thu, Nov 15, 2018 at 8:07 AM Ivan Voznyuk < > > > > >>>>> ivan.voznyuk.work at gmail.com> wrote: > > > > >>>>> > > > > >>>>>> Hi Matthew, > > > > >>>>>> Thanks for your reply! > > > > >>>>>> > > > > >>>>>> Let me precise what I mean by defining few questions: > > > > >>>>>> > > > > >>>>>> 1. In order to obtain a parallel execution of simple_code.py, do I > > > > >>>>>> need to go with mpiexec python3 simple_code.py, or I can just > > > launch > > > > >>>>>> python3 simple_code.py? > > > > >>>>>> > > > > >>>>> > > > > >>>>> mpiexec -n 2 python3 simple_code.py > > > > >>>>> > > > > >>>>> > > > > >>>>>> 2. This simple_code.py consists of 2 parts: a) preparation of > > > matrix > > > > >>>>>> b) solving the system of linear equations with PETSc. If I launch > > > mpirun > > > > >>>>>> (or mpiexec) -np 8 python3 simple_code.py, I suppose that I will > > > basically > > > > >>>>>> obtain 8 matrices and 8 systems to solve. However, I need to > > > prepare only > > > > >>>>>> one matrix, but launch this code in parallel on 8 processors. > > > > >>>>>> > > > > >>>>> > > > > >>>>> When you create the Mat object, you give it a communicator (here > > > > >>>>> PETSC_COMM_WORLD). That allows us to distribute the data. This is > > > all > > > > >>>>> covered extensively in the manual and the online tutorials, as > > > well as the > > > > >>>>> example code. > > > > >>>>> > > > > >>>>> > > > > >>>>>> In fact, here attached you will find a similar code > > > (scipy_code.py) > > > > >>>>>> with only one difference: the system of linear equations is > > > solved with > > > > >>>>>> scipy. So when I solve it, I can clearly see that the solution is > > > obtained > > > > >>>>>> in a parallel way. However, I do not use the command mpirun (or > > > mpiexec). I > > > > >>>>>> just go with python3 scipy_code.py. > > > > >>>>>> > > > > >>>>> > > > > >>>>> Why do you think its running in parallel? 
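Dave's Option 1, quoted above, can be packaged as a small timing experiment. A hedged sketch with made-up sizes, using a dense NumPy solve because that routes straight into the BLAS/LAPACK in question; run it twice, first as OMP_NUM_THREADS=1 python3 time_solve.py and then as plain python3 time_solve.py, and compare the wall times:

    import os
    import time
    import numpy as np

    print("OMP_NUM_THREADS =", os.environ.get("OMP_NUM_THREADS", "<unset>"))
    n = 3000
    A = np.random.rand(n, n) + n * np.eye(n)  # diagonally dominant, safe to solve
    b = np.random.rand(n)
    t0 = time.time()
    x = np.linalg.solve(A, b)                 # LAPACK call -> BLAS threads, if any
    print("solve took %.2f s" % (time.time() - t0))

If only the unpinned run keeps all cores busy, the observed "parallelism" is BLAS threading, not MPI.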
> > > > >>>>> > > > > >>>>> Thanks, > > > > >>>>> > > > > >>>>> Matt > > > > >>>>> > > > > >>>>> > > > > >>>>>> In this case, the first part (creation of the sparse matrix) is > > > not > > > > >>>>>> parallel, whereas the solution of system is found in a parallel > > > way. > > > > >>>>>> So my question is, Do you think that it s possible to have the > > > same > > > > >>>>>> behavior with PETSC? And what do I need for this? > > > > >>>>>> > > > > >>>>>> I am asking this because for my colleague it worked! It means > > > that he > > > > >>>>>> launches the simple_code.py on his computer using the command > > > python3 > > > > >>>>>> simple_code.py (and not mpi-smth python3 simple_code.py) and he > > > obtains a > > > > >>>>>> parallel execution of the same code. > > > > >>>>>> > > > > >>>>>> Thanks for your help! > > > > >>>>>> Ivan > > > > >>>>>> > > > > >>>>>> > > > > >>>>>> On Thu, Nov 15, 2018 at 11:54 AM Matthew Knepley < > > > knepley at gmail.com> > > > > >>>>>> wrote: > > > > >>>>>> > > > > >>>>>>> On Thu, Nov 15, 2018 at 4:53 AM Ivan Voznyuk via petsc-users < > > > > >>>>>>> petsc-users at mcs.anl.gov> wrote: > > > > >>>>>>> > > > > >>>>>>>> Dear PETSC community, > > > > >>>>>>>> > > > > >>>>>>>> I have a question regarding the parallel execution of petsc4py. > > > > >>>>>>>> > > > > >>>>>>>> I have a simple code (here attached simple_code.py) which > > > solves a > > > > >>>>>>>> system of linear equations Ax=b using petsc4py. To execute it, > > > I use the > > > > >>>>>>>> command python3 simple_code.py which yields a sequential > > > performance. With > > > > >>>>>>>> a colleague of my, we launched this code on his computer, and > > > this time the > > > > >>>>>>>> execution was in parallel. Although, he used the same command > > > python3 > > > > >>>>>>>> simple_code.py (without mpirun, neither mpiexec). > > > > >>>>>>>> > > > > >>>>>>> I am not sure what you mean. To run MPI programs in parallel, you > > > > >>>>>>> need a launcher like mpiexec or mpirun. There are Python > > > programs (like > > > > >>>>>>> nemesis) that use the launcher API directly (called PMI), but > > > that is not > > > > >>>>>>> part of petsc4py. > > > > >>>>>>> > > > > >>>>>>> Thanks, > > > > >>>>>>> > > > > >>>>>>> Matt > > > > >>>>>>> > > > > >>>>>>>> My configuration: Ubuntu x86_64 Ubuntu 16.04, Intel Core i7, > > > PETSc > > > > >>>>>>>> 3.10.2, PETSC_ARCH=arch-linux2-c-debug, petsc4py 3.10.0 in > > > virtualenv > > > > >>>>>>>> > > > > >>>>>>>> In order to parallelize it, I have already tried: > > > > >>>>>>>> - use 2 different PCs > > > > >>>>>>>> - use Ubuntu 16.04, 18.04 > > > > >>>>>>>> - use different architectures (arch-linux2-c-debug, > > > > >>>>>>>> linux-gnu-c-debug, etc) > > > > >>>>>>>> - ofc use different configurations (my present config can be > > > found > > > > >>>>>>>> in make.log that I attached here) > > > > >>>>>>>> - mpi from mpich, openmpi > > > > >>>>>>>> > > > > >>>>>>>> Nothing worked. > > > > >>>>>>>> > > > > >>>>>>>> Do you have any ideas? > > > > >>>>>>>> > > > > >>>>>>>> Thanks and have a good day, > > > > >>>>>>>> Ivan > > > > >>>>>>>> > > > > >>>>>>>> -- > > > > >>>>>>>> Ivan VOZNYUK > > > > >>>>>>>> PhD in Computational Electromagnetics > > > > >>>>>>>> > > > > >>>>>>> > > > > >>>>>>> > > > > >>>>>>> -- > > > > >>>>>>> What most experimenters take for granted before they begin their > > > > >>>>>>> experiments is infinitely more interesting than any results to > > > which their > > > > >>>>>>> experiments lead. 
> > > > >>>>>>> -- Norbert Wiener > > > > >>>>>>> > > > > >>>>>>> https://www.cse.buffalo.edu/~knepley/ > > > > >>>>>>> > > > > >>>>>>> > > > > >>>>>> > > > > >>>>>> > > > > >>>>>> -- > > > > >>>>>> Ivan VOZNYUK > > > > >>>>>> PhD in Computational Electromagnetics > > > > >>>>>> +33 (0)6.95.87.04.55 > > > > >>>>>> My webpage > > > > >>>>>> My LinkedIn > > > > >>>>>> > > > > >>>>> > > > > >>>>> > > > > >>>>> -- > > > > >>>>> What most experimenters take for granted before they begin their > > > > >>>>> experiments is infinitely more interesting than any results to > > > which their > > > > >>>>> experiments lead. > > > > >>>>> -- Norbert Wiener > > > > >>>>> > > > > >>>>> https://www.cse.buffalo.edu/~knepley/ > > > > >>>>> > > > > >>>>> > > > > >>>> > > > > >>>> > > > > >>>> -- > > > > >>>> Ivan VOZNYUK > > > > >>>> PhD in Computational Electromagnetics > > > > >>>> +33 (0)6.95.87.04.55 > > > > >>>> My webpage > > > > >>>> My LinkedIn > > > > >>>> > > > > >>> > > > > >>> > > > > >>> -- > > > > >>> What most experimenters take for granted before they begin their > > > > >>> experiments is infinitely more interesting than any results to which > > > their > > > > >>> experiments lead. > > > > >>> -- Norbert Wiener > > > > >>> > > > > >>> https://www.cse.buffalo.edu/~knepley/ > > > > >>> > > > > >>> > > > > >> > > > > > > > > > > > > > > > > > > > From dave.mayhem23 at gmail.com Fri Nov 16 12:18:40 2018 From: dave.mayhem23 at gmail.com (Dave May) Date: Fri, 16 Nov 2018 19:18:40 +0100 Subject: [petsc-users] petsc4py help with parallel execution In-Reply-To: References: Message-ID: On Fri, 16 Nov 2018 at 19:02, Ivan Voznyuk wrote: > Hi Satish, > Thanks for your reply. > > Bad news... I tested 2 solutions that you proposed, none has worked. > You don't still have OMP_NUM_THREADS=1 set in your environment do you? Can you print the value of this env variable from within your python code and confirm it's not 1 > > 1. --with-blaslapack-dir=/opt/intel/mkl > --with-mkl_pardiso-dir=/opt/intel/mkl installed well, without any problems. > However, the code is still turning in sequential way. > 2. When I changed -lmkl_sequential to -lmkl_intel_thread -liomp, he at > first did not find the liomp, so I had to create a symbolic link of libiomp5.so > to /lib. > At the launching of the .py code I had to go with: > export > LD_PRELOAD=/opt/intel/mkl/lib/intel64/libmkl_core.so:/opt/intel/mkl/lib/intel64/libmkl_sequential.so > and > export LD_LIBRARY_PATH=/opt/petsc/petsc1/arch-linux2-c-debug/lib/ > > But still it does not solve the given problem and code is still running > sequentially... > > May be you have some other ideas? > > Thanks, > Ivan > > > > > On Fri, Nov 16, 2018 at 6:11 PM Balay, Satish wrote: > >> Yes PETSc prefers sequential MKL - as MPI handles parallelism. >> >> One way to trick petsc configure to use threaded MKL is to enable >> pardiso. i.e: >> >> --with-blaslapack-dir=/opt/intel/mkl --with-mkl_pardiso-dir=/opt/intel/mkl >> >> >> http://ftp.mcs.anl.gov/pub/petsc/nightlylogs/archive/2018/11/15/configure_master_arch-pardiso_grind.log >> >> BLAS/LAPACK: -Wl,-rpath,/soft/com/packages/intel/16/u3/mkl/lib/intel64 >> -L/soft/com/packages/intel/16/u3/mkl/lib/intel64 -lmkl_intel_lp64 >> -lmkl_core -lmkl_intel_thread -liomp5 -ldl -lpthread >> >> Or you can manually specify the correct MKL library list [with >> threading] via --with-blaslapack-lib option. 
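The manual "ldd PETSc...so" check from earlier in the thread can also be scripted, to confirm which MKL layer a given petsc4py build really linked. A sketch that assumes a Linux machine with ldd on the PATH:

    import subprocess
    from petsc4py import PETSc

    so = PETSc.__file__                     # path of the compiled PETSc extension
    out = subprocess.check_output(["ldd", so]).decode()
    for line in out.splitlines():
        if any(tag in line for tag in ("mkl", "iomp", "gomp")):
            print(line.strip())

A line naming libmkl_sequential.so means this build will not thread inside BLAS, no matter what the environment variables say.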
>> >> Satish >> >> On Fri, 16 Nov 2018, Ivan Voznyuk via petsc-users wrote: >> >> > Hi, >> > You were totally right: no miracle, parallelization does come from >> > multithreading. We checked Option 1/: played with OMP_NUM_THREADS=1 it >> > changed computational time. >> > >> > So, I reinstalled everything (starting with Ubuntu ending with petsc) >> and >> > configured the following things: >> > >> > - installed system's ompenmpi >> > - installed Intel MKL Blas / Lapack >> > - configured PETSC as ./configure --with-cc=mpicc --with-fc=mpif90 >> > --with-cxx=mpicxx --with-blas-lapack-dir=/opt/intel/mkl/lib/intel64 >> > --download-scalapack --download-mumps --with-hwloc --with-shared >> > --with-openmp=1 --with-pthread=1 --with-scalar-type=complex >> > hoping that it would take into account blas multithreading >> > - installed petsc4py >> > >> > However, I do not get any parallelization... >> > What I tried to do so far unsuccessfully : >> > - play with OMP_NUM_THREADS >> > - reinstall the system >> > - ldd PETSc.cpython-35m-x86_64-linux-gnu.so yields lld_result.txt (here >> > attached) >> > I noted that libmkl_sequential.so library there. Do you think this is >> > normal? >> > - I found a similar problem reported here: >> > https://lists.mcs.anl.gov/pipermail/petsc-users/2016-March/028803.html >> To >> > solve this problem, developers recommended to replace -lmkl_sequential >> to >> > -lmkl_intel_thread options in PETSC_ARCH/lib/conf/petscvariables. >> However, >> > I did not find something that would be named like this (it might be a >> > change of version) >> > - Anyway, I replaced lmkl_sequential to lmkl_intel_thread in every file >> of >> > PETSC, but it changed nothing. >> > >> > As a result, in the new make.log (here attached ) I have a parameter >> > #define PETSC_HAVE_LIBMKL_SEQUENTIAL 1 and option -lmkl_sequential >> > >> > Do you have any idea of what I should change in the initial options in >> > order to obtain the blas multithreding parallelization? >> > >> > Thanks a lot for your help! >> > >> > Ivan >> > >> > >> > >> > >> > >> > >> > On Fri, Nov 16, 2018 at 1:25 AM Dave May >> wrote: >> > >> > > >> > > >> > > On Thu, 15 Nov 2018 at 17:44, Ivan via petsc-users < >> > > petsc-users at mcs.anl.gov> wrote: >> > > >> > >> Hi Stefano, >> > >> >> > >> In fact, yes, we look at the htop output (and the resulting >> computational >> > >> time ofc). >> > >> >> > >> In our code we use MUMPS, which indeed depends on blas / lapack. So I >> > >> think this might be it! >> > >> >> > >> I will definetely check it (I mean the difference between our MUMPS, >> > >> blas, lapack). >> > >> >> > >> If you have an idea of how we can verify on his PC that the source >> of his >> > >> parallelization does come from BLAS, please do not hesitate to tell >> me! >> > >> >> > > >> > > Option 1/ >> > > * Set this environment variable >> > > export OMP_NUM_THREADS=1 >> > > * Re-run your "parallel" test. 
>> > > * If the performance differs (job runs slower) compared with your >> previous >> > > run where you inferred parallelism was being employed, you can safely >> > > assume that the parallelism observed comes from threads >> > > >> > > Option 2/ >> > > * Re-configure PETSc to use a known BLAS implementation which does not >> > > support threads >> > > * Re-compile PETSc >> > > * Re-run your parallel test >> > > * If the performance differs (job runs slower) compared with your >> previous >> > > run where you inferred parallelism was being employed, you can safely >> > > assume that the parallelism observed comes from threads >> > > >> > > Option 3/ >> > > * Use a PC which does not depend on BLAS at all, >> > > e.g. -pc_type jacobi -pc_type bjacobi >> > > * If the performance differs (job runs slower) compared with your >> previous >> > > run where you inferred parallelism was being employed, you can safely >> > > assume that the parallelism observed comes from BLAS + threads >> > > >> > > >> > > >> > >> Thanks! >> > >> >> > >> Ivan >> > >> On 15/11/2018 18:24, Stefano Zampini wrote: >> > >> >> > >> If you say your program is parallel by just looking at the output >> from >> > >> the top command, you are probably linking against a multithreaded >> blas >> > >> library >> > >> >> > >> Il giorno Gio 15 Nov 2018, 20:09 Matthew Knepley via petsc-users < >> > >> petsc-users at mcs.anl.gov> ha scritto: >> > >> >> > >>> On Thu, Nov 15, 2018 at 11:59 AM Ivan Voznyuk < >> > >>> ivan.voznyuk.work at gmail.com> wrote: >> > >>> >> > >>>> Hi Matthew, >> > >>>> >> > >>>> Does it mean that by using just command python3 simple_code.py >> (without >> > >>>> mpiexec) you *cannot* obtain a parallel execution? >> > >>>> >> > >>> >> > >>> As I wrote before, its not impossible. You could be directly calling >> > >>> PMI, but I do not think you are doing that. >> > >>> >> > >>> >> > >>>> It s been 5 days we are trying to understand with my colleague how >> he >> > >>>> managed to do so. >> > >>>> It means that by using simply python3 simple_code.py he gets 8 >> > >>>> processors workiing. >> > >>>> By the way, we wrote in his code few lines: >> > >>>> rank = PETSc.COMM_WORLD.Get_rank() >> > >>>> size = PETSc.COMM_WORLD.Get_size() >> > >>>> and we got rank = 0, size = 1 >> > >>>> >> > >>> >> > >>> This is MPI telling you that you are only running on 1 processes. >> > >>> >> > >>> >> > >>>> However, we compilator arrives to KSP.solve(), somehow it turns on >> 8 >> > >>>> processors. >> > >>>> >> > >>> >> > >>> Why do you think its running on 8 processes? >> > >>> >> > >>> >> > >>>> This problem is solved on his PC in 5-8 sec (in parallel, using >> *python3 >> > >>>> simple_code.py*), on mine it takes 70-90 secs (in sequantial, but >> with >> > >>>> the same command *python3 simple_code.py*) >> > >>>> >> > >>> >> > >>> I think its much more likely that there are differences in the >> solver >> > >>> (use -ksp_view to see exactly what solver was used), then >> > >>> to think it is parallelism. Moreover, you would never ever ever see >> that >> > >>> much speedup on a laptop since all these computations >> > >>> are bandwidth limited. >> > >>> >> > >>> Thanks, >> > >>> >> > >>> Matt >> > >>> >> > >>> >> > >>>> So, conclusion is that on his computer this code works in the same >> way >> > >>>> as scipy: all the code is executed in sequantial mode, but when it >> comes to >> > >>>> solution of system of linear equations, it runs on all available >> > >>>> processors. 
All this with just running python3 my_code.py (without >> any >> > >>>> mpi-smth) >> > >>>> >> > >>>> Is it an exception / abnormal behavior? I mean, is it something >> > >>>> irregular that you, developers, have never seen? >> > >>>> >> > >>>> Thanks and have a good evening! >> > >>>> Ivan >> > >>>> >> > >>>> P.S. I don't think I know the answer regarding Scipy... >> > >>>> >> > >>>> >> > >>>> On Thu, Nov 15, 2018 at 2:39 PM Matthew Knepley > > >> > >>>> wrote: >> > >>>> >> > >>>>> On Thu, Nov 15, 2018 at 8:07 AM Ivan Voznyuk < >> > >>>>> ivan.voznyuk.work at gmail.com> wrote: >> > >>>>> >> > >>>>>> Hi Matthew, >> > >>>>>> Thanks for your reply! >> > >>>>>> >> > >>>>>> Let me precise what I mean by defining few questions: >> > >>>>>> >> > >>>>>> 1. In order to obtain a parallel execution of simple_code.py, do >> I >> > >>>>>> need to go with mpiexec python3 simple_code.py, or I can just >> launch >> > >>>>>> python3 simple_code.py? >> > >>>>>> >> > >>>>> >> > >>>>> mpiexec -n 2 python3 simple_code.py >> > >>>>> >> > >>>>> >> > >>>>>> 2. This simple_code.py consists of 2 parts: a) preparation of >> matrix >> > >>>>>> b) solving the system of linear equations with PETSc. If I >> launch mpirun >> > >>>>>> (or mpiexec) -np 8 python3 simple_code.py, I suppose that I will >> basically >> > >>>>>> obtain 8 matrices and 8 systems to solve. However, I need to >> prepare only >> > >>>>>> one matrix, but launch this code in parallel on 8 processors. >> > >>>>>> >> > >>>>> >> > >>>>> When you create the Mat object, you give it a communicator (here >> > >>>>> PETSC_COMM_WORLD). That allows us to distribute the data. This is >> all >> > >>>>> covered extensively in the manual and the online tutorials, as >> well as the >> > >>>>> example code. >> > >>>>> >> > >>>>> >> > >>>>>> In fact, here attached you will find a similar code >> (scipy_code.py) >> > >>>>>> with only one difference: the system of linear equations is >> solved with >> > >>>>>> scipy. So when I solve it, I can clearly see that the solution >> is obtained >> > >>>>>> in a parallel way. However, I do not use the command mpirun (or >> mpiexec). I >> > >>>>>> just go with python3 scipy_code.py. >> > >>>>>> >> > >>>>> >> > >>>>> Why do you think its running in parallel? >> > >>>>> >> > >>>>> Thanks, >> > >>>>> >> > >>>>> Matt >> > >>>>> >> > >>>>> >> > >>>>>> In this case, the first part (creation of the sparse matrix) is >> not >> > >>>>>> parallel, whereas the solution of system is found in a parallel >> way. >> > >>>>>> So my question is, Do you think that it s possible to have the >> same >> > >>>>>> behavior with PETSC? And what do I need for this? >> > >>>>>> >> > >>>>>> I am asking this because for my colleague it worked! It means >> that he >> > >>>>>> launches the simple_code.py on his computer using the command >> python3 >> > >>>>>> simple_code.py (and not mpi-smth python3 simple_code.py) and he >> obtains a >> > >>>>>> parallel execution of the same code. >> > >>>>>> >> > >>>>>> Thanks for your help! >> > >>>>>> Ivan >> > >>>>>> >> > >>>>>> >> > >>>>>> On Thu, Nov 15, 2018 at 11:54 AM Matthew Knepley < >> knepley at gmail.com> >> > >>>>>> wrote: >> > >>>>>> >> > >>>>>>> On Thu, Nov 15, 2018 at 4:53 AM Ivan Voznyuk via petsc-users < >> > >>>>>>> petsc-users at mcs.anl.gov> wrote: >> > >>>>>>> >> > >>>>>>>> Dear PETSC community, >> > >>>>>>>> >> > >>>>>>>> I have a question regarding the parallel execution of petsc4py. 
>> > >>>>>>>> >> > >>>>>>>> I have a simple code (here attached simple_code.py) which >> solves a >> > >>>>>>>> system of linear equations Ax=b using petsc4py. To execute it, >> I use the >> > >>>>>>>> command python3 simple_code.py which yields a sequential >> performance. With >> > >>>>>>>> a colleague of my, we launched this code on his computer, and >> this time the >> > >>>>>>>> execution was in parallel. Although, he used the same command >> python3 >> > >>>>>>>> simple_code.py (without mpirun, neither mpiexec). >> > >>>>>>>> >> > >>>>>>> I am not sure what you mean. To run MPI programs in parallel, >> you >> > >>>>>>> need a launcher like mpiexec or mpirun. There are Python >> programs (like >> > >>>>>>> nemesis) that use the launcher API directly (called PMI), but >> that is not >> > >>>>>>> part of petsc4py. >> > >>>>>>> >> > >>>>>>> Thanks, >> > >>>>>>> >> > >>>>>>> Matt >> > >>>>>>> >> > >>>>>>>> My configuration: Ubuntu x86_64 Ubuntu 16.04, Intel Core i7, >> PETSc >> > >>>>>>>> 3.10.2, PETSC_ARCH=arch-linux2-c-debug, petsc4py 3.10.0 in >> virtualenv >> > >>>>>>>> >> > >>>>>>>> In order to parallelize it, I have already tried: >> > >>>>>>>> - use 2 different PCs >> > >>>>>>>> - use Ubuntu 16.04, 18.04 >> > >>>>>>>> - use different architectures (arch-linux2-c-debug, >> > >>>>>>>> linux-gnu-c-debug, etc) >> > >>>>>>>> - ofc use different configurations (my present config can be >> found >> > >>>>>>>> in make.log that I attached here) >> > >>>>>>>> - mpi from mpich, openmpi >> > >>>>>>>> >> > >>>>>>>> Nothing worked. >> > >>>>>>>> >> > >>>>>>>> Do you have any ideas? >> > >>>>>>>> >> > >>>>>>>> Thanks and have a good day, >> > >>>>>>>> Ivan >> > >>>>>>>> >> > >>>>>>>> -- >> > >>>>>>>> Ivan VOZNYUK >> > >>>>>>>> PhD in Computational Electromagnetics >> > >>>>>>>> >> > >>>>>>> >> > >>>>>>> >> > >>>>>>> -- >> > >>>>>>> What most experimenters take for granted before they begin their >> > >>>>>>> experiments is infinitely more interesting than any results to >> which their >> > >>>>>>> experiments lead. >> > >>>>>>> -- Norbert Wiener >> > >>>>>>> >> > >>>>>>> https://www.cse.buffalo.edu/~knepley/ >> > >>>>>>> >> > >>>>>>> >> > >>>>>> >> > >>>>>> >> > >>>>>> -- >> > >>>>>> Ivan VOZNYUK >> > >>>>>> PhD in Computational Electromagnetics >> > >>>>>> +33 (0)6.95.87.04.55 >> > >>>>>> My webpage >> > >>>>>> My LinkedIn >> > >>>>>> >> > >>>>> >> > >>>>> >> > >>>>> -- >> > >>>>> What most experimenters take for granted before they begin their >> > >>>>> experiments is infinitely more interesting than any results to >> which their >> > >>>>> experiments lead. >> > >>>>> -- Norbert Wiener >> > >>>>> >> > >>>>> https://www.cse.buffalo.edu/~knepley/ >> > >>>>> >> > >>>>> >> > >>>> >> > >>>> >> > >>>> -- >> > >>>> Ivan VOZNYUK >> > >>>> PhD in Computational Electromagnetics >> > >>>> +33 (0)6.95.87.04.55 >> > >>>> My webpage >> > >>>> My LinkedIn >> > >>>> >> > >>> >> > >>> >> > >>> -- >> > >>> What most experimenters take for granted before they begin their >> > >>> experiments is infinitely more interesting than any results to >> which their >> > >>> experiments lead. >> > >>> -- Norbert Wiener >> > >>> >> > >>> https://www.cse.buffalo.edu/~knepley/ >> > >>> >> > >>> >> > >> >> > >> > >> >> > > -- > Ivan VOZNYUK > PhD in Computational Electromagnetics > +33 (0)6.95.87.04.55 > My webpage > My LinkedIn > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From huq2090 at gmail.com Sat Nov 17 13:12:20 2018 From: huq2090 at gmail.com (Fazlul Huq) Date: Sat, 17 Nov 2018 13:12:20 -0600 Subject: [petsc-users] Expecting Explanation Message-ID: Hello PETSc developers,

In example 6 of the vec section I don't understand this line:
if (size != 1) SETERRQ(PETSC_COMM_SELF,1,"This is a uniprocessor example only!");

I tried to run the code with "mpiexec -n 2 ./ex6" or "mpiexec -n 4 ./ex6" or even "mpiexec -n 1 ./ex6" and the code runs properly.

Thanks.

Sincerely,
Huq

--

Fazlul Huq Graduate Research Assistant Department of Nuclear, Plasma & Radiological Engineering (NPRE) University of Illinois at Urbana-Champaign (UIUC) E-mail: huq2090 at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Sat Nov 17 13:22:16 2018 From: bsmith at mcs.anl.gov (Smith, Barry F.) Date: Sat, 17 Nov 2018 19:22:16 +0000 Subject: [petsc-users] Expecting Explanation In-Reply-To: References: Message-ID:

Note the code in the example

/* Open viewer for binary output */
ierr = PetscViewerBinaryOpen(PETSC_COMM_SELF,"input.dat",FILE_MODE_WRITE,&view_out);CHKERRQ(ierr);
ierr = PetscViewerBinaryGetDescriptor(view_out,&fd);CHKERRQ(ierr);

/* Write binary output */
ierr = PetscBinaryWrite(fd,&m,1,PETSC_INT,PETSC_FALSE);CHKERRQ(ierr);
ierr = PetscBinaryWrite(fd,array,m,PETSC_SCALAR,PETSC_FALSE);CHKERRQ(ierr);

this means each process is opening the same file for writing (because the argument to PetscViewerBinaryOpen() is PETSC_COMM_SELF) and then all of the processes are writing to this same file (completely uncoordinated). In general this won't work, it will just make a mess of things. This is why this example has the line

> if (size != 1) SETERRQ(PETSC_COMM_SELF,1,"This is a uniprocessor example only!");

Barry

> On Nov 17, 2018, at 1:12 PM, Fazlul Huq via petsc-users wrote:
>
> Hello PETSc developers,
>
> In example 6 of the vec section I don't understand this line:
> if (size != 1) SETERRQ(PETSC_COMM_SELF,1,"This is a uniprocessor example only!");
>
> I tried to run the code with "mpiexec -n 2 ./ex6" or "mpiexec -n 4 ./ex6" or even "mpiexec -n 1 ./ex6" and the code runs properly.
>
> Thanks.
>
> Sincerely,
> Huq
>
> --
>
> Fazlul Huq
> Graduate Research Assistant
> Department of Nuclear, Plasma & Radiological Engineering (NPRE)
> University of Illinois at Urbana-Champaign (UIUC)
> E-mail: huq2090 at gmail.com

From huq2090 at gmail.com Sat Nov 17 15:51:10 2018 From: huq2090 at gmail.com (Fazlul Huq) Date: Sat, 17 Nov 2018 15:51:10 -0600 Subject: [petsc-users] Expecting Explanation In-Reply-To: References: Message-ID:

Thanks for the answer.

So, when I run the code with "mpiexec -n 2 ./ex6" or "mpiexec -n 4 ./ex6" shouldn't it spit out the error "This is a uniprocessor example only!"? But it didn't do that; rather, it ran the code 2/4 times.

Thanks.
Sincerely,
Huq

On Sat, Nov 17, 2018 at 1:12 PM Fazlul Huq wrote:
> Hello PETSc developers,
>
> In example 6 of the vec section I don't understand this line:
> if (size != 1) SETERRQ(PETSC_COMM_SELF,1,"This is a uniprocessor example only!");
>
> I tried to run the code with "mpiexec -n 2 ./ex6" or "mpiexec -n 4 ./ex6"
> or even "mpiexec -n 1 ./ex6" and the code runs properly.
>
> Thanks.
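For contrast with the uniprocessor pattern Barry dissects above, here is a hedged petsc4py sketch of the collective alternative: one binary file written through a viewer that lives on PETSC_COMM_WORLD, so the same code is safe on any number of ranks. It writes the standard PETSc Vec binary format, which is not byte-compatible with ex6's hand-rolled int-plus-array layout; the sizes and values are illustrative:

    import sys
    import petsc4py
    petsc4py.init(sys.argv)
    from petsc4py import PETSc

    v = PETSc.Vec().createMPI(100, comm=PETSc.COMM_WORLD)
    v.set(1.0)
    # One file, written collectively; contrast with every rank opening
    # "input.dat" on PETSC_COMM_SELF and scribbling over the others.
    viewer = PETSc.Viewer().createBinary("input.dat", mode="w",
                                         comm=PETSc.COMM_WORLD)
    v.view(viewer)
    viewer.destroy()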
> > Sincerely, > Huq > > -- > > Fazlul Huq > Graduate Research Assistant > Department of Nuclear, Plasma & Radiological Engineering (NPRE) > University of Illinois at Urbana-Champaign (UIUC) > E-mail: huq2090 at gmail.com > -- Fazlul Huq Graduate Research Assistant Department of Nuclear, Plasma & Radiological Engineering (NPRE) University of Illinois at Urbana-Champaign (UIUC) E-mail: huq2090 at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From s_g at berkeley.edu Sat Nov 17 16:01:55 2018 From: s_g at berkeley.edu (Sanjay Govindjee) Date: Sat, 17 Nov 2018 14:01:55 -0800 Subject: [petsc-users] Expecting Explanation In-Reply-To: References: Message-ID: <8831e520-42e5-67bf-1b54-de32002e12d0@berkeley.edu> Throws the proper error on my machine: $ ~/petsc-3.10.1/intel/bin/mpirun -np 2 ex6 [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [0]PETSC ERROR: This is a uniprocessor example only! On 11/17/18 1:51 PM, Fazlul Huq via petsc-users wrote: > Thanks for the answer. > > So, when I run the code with "mpiexec -n 2 ./ex6" or "mpiexec -n 4 > ./ex6" shouldn't it spit out > the error "This is a uniprocessor example only!"? > But it didn't do that rather run the code 2/4 times. > > Thanks. > Sincerely, > Huq > > On Sat, Nov 17, 2018 at 1:12 PM Fazlul Huq > wrote: > > Hello PETSc developers, > > In example 6 or vec section I don't understand this line: > if (size != 1) SETERRQ(PETSC_COMM_SELF,1,"This is a uniprocessor > example only!"); > > I tried to run the code with "mpiexec -n 2 ./ex6" or "mpiexec -n 4 > ./ex6" or even "mpiexec -n 1 ./ex6" it the code runs properly. > > Thanks. > > Sincerely, > Huq > > -- > > Fazlul Huq > Graduate Research Assistant > Department of Nuclear, Plasma & Radiological Engineering (NPRE) > University of Illinois at Urbana-Champaign (UIUC) > E-mail: huq2090 at gmail.com > > > > -- > > Fazlul Huq > Graduate Research Assistant > Department of Nuclear, Plasma & Radiological Engineering (NPRE) > University of Illinois at Urbana-Champaign (UIUC) > E-mail: huq2090 at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From huq2090 at gmail.com Sat Nov 17 16:15:55 2018 From: huq2090 at gmail.com (Fazlul Huq) Date: Sat, 17 Nov 2018 16:15:55 -0600 Subject: [petsc-users] Expecting Explanation In-Reply-To: <8831e520-42e5-67bf-1b54-de32002e12d0@berkeley.edu> References: <8831e520-42e5-67bf-1b54-de32002e12d0@berkeley.edu> Message-ID: Thanks. For some reason it's not showing on my machine. I am using petsc-3.10.2 Sincerely, Huq On Sat, Nov 17, 2018 at 4:03 PM Sanjay Govindjee via petsc-users < petsc-users at mcs.anl.gov> wrote: > Throws the proper error on my machine: > > $ ~/petsc-3.10.1/intel/bin/mpirun -np 2 ex6 > [0]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > [0]PETSC ERROR: This is a uniprocessor example only! > > On 11/17/18 1:51 PM, Fazlul Huq via petsc-users wrote: > > Thanks for the answer. > > So, when I run the code with "mpiexec -n 2 ./ex6" or "mpiexec -n 4 ./ex6" > shouldn't it spit out > the error "This is a uniprocessor example only!"? > But it didn't do that rather run the code 2/4 times. > > Thanks. 
> Sincerely,
> Huq
>
> On Sat, Nov 17, 2018 at 1:12 PM Fazlul Huq wrote:
>
>> Hello PETSc developers,
>>
>> In example 6 of the vec section I don't understand this line:
>> if (size != 1) SETERRQ(PETSC_COMM_SELF,1,"This is a uniprocessor example
>> only!");
>>
>> I tried to run the code with "mpiexec -n 2 ./ex6" or "mpiexec -n 4 ./ex6"
>> or even "mpiexec -n 1 ./ex6" and the code runs properly.
>>
>> Thanks.
>>
>> Sincerely,
>> Huq
>>
>> --
>>
>> Fazlul Huq
>> Graduate Research Assistant
>> Department of Nuclear, Plasma & Radiological Engineering (NPRE)
>> University of Illinois at Urbana-Champaign (UIUC)
>> E-mail: huq2090 at gmail.com
>>
>
>
> --
>
> Fazlul Huq
> Graduate Research Assistant
> Department of Nuclear, Plasma & Radiological Engineering (NPRE)
> University of Illinois at Urbana-Champaign (UIUC)
> E-mail: huq2090 at gmail.com
>

-- Fazlul Huq Graduate Research Assistant Department of Nuclear, Plasma & Radiological Engineering (NPRE) University of Illinois at Urbana-Champaign (UIUC) E-mail: huq2090 at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From huq2090 at gmail.com Sat Nov 17 16:18:52 2018 From: huq2090 at gmail.com (Fazlul Huq) Date: Sat, 17 Nov 2018 16:18:52 -0600 Subject: [petsc-users] Expecting Explanation In-Reply-To: References: <8831e520-42e5-67bf-1b54-de32002e12d0@berkeley.edu> Message-ID:

Well, I tried on another machine and it shows the error message. The fact is, I now understand the example.

Thanks again.

Sincerely,
Huq

On Sat, Nov 17, 2018 at 4:15 PM Fazlul Huq wrote:
> Thanks.
>
> For some reason it's not showing on my machine.
> I am using petsc-3.10.2
>
> Sincerely,
> Huq
>
> On Sat, Nov 17, 2018 at 4:03 PM Sanjay Govindjee via petsc-users <
> petsc-users at mcs.anl.gov> wrote:
>
>> Throws the proper error on my machine:
>>
>> $ ~/petsc-3.10.1/intel/bin/mpirun -np 2 ex6
>> [0]PETSC ERROR: --------------------- Error Message
>> --------------------------------------------------------------
>> [0]PETSC ERROR: This is a uniprocessor example only!
>>
>> On 11/17/18 1:51 PM, Fazlul Huq via petsc-users wrote:
>>
>> Thanks for the answer.
>>
>> So, when I run the code with "mpiexec -n 2 ./ex6" or "mpiexec -n 4 ./ex6"
>> shouldn't it spit out
>> the error "This is a uniprocessor example only!"?
>> But it didn't do that; rather, it ran the code 2/4 times.
>>
>> Thanks.
>>> >>> Sincerely, >>> Huq >>> >>> -- >>> >>> Fazlul Huq >>> Graduate Research Assistant >>> Department of Nuclear, Plasma & Radiological Engineering (NPRE) >>> University of Illinois at Urbana-Champaign (UIUC) >>> E-mail: huq2090 at gmail.com >>> >> >> >> -- >> >> Fazlul Huq >> Graduate Research Assistant >> Department of Nuclear, Plasma & Radiological Engineering (NPRE) >> University of Illinois at Urbana-Champaign (UIUC) >> E-mail: huq2090 at gmail.com >> >> >> > > -- > > Fazlul Huq > Graduate Research Assistant > Department of Nuclear, Plasma & Radiological Engineering (NPRE) > University of Illinois at Urbana-Champaign (UIUC) > E-mail: huq2090 at gmail.com > -- Fazlul Huq Graduate Research Assistant Department of Nuclear, Plasma & Radiological Engineering (NPRE) University of Illinois at Urbana-Champaign (UIUC) E-mail: huq2090 at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Sat Nov 17 16:53:50 2018 From: bsmith at mcs.anl.gov (Smith, Barry F.) Date: Sat, 17 Nov 2018 22:53:50 +0000 Subject: [petsc-users] Expecting Explanation In-Reply-To: References: Message-ID: <08F44F1F-497E-4F61-BCFC-945A54751126@anl.gov> > On Nov 17, 2018, at 3:51 PM, Fazlul Huq via petsc-users wrote: > > Thanks for the answer. > > So, when I run the code with "mpiexec -n 2 ./ex6" or "mpiexec -n 4 ./ex6" shouldn't it spit out > the error "This is a uniprocessor example only!"? > But it didn't do that rather run the code 2/4 times. This can happen if you use the "wrong" mpiexec; that is a different mpiexec than the one associated with the mpi you built PETSc with. For example if some mpiexec is in your path and you used ./configure --download-mpich Barry > > Thanks. > Sincerely, > Huq > > On Sat, Nov 17, 2018 at 1:12 PM Fazlul Huq wrote: > Hello PETSc developers, > > In example 6 or vec section I don't understand this line: > if (size != 1) SETERRQ(PETSC_COMM_SELF,1,"This is a uniprocessor example only!"); > > I tried to run the code with "mpiexec -n 2 ./ex6" or "mpiexec -n 4 ./ex6" or even "mpiexec -n 1 ./ex6" it the code runs properly. > > Thanks. > > Sincerely, > Huq > > -- > > Fazlul Huq > Graduate Research Assistant > Department of Nuclear, Plasma & Radiological Engineering (NPRE) > University of Illinois at Urbana-Champaign (UIUC) > E-mail: huq2090 at gmail.com > > > -- > > Fazlul Huq > Graduate Research Assistant > Department of Nuclear, Plasma & Radiological Engineering (NPRE) > University of Illinois at Urbana-Champaign (UIUC) > E-mail: huq2090 at gmail.com From huq2090 at gmail.com Sat Nov 17 17:22:55 2018 From: huq2090 at gmail.com (Fazlul Huq) Date: Sat, 17 Nov 2018 17:22:55 -0600 Subject: [petsc-users] Expecting Explanation In-Reply-To: <08F44F1F-497E-4F61-BCFC-945A54751126@anl.gov> References: <08F44F1F-497E-4F61-BCFC-945A54751126@anl.gov> Message-ID: I think when I have installed PETSc, I did --download-mpich. How can I correct it? Shall I install PETSc again? Thanks. Sincerely, Huq On Sat, Nov 17, 2018 at 4:53 PM Smith, Barry F. wrote: > > > > On Nov 17, 2018, at 3:51 PM, Fazlul Huq via petsc-users < > petsc-users at mcs.anl.gov> wrote: > > > > Thanks for the answer. > > > > So, when I run the code with "mpiexec -n 2 ./ex6" or "mpiexec -n 4 > ./ex6" shouldn't it spit out > > the error "This is a uniprocessor example only!"? > > But it didn't do that rather run the code 2/4 times. > > This can happen if you use the "wrong" mpiexec; that is a different > mpiexec than the one associated with the mpi you built PETSc with. 
For > example if some mpiexec is in your path and you used ./configure > --download-mpich > > Barry > > > > > > Thanks. > > Sincerely, > > Huq > > > > On Sat, Nov 17, 2018 at 1:12 PM Fazlul Huq wrote: > > Hello PETSc developers, > > > > In example 6 or vec section I don't understand this line: > > if (size != 1) SETERRQ(PETSC_COMM_SELF,1,"This is a uniprocessor example > only!"); > > > > I tried to run the code with "mpiexec -n 2 ./ex6" or "mpiexec -n 4 > ./ex6" or even "mpiexec -n 1 ./ex6" it the code runs properly. > > > > Thanks. > > > > Sincerely, > > Huq > > > > -- > > > > Fazlul Huq > > Graduate Research Assistant > > Department of Nuclear, Plasma & Radiological Engineering (NPRE) > > University of Illinois at Urbana-Champaign (UIUC) > > E-mail: huq2090 at gmail.com > > > > > > -- > > > > Fazlul Huq > > Graduate Research Assistant > > Department of Nuclear, Plasma & Radiological Engineering (NPRE) > > University of Illinois at Urbana-Champaign (UIUC) > > E-mail: huq2090 at gmail.com > > -- Fazlul Huq Graduate Research Assistant Department of Nuclear, Plasma & Radiological Engineering (NPRE) University of Illinois at Urbana-Champaign (UIUC) E-mail: huq2090 at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Sat Nov 17 17:26:45 2018 From: bsmith at mcs.anl.gov (Smith, Barry F.) Date: Sat, 17 Nov 2018 23:26:45 +0000 Subject: [petsc-users] Expecting Explanation In-Reply-To: References: <08F44F1F-497E-4F61-BCFC-945A54751126@anl.gov> Message-ID: Then use ${PETSC_DIR}/${PETSC_ARCH}/bin/mpiexec don't just use the random (wrong) mpiexec that is in your path. Barry > On Nov 17, 2018, at 5:22 PM, Fazlul Huq wrote: > > I think when I have installed PETSc, I did --download-mpich. > How can I correct it? Shall I install PETSc again? > > Thanks. > Sincerely, > Huq > > On Sat, Nov 17, 2018 at 4:53 PM Smith, Barry F. wrote: > > > > On Nov 17, 2018, at 3:51 PM, Fazlul Huq via petsc-users wrote: > > > > Thanks for the answer. > > > > So, when I run the code with "mpiexec -n 2 ./ex6" or "mpiexec -n 4 ./ex6" shouldn't it spit out > > the error "This is a uniprocessor example only!"? > > But it didn't do that rather run the code 2/4 times. > > This can happen if you use the "wrong" mpiexec; that is a different mpiexec than the one associated with the mpi you built PETSc with. For example if some mpiexec is in your path and you used ./configure --download-mpich > > Barry > > > > > > Thanks. > > Sincerely, > > Huq > > > > On Sat, Nov 17, 2018 at 1:12 PM Fazlul Huq wrote: > > Hello PETSc developers, > > > > In example 6 or vec section I don't understand this line: > > if (size != 1) SETERRQ(PETSC_COMM_SELF,1,"This is a uniprocessor example only!"); > > > > I tried to run the code with "mpiexec -n 2 ./ex6" or "mpiexec -n 4 ./ex6" or even "mpiexec -n 1 ./ex6" it the code runs properly. > > > > Thanks. 
> > > > Sincerely, > > Huq > > > > -- > > > > Fazlul Huq > > Graduate Research Assistant > > Department of Nuclear, Plasma & Radiological Engineering (NPRE) > > University of Illinois at Urbana-Champaign (UIUC) > > E-mail: huq2090 at gmail.com > > > > > > -- > > > > Fazlul Huq > > Graduate Research Assistant > > Department of Nuclear, Plasma & Radiological Engineering (NPRE) > > University of Illinois at Urbana-Champaign (UIUC) > > E-mail: huq2090 at gmail.com > > > > -- > > Fazlul Huq > Graduate Research Assistant > Department of Nuclear, Plasma & Radiological Engineering (NPRE) > University of Illinois at Urbana-Champaign (UIUC) > E-mail: huq2090 at gmail.com From huq2090 at gmail.com Sat Nov 17 21:34:28 2018 From: huq2090 at gmail.com (Fazlul Huq) Date: Sat, 17 Nov 2018 21:34:28 -0600 Subject: [petsc-users] Expecting Explanation In-Reply-To: References: <08F44F1F-497E-4F61-BCFC-945A54751126@anl.gov> Message-ID: I tried to reconfigure PETSc and I got the following error message: .................................................................................................................. Unable to find mpi in default locations! Perhaps you can specify with --with-mpi-dir= If you do not want MPI, then give --with-mpi=0 You might also consider using --download-mpich instead ................................................................................................................. I also got this error message last time when I have installed first and I have used --download-mpich command. The log file is attached herewith along with terminal screenshot. I am not sure what to do now. Thanks. Sincerely, Huq On Sat, Nov 17, 2018 at 5:26 PM Smith, Barry F. wrote: > > Then use > > ${PETSC_DIR}/${PETSC_ARCH}/bin/mpiexec > > don't just use the random (wrong) mpiexec that is in your path. > > Barry > > > > On Nov 17, 2018, at 5:22 PM, Fazlul Huq wrote: > > > > I think when I have installed PETSc, I did --download-mpich. > > How can I correct it? Shall I install PETSc again? > > > > Thanks. > > Sincerely, > > Huq > > > > On Sat, Nov 17, 2018 at 4:53 PM Smith, Barry F. > wrote: > > > > > > > On Nov 17, 2018, at 3:51 PM, Fazlul Huq via petsc-users < > petsc-users at mcs.anl.gov> wrote: > > > > > > Thanks for the answer. > > > > > > So, when I run the code with "mpiexec -n 2 ./ex6" or "mpiexec -n 4 > ./ex6" shouldn't it spit out > > > the error "This is a uniprocessor example only!"? > > > But it didn't do that rather run the code 2/4 times. > > > > This can happen if you use the "wrong" mpiexec; that is a different > mpiexec than the one associated with the mpi you built PETSc with. For > example if some mpiexec is in your path and you used ./configure > --download-mpich > > > > Barry > > > > > > > > > > Thanks. > > > Sincerely, > > > Huq > > > > > > On Sat, Nov 17, 2018 at 1:12 PM Fazlul Huq wrote: > > > Hello PETSc developers, > > > > > > In example 6 or vec section I don't understand this line: > > > if (size != 1) SETERRQ(PETSC_COMM_SELF,1,"This is a uniprocessor > example only!"); > > > > > > I tried to run the code with "mpiexec -n 2 ./ex6" or "mpiexec -n 4 > ./ex6" or even "mpiexec -n 1 ./ex6" it the code runs properly. > > > > > > Thanks. 
> > > > > > Sincerely, > > > Huq > > > > > > -- > > > > > > Fazlul Huq > > > Graduate Research Assistant > > > Department of Nuclear, Plasma & Radiological Engineering (NPRE) > > > University of Illinois at Urbana-Champaign (UIUC) > > > E-mail: huq2090 at gmail.com > > > > > > > > > -- > > > > > > Fazlul Huq > > > Graduate Research Assistant > > > Department of Nuclear, Plasma & Radiological Engineering (NPRE) > > > University of Illinois at Urbana-Champaign (UIUC) > > > E-mail: huq2090 at gmail.com > > > > > > > > -- > > > > Fazlul Huq > > Graduate Research Assistant > > Department of Nuclear, Plasma & Radiological Engineering (NPRE) > > University of Illinois at Urbana-Champaign (UIUC) > > E-mail: huq2090 at gmail.com > > -- Fazlul Huq Graduate Research Assistant Department of Nuclear, Plasma & Radiological Engineering (NPRE) University of Illinois at Urbana-Champaign (UIUC) E-mail: huq2090 at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: configure.log Type: text/x-log Size: 2298875 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: Screenshot from 2018-11-17 21-26-56.png Type: image/png Size: 50731 bytes Desc: not available URL: From knepley at gmail.com Sun Nov 18 06:49:15 2018 From: knepley at gmail.com (Matthew Knepley) Date: Sun, 18 Nov 2018 07:49:15 -0500 Subject: [petsc-users] Expecting Explanation In-Reply-To: References: <08F44F1F-497E-4F61-BCFC-945A54751126@anl.gov> Message-ID: On Sat, Nov 17, 2018 at 10:37 PM Fazlul Huq via petsc-users < petsc-users at mcs.anl.gov> wrote: > I tried to reconfigure PETSc and I got the following error message: > Do not reconfigure, just use the mpiexec in $PETSC_DIR/$PETSC_ARCH/bin as Barry suggested. Matt > > .................................................................................................................. > Unable to find mpi in default locations! > Perhaps you can specify with --with-mpi-dir= > If you do not want MPI, then give --with-mpi=0 > You might also consider using --download-mpich instead > > ................................................................................................................. > I also got this error message last time when I have installed first and I > have used --download-mpich command. > > The log file is attached herewith along with terminal screenshot. > I am not sure what to do now. > > Thanks. > Sincerely, > Huq > > > On Sat, Nov 17, 2018 at 5:26 PM Smith, Barry F. > wrote: > >> >> Then use >> >> ${PETSC_DIR}/${PETSC_ARCH}/bin/mpiexec >> >> don't just use the random (wrong) mpiexec that is in your path. >> >> Barry >> >> >> > On Nov 17, 2018, at 5:22 PM, Fazlul Huq wrote: >> > >> > I think when I have installed PETSc, I did --download-mpich. >> > How can I correct it? Shall I install PETSc again? >> > >> > Thanks. >> > Sincerely, >> > Huq >> > >> > On Sat, Nov 17, 2018 at 4:53 PM Smith, Barry F. >> wrote: >> > >> > >> > > On Nov 17, 2018, at 3:51 PM, Fazlul Huq via petsc-users < >> petsc-users at mcs.anl.gov> wrote: >> > > >> > > Thanks for the answer. >> > > >> > > So, when I run the code with "mpiexec -n 2 ./ex6" or "mpiexec -n 4 >> ./ex6" shouldn't it spit out >> > > the error "This is a uniprocessor example only!"? >> > > But it didn't do that rather run the code 2/4 times. 
>> > >> > This can happen if you use the "wrong" mpiexec; that is a different >> mpiexec than the one associated with the mpi you built PETSc with. For >> example if some mpiexec is in your path and you used ./configure >> --download-mpich >> > >> > Barry >> > >> > >> > > >> > > Thanks. >> > > Sincerely, >> > > Huq >> > > >> > > On Sat, Nov 17, 2018 at 1:12 PM Fazlul Huq wrote: >> > > Hello PETSc developers, >> > > >> > > In example 6 or vec section I don't understand this line: >> > > if (size != 1) SETERRQ(PETSC_COMM_SELF,1,"This is a uniprocessor >> example only!"); >> > > >> > > I tried to run the code with "mpiexec -n 2 ./ex6" or "mpiexec -n 4 >> ./ex6" or even "mpiexec -n 1 ./ex6" it the code runs properly. >> > > >> > > Thanks. >> > > >> > > Sincerely, >> > > Huq >> > > >> > > -- >> > > >> > > Fazlul Huq >> > > Graduate Research Assistant >> > > Department of Nuclear, Plasma & Radiological Engineering (NPRE) >> > > University of Illinois at Urbana-Champaign (UIUC) >> > > E-mail: huq2090 at gmail.com >> > > >> > > >> > > -- >> > > >> > > Fazlul Huq >> > > Graduate Research Assistant >> > > Department of Nuclear, Plasma & Radiological Engineering (NPRE) >> > > University of Illinois at Urbana-Champaign (UIUC) >> > > E-mail: huq2090 at gmail.com >> > >> > >> > >> > -- >> > >> > Fazlul Huq >> > Graduate Research Assistant >> > Department of Nuclear, Plasma & Radiological Engineering (NPRE) >> > University of Illinois at Urbana-Champaign (UIUC) >> > E-mail: huq2090 at gmail.com >> >> > > -- > > Fazlul Huq > Graduate Research Assistant > Department of Nuclear, Plasma & Radiological Engineering (NPRE) > University of Illinois at Urbana-Champaign (UIUC) > E-mail: huq2090 at gmail.com > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From huq2090 at gmail.com Sun Nov 18 09:31:15 2018 From: huq2090 at gmail.com (Fazlul Huq) Date: Sun, 18 Nov 2018 09:31:15 -0600 Subject: [petsc-users] Expecting Explanation In-Reply-To: References: <08F44F1F-497E-4F61-BCFC-945A54751126@anl.gov> Message-ID: So I have to remove mpiexec from usr/bin and keep it in home/petsc-3.10.2/lib/petsc/bin Is it? Thanks. Sincerely, Huq On Sun, Nov 18, 2018 at 6:49 AM Matthew Knepley wrote: > On Sat, Nov 17, 2018 at 10:37 PM Fazlul Huq via petsc-users < > petsc-users at mcs.anl.gov> wrote: > >> I tried to reconfigure PETSc and I got the following error message: >> > > Do not reconfigure, just use the mpiexec in $PETSC_DIR/$PETSC_ARCH/bin as > Barry suggested. > > Matt > > >> >> .................................................................................................................. >> Unable to find mpi in default locations! >> Perhaps you can specify with --with-mpi-dir= >> If you do not want MPI, then give --with-mpi=0 >> You might also consider using --download-mpich instead >> >> ................................................................................................................. >> I also got this error message last time when I have installed first and I >> have used --download-mpich command. >> >> The log file is attached herewith along with terminal screenshot. >> I am not sure what to do now. >> >> Thanks. >> Sincerely, >> Huq >> >> >> On Sat, Nov 17, 2018 at 5:26 PM Smith, Barry F. 
>> wrote: >> >>> >>> Then use >>> >>> ${PETSC_DIR}/${PETSC_ARCH}/bin/mpiexec >>> >>> don't just use the random (wrong) mpiexec that is in your path. >>> >>> Barry >>> >>> >>> > On Nov 17, 2018, at 5:22 PM, Fazlul Huq wrote: >>> > >>> > I think when I have installed PETSc, I did --download-mpich. >>> > How can I correct it? Shall I install PETSc again? >>> > >>> > Thanks. >>> > Sincerely, >>> > Huq >>> > >>> > On Sat, Nov 17, 2018 at 4:53 PM Smith, Barry F. >>> wrote: >>> > >>> > >>> > > On Nov 17, 2018, at 3:51 PM, Fazlul Huq via petsc-users < >>> petsc-users at mcs.anl.gov> wrote: >>> > > >>> > > Thanks for the answer. >>> > > >>> > > So, when I run the code with "mpiexec -n 2 ./ex6" or "mpiexec -n 4 >>> ./ex6" shouldn't it spit out >>> > > the error "This is a uniprocessor example only!"? >>> > > But it didn't do that rather run the code 2/4 times. >>> > >>> > This can happen if you use the "wrong" mpiexec; that is a different >>> mpiexec than the one associated with the mpi you built PETSc with. For >>> example if some mpiexec is in your path and you used ./configure >>> --download-mpich >>> > >>> > Barry >>> > >>> > >>> > > >>> > > Thanks. >>> > > Sincerely, >>> > > Huq >>> > > >>> > > On Sat, Nov 17, 2018 at 1:12 PM Fazlul Huq >>> wrote: >>> > > Hello PETSc developers, >>> > > >>> > > In example 6 or vec section I don't understand this line: >>> > > if (size != 1) SETERRQ(PETSC_COMM_SELF,1,"This is a uniprocessor >>> example only!"); >>> > > >>> > > I tried to run the code with "mpiexec -n 2 ./ex6" or "mpiexec -n 4 >>> ./ex6" or even "mpiexec -n 1 ./ex6" it the code runs properly. >>> > > >>> > > Thanks. >>> > > >>> > > Sincerely, >>> > > Huq >>> > > >>> > > -- >>> > > >>> > > Fazlul Huq >>> > > Graduate Research Assistant >>> > > Department of Nuclear, Plasma & Radiological Engineering (NPRE) >>> > > University of Illinois at Urbana-Champaign (UIUC) >>> > > E-mail: huq2090 at gmail.com >>> > > >>> > > >>> > > -- >>> > > >>> > > Fazlul Huq >>> > > Graduate Research Assistant >>> > > Department of Nuclear, Plasma & Radiological Engineering (NPRE) >>> > > University of Illinois at Urbana-Champaign (UIUC) >>> > > E-mail: huq2090 at gmail.com >>> > >>> > >>> > >>> > -- >>> > >>> > Fazlul Huq >>> > Graduate Research Assistant >>> > Department of Nuclear, Plasma & Radiological Engineering (NPRE) >>> > University of Illinois at Urbana-Champaign (UIUC) >>> > E-mail: huq2090 at gmail.com >>> >>> >> >> -- >> >> Fazlul Huq >> Graduate Research Assistant >> Department of Nuclear, Plasma & Radiological Engineering (NPRE) >> University of Illinois at Urbana-Champaign (UIUC) >> E-mail: huq2090 at gmail.com >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -- Fazlul Huq Graduate Research Assistant Department of Nuclear, Plasma & Radiological Engineering (NPRE) University of Illinois at Urbana-Champaign (UIUC) E-mail: huq2090 at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... 
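For reference, the guard being discussed looks like this in outline (a
minimal sketch in the spirit of the example, not the actual ex6.c source):

#include <petscsys.h>

int main(int argc,char **argv)
{
  PetscErrorCode ierr;
  PetscMPIInt    size;

  ierr = PetscInitialize(&argc,&argv,NULL,NULL);if (ierr) return ierr;
  /* With the matching mpiexec, all launched processes join a single
     PETSC_COMM_WORLD, so size > 1 and the error below fires as expected */
  ierr = MPI_Comm_size(PETSC_COMM_WORLD,&size);CHKERRQ(ierr);
  if (size != 1) SETERRQ(PETSC_COMM_SELF,1,"This is a uniprocessor example only!");
  /* ... the serial part of the example ... */
  ierr = PetscFinalize();
  return ierr;
}

Under a mismatched launcher, each process initializes its own size-1 MPI
world, MPI_Comm_size() returns 1 in every copy, the guard never fires, and
the example simply runs 2 or 4 independent times, which is exactly the
behavior reported above.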
URL: From knepley at gmail.com Sun Nov 18 09:57:22 2018 From: knepley at gmail.com (Matthew Knepley) Date: Sun, 18 Nov 2018 10:57:22 -0500 Subject: [petsc-users] Expecting Explanation In-Reply-To: References: <08F44F1F-497E-4F61-BCFC-945A54751126@anl.gov> Message-ID: On Sun, Nov 18, 2018 at 10:31 AM Fazlul Huq wrote: > So I have to remove mpiexec from usr/bin > and keep it in home/petsc-3.10.2/lib/petsc/bin > Is it? > No, just do not call /usr/bin/mpiexec. Instead use $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 4 ./myprogram when running a program. Thanks, Matt > Thanks. > Sincerely, > Huq > > On Sun, Nov 18, 2018 at 6:49 AM Matthew Knepley wrote: > >> On Sat, Nov 17, 2018 at 10:37 PM Fazlul Huq via petsc-users < >> petsc-users at mcs.anl.gov> wrote: >> >>> I tried to reconfigure PETSc and I got the following error message: >>> >> >> Do not reconfigure, just use the mpiexec in $PETSC_DIR/$PETSC_ARCH/bin as >> Barry suggested. >> >> Matt >> >> >>> >>> .................................................................................................................. >>> Unable to find mpi in default locations! >>> Perhaps you can specify with --with-mpi-dir= >>> If you do not want MPI, then give --with-mpi=0 >>> You might also consider using --download-mpich instead >>> >>> ................................................................................................................. >>> I also got this error message last time when I have installed first and >>> I have used --download-mpich command. >>> >>> The log file is attached herewith along with terminal screenshot. >>> I am not sure what to do now. >>> >>> Thanks. >>> Sincerely, >>> Huq >>> >>> >>> On Sat, Nov 17, 2018 at 5:26 PM Smith, Barry F. >>> wrote: >>> >>>> >>>> Then use >>>> >>>> ${PETSC_DIR}/${PETSC_ARCH}/bin/mpiexec >>>> >>>> don't just use the random (wrong) mpiexec that is in your path. >>>> >>>> Barry >>>> >>>> >>>> > On Nov 17, 2018, at 5:22 PM, Fazlul Huq wrote: >>>> > >>>> > I think when I have installed PETSc, I did --download-mpich. >>>> > How can I correct it? Shall I install PETSc again? >>>> > >>>> > Thanks. >>>> > Sincerely, >>>> > Huq >>>> > >>>> > On Sat, Nov 17, 2018 at 4:53 PM Smith, Barry F. >>>> wrote: >>>> > >>>> > >>>> > > On Nov 17, 2018, at 3:51 PM, Fazlul Huq via petsc-users < >>>> petsc-users at mcs.anl.gov> wrote: >>>> > > >>>> > > Thanks for the answer. >>>> > > >>>> > > So, when I run the code with "mpiexec -n 2 ./ex6" or "mpiexec -n 4 >>>> ./ex6" shouldn't it spit out >>>> > > the error "This is a uniprocessor example only!"? >>>> > > But it didn't do that rather run the code 2/4 times. >>>> > >>>> > This can happen if you use the "wrong" mpiexec; that is a >>>> different mpiexec than the one associated with the mpi you built PETSc >>>> with. For example if some mpiexec is in your path and you used ./configure >>>> --download-mpich >>>> > >>>> > Barry >>>> > >>>> > >>>> > > >>>> > > Thanks. >>>> > > Sincerely, >>>> > > Huq >>>> > > >>>> > > On Sat, Nov 17, 2018 at 1:12 PM Fazlul Huq >>>> wrote: >>>> > > Hello PETSc developers, >>>> > > >>>> > > In example 6 or vec section I don't understand this line: >>>> > > if (size != 1) SETERRQ(PETSC_COMM_SELF,1,"This is a uniprocessor >>>> example only!"); >>>> > > >>>> > > I tried to run the code with "mpiexec -n 2 ./ex6" or "mpiexec -n 4 >>>> ./ex6" or even "mpiexec -n 1 ./ex6" it the code runs properly. >>>> > > >>>> > > Thanks. 
>>>> > >
>>>> > > Sincerely,
>>>> > > Huq
>>>> > >
>>>> > > --
>>>> > >
>>>> > > Fazlul Huq
>>>> > > Graduate Research Assistant
>>>> > > Department of Nuclear, Plasma & Radiological Engineering (NPRE)
>>>> > > University of Illinois at Urbana-Champaign (UIUC)
>>>> > > E-mail: huq2090 at gmail.com
>>>> > >
>>>> > >
>>>> > > --
>>>> > >
>>>> > > Fazlul Huq
>>>> > > Graduate Research Assistant
>>>> > > Department of Nuclear, Plasma & Radiological Engineering (NPRE)
>>>> > > University of Illinois at Urbana-Champaign (UIUC)
>>>> > > E-mail: huq2090 at gmail.com
>>>> >
>>>> >
>>>> >
>>>> > --
>>>> >
>>>> > Fazlul Huq
>>>> > Graduate Research Assistant
>>>> > Department of Nuclear, Plasma & Radiological Engineering (NPRE)
>>>> > University of Illinois at Urbana-Champaign (UIUC)
>>>> > E-mail: huq2090 at gmail.com
>>>>
>>>>
>>>> --
>>>> What most experimenters take for granted before they begin their
>>>> experiments is infinitely more interesting than any results to which their
>>>> experiments lead.
>>>> -- Norbert Wiener
>>>>
>>>> https://www.cse.buffalo.edu/~knepley/
>>>>
>>
>
> --
>
> Fazlul Huq
> Graduate Research Assistant
> Department of Nuclear, Plasma & Radiological Engineering (NPRE)
> University of Illinois at Urbana-Champaign (UIUC)
> E-mail: huq2090 at gmail.com


--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
From huq2090 at gmail.com  Sun Nov 18 12:25:41 2018
From: huq2090 at gmail.com (Fazlul Huq)
Date: Sun, 18 Nov 2018 12:25:41 -0600
Subject: [petsc-users] Solving problem using multigrid
Message-ID:

Hello PETSc developers,

I have solved a problem using ksp. Please find the problem in attachment 1
and the PETSc code for the solution in attachment 2.
Now I need to solve the same problem using multigrid.
I have never solved a problem using multigrid. Can you please guide me
through the steps I should follow to solve this problem using multigrid?

Thanks.
Sincerely,
Huq
--

Fazlul Huq
Graduate Research Assistant
Department of Nuclear, Plasma & Radiological Engineering (NPRE)
University of Illinois at Urbana-Champaign (UIUC)
E-mail: huq2090 at gmail.com
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
-------------- next part --------------
A non-text attachment was scrubbed...
Name: IMG_20181118_121312.jpg
Type: image/jpeg
Size: 2737527 bytes
Desc: not available
URL:
-------------- next part --------------
A non-text attachment was scrubbed...
Name: poisson_m.c
Type: text/x-csrc
Size: 8569 bytes
Desc: not available
URL:
From mfadams at lbl.gov  Sun Nov 18 16:23:28 2018
From: mfadams at lbl.gov (Mark Adams)
Date: Sun, 18 Nov 2018 17:23:28 -0400
Subject: [petsc-users] Solving problem using multigrid
In-Reply-To:
References:
Message-ID:

The manual is pretty clear on this, but your code is purely algebraic,
that is, you do not use a DM, and so you will need to use algebraic
multigrid (AMG). So you want to look at AMG preconditioners (-pc_type gamg
[or hypre if you configured with --download-hypre]).
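For example, with the attached code a run line along these lines should be
a reasonable starting point (a sketch only: the executable name is assumed
from the attached poisson_m.c, and -ksp_type cg assumes the assembled
Poisson operator is symmetric positive definite):

$PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 4 ./poisson_m -ksp_type cg -pc_type gamg -ksp_monitor -ksp_converged_reason

Equivalently, in the code itself (assuming ksp is the KSP object the code
already creates):

PC pc;
ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
ierr = PCSetType(pc,PCGAMG);CHKERRQ(ierr);    /* smoothed aggregation AMG */
ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);  /* so -pc_gamg_* options can still tune it */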
Mark

On Sun, Nov 18, 2018 at 1:28 PM Fazlul Huq via petsc-users <
petsc-users at mcs.anl.gov> wrote:

> Hello PETSc developers,
>
> I have solved a problem using ksp. Please find the problem in attachment 1
> and the PETSc code for the solution in attachment 2.
> Now I need to solve the same problem using multigrid.
> I have never solved a problem using multigrid. Can you please guide me
> through the steps I should follow to solve this problem using multigrid?
>
> Thanks.
> Sincerely,
> Huq
> --
>
> Fazlul Huq
> Graduate Research Assistant
> Department of Nuclear, Plasma & Radiological Engineering (NPRE)
> University of Illinois at Urbana-Champaign (UIUC)
> E-mail: huq2090 at gmail.com
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
From amartin at cimne.upc.edu  Mon Nov 19 04:52:12 2018
From: amartin at cimne.upc.edu (=?UTF-8?B?IkFsYmVydG8gRi4gTWFydMOtbiI=?=)
Date: Mon, 19 Nov 2018 11:52:12 +0100
Subject: [petsc-users] PETSc (3.9.0) GAMG weak scaling test issue
In-Reply-To:
References: <5BE30C87.10204@cimne.upc.edu> <5BE420E4.8020800@cimne.upc.edu>
 <5BE42729.3090307@cimne.upc.edu>
Message-ID: <5BF295DC.30801@cimne.upc.edu>

Dear Mark, Dear Matthew,

in order to discard load imbalance as the cause of the reported weak
scaling issue in the GAMG preconditioner set-up stage (as you said, we
were feeding GAMG with a suboptimal mesh distribution, having empty
processors, among other issues), we simplified the weak scaling test by
considering the standard body-fitted trilinear (Q1) FE discretization of
the 3D Poisson problem on a unit cube discretized with a uniform,
structured hexahedral mesh, partitioned optimally (by hand) among
processors, with a fixed load of 30**3 hexahedra/core. Thus, all
processors have the same load (up to strong Dirichlet boundary conditions
on the subdomains touching the global boundary), and the edge-cut is
minimal. We used the following GAMG preconditioner options:

-pc_type gamg
-pc_gamg_type agg
-pc_gamg_est_ksp_type cg
-mg_levels_esteig_ksp_type cg
-mg_coarse_sub_pc_type cholesky
-mg_coarse_sub_pc_factor_mat_ordering_type nd
-pc_gamg_process_eq_limit 50
-pc_gamg_square_graph 10
-pc_gamg_agg_nsmooths 1

The results that we obtained for 48 (4x4x3 subdomains), 10,368 (24x24x18
subdomains), and 16,464 (28x28x21 subdomains) CPU cores are as follows:

preconditioner set up:    [0.9844961860, 7.017674042, 12.10154881]
PCG stage:                [0.5849160422, 1.515251888, 1.859617710]
number of PCG iterations: [9, 14, 15]

As you can observe, there is still a significant time increase when
scaling the problem from 48 to 10K/16K MPI tasks for the preconditioner
setup stage. This time increase is not as significant for the PCG stage.
Please find attached the combined output of -ksp_view and -log_view for
these three points of the weak scaling curve.

Given these results, I am starting to suspect that something within the
underlying software + hardware stack might be responsible for this. I am
using OpenMPI 1.10.7 + Intel compilers version 18.0. The underlying
supercomputer is MN-IV at BSC
(https://www.bsc.es/marenostrum/marenostrum/technical-information).
Have you ever conducted a weak scaling test of GAMG with OpenMPI on a
similar computer architecture? Can you share your experience with us?
(versions tested, outcome, etc.)

We also tried an alternative MPI library, Intel(R) MPI Library for Linux*
OS, Version 2018 Update 4 Build 20180823 (id: 18555), without success.
For this MPI library, the preconditioner set-up stage crashes (find
attached stack frames, and internal MPI library errors) for the largest
two core counts (it did not crash for the 48 CPU cores case), while it did
not crash with OpenMPI 1.10.7. Have you ever experienced errors like the
ones attached? Is there any way to set up PETSc such that the subroutine
that crashes is replaced by an alternative implementation of the same
concept? (this would be just a workaround). It might be a BUG in the Intel
MPI library, although I cannot confirm it. We also got these errors with
the unfitted FEM+space-filling curves version of our code.

Thanks a lot for your help and valuable feedback!
Best regards,
 Alberto.

On 08/11/18 17:29, Mark Adams wrote:
>
>     I did not configured PETSc with ParMetis support. Should I?
>
>     I figured it out when I tried to use "-pc_gamg_repartition". PETSc
>     complained that it was not compiled with ParMetis support.
>
> You need ParMetis, or some parallel mesh partitioner, configured to
> use repartitioning. I would guess that "-pc_gamg_repartition" would
> not help and might hurt, because it just does the coarse grids, not
> the fine grid. But it is worth a try. Just configure with
> --download-parmetis
>
> The problem is that you are using space filling curves on the
> background grid and are getting empty processors. Right? The mesh
> setup phase is not super optimized, but your times
>
> And you said in your attachment that you added the near null space,
> but just the constant vector. I trust you mean the three translational
> rigid body modes. That is the default and so you should not see any
> difference. If you added one vector of all 1s then that would be bad.
> You also want the rotational rigid body modes. Now, you are converging
> pretty well and if your solution does not have much rotation in it the
> the rotational modes are not needed, but they are required for
> optimality in general.
>

--
Alberto F. Martín-Huertas
Senior Researcher, PhD. Computational Science
Centre Internacional de Mètodes Numèrics a l'Enginyeria (CIMNE)
Parc Mediterrani de la Tecnologia, UPC
Esteve Terradas 5, Building C3, Office 215,
08860 Castelldefels (Barcelona, Spain)
Tel.: (+34) 9341 34223
e-mail:amartin at cimne.upc.edu

FEMPAR project co-founder
web: http://www.fempar.org

________________
IMPORTANT NOTICE
All personal data contained on this mail will be processed confidentially and registered in a file property of CIMNE in order to manage corporate communications. You may exercise the rights of access, rectification, erasure and object by letter sent to Ed. C1 Campus Norte UPC. Gran Capitán s/n Barcelona.
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
-------------- next part --------------
KSP Object: 48 MPI processes
  type: cg
  maximum iterations=500, initial guess is zero
  tolerances:  relative=1e-08, absolute=1e-50, divergence=10000.
  left preconditioning
  using UNPRECONDITIONED norm type for convergence test
PC Object: 48 MPI processes
  type: gamg
    type is MULTIPLICATIVE, levels=4 cycles=v
      Cycles per PCApply=1
      Using externally compute Galerkin coarse grid matrices
      GAMG specific options
        Threshold for dropping small values in graph on each level = 0. 0.
        Threshold scaling factor for each level not specified = 1.
AGG specific options Symmetric graph false Number of levels to square graph 10 Number smoothing steps 1 Coarse grid solver -- level ------------------------------- KSP Object: (mg_coarse_) 48 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_) 48 MPI processes type: bjacobi number of blocks = 48 Local solve is same for all blocks, in the following KSP and PC objects: KSP Object: (mg_coarse_sub_) 1 MPI processes type: preonly maximum iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_sub_) 1 MPI processes type: cholesky out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: nd factor fill ratio given 5., needed 1. Factored matrix follows: Mat Object: 1 MPI processes type: seqsbaij rows=6, cols=6 package used to perform factorization: petsc total: nonzeros=21, allocated nonzeros=21 total number of mallocs used during MatSetValues calls =0 block size is 1 linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=6, cols=6 total: nonzeros=36, allocated nonzeros=36 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 2 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: 48 MPI processes type: mpiaij rows=6, cols=6 total: nonzeros=36, allocated nonzeros=36 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 2 nodes, limit used is 5 Down solver (pre-smoother) on level 1 ------------------------------- KSP Object: (mg_levels_1_) 48 MPI processes type: chebyshev eigenvalue estimates used: min = 0.135917, max = 1.49508 eigenvalues estimate via cg min 0.269407, max 1.35917 eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_1_esteig_) 48 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_1_) 48 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 48 MPI processes type: mpiaij rows=359, cols=359 total: nonzeros=17751, allocated nonzeros=17751 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 2 ------------------------------- KSP Object: (mg_levels_2_) 48 MPI processes type: chebyshev eigenvalue estimates used: min = 0.137555, max = 1.5131 eigenvalues estimate via cg min 0.0576414, max 1.37555 eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_2_esteig_) 48 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. 
left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_2_) 48 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 48 MPI processes type: mpiaij rows=27123, cols=27123 total: nonzeros=1144833, allocated nonzeros=1144833 total number of mallocs used during MatSetValues calls =0 using nonscalable MatPtAP() implementation not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 3 ------------------------------- KSP Object: (mg_levels_3_) 48 MPI processes type: chebyshev eigenvalue estimates used: min = 0.136454, max = 1.50099 eigenvalues estimate via cg min 0.0376067, max 1.36454 eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_3_esteig_) 48 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_3_) 48 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 48 MPI processes type: mpiaij rows=1260329, cols=1260329 total: nonzeros=33396625, allocated nonzeros=252065800 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) linear system matrix = precond matrix: Mat Object: 48 MPI processes type: mpiaij rows=1260329, cols=1260329 total: nonzeros=33396625, allocated nonzeros=252065800 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines ************************************************************************************************************************ *** WIDEN YOUR WINDOW TO 120 CHARACTERS. 
Use 'enscript -r -fCourier9' to print this document *** ************************************************************************************************************************ ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- /gpfs/scratch/upc26/upc26229/build_rel_fempar_cell_agg_ompi/FEMPAR/bin/par_test_poisson_unfitted on a arch-linux2-c-opt named s05r2b62 with 48 processors, by upc26229 Fri Nov 16 09:37:39 2018 Using Petsc Release Version 3.9.0, Apr, 07, 2018 Max Max/Min Avg Total Time (sec): 2.318e+01 1.00000 2.318e+01 Objects: 1.250e+03 1.00241 1.247e+03 Flop: 8.936e+08 1.13741 8.557e+08 4.107e+10 Flop/sec: 3.855e+07 1.13741 3.692e+07 1.772e+09 MPI Messages: 2.001e+04 3.63905 1.109e+04 5.322e+05 MPI Message Lengths: 7.575e+07 2.25574 4.717e+03 2.510e+09 MPI Reductions: 1.773e+03 1.00000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flop and VecAXPY() for complex vectors of length N --> 8N flop Summary of Stages: ----- Time ------ ----- Flop ----- --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total counts %Total Avg %Total counts %Total 0: Main Stage: 2.3173e+01 100.0% 4.1074e+10 100.0% 5.322e+05 100.0% 4.717e+03 100.0% 1.759e+03 99.2% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flop: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent Avg. len: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). 
%T - percent time in this phase %F - percent flop in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flop over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ Event Count Time (sec) Flop --- Global --- --- Stage --- Total Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage BuildTwoSided 9 1.0 8.8800e-03 4.2 0.00e+00 0.0 2.0e+03 8.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 BuildTwoSidedF 126 1.0 4.4196e-01 5.4 0.00e+00 0.0 1.6e+04 8.5e+04 0.0e+00 1 0 3 55 0 1 0 3 55 0 0 VecMDot 90 1.0 2.2851e-02 3.4 9.11e+06 1.1 0.0e+00 0.0e+00 9.0e+01 0 1 0 0 5 0 1 0 0 5 18597 VecTDot 243 1.0 9.9924e-02 9.2 6.39e+06 1.1 0.0e+00 0.0e+00 2.4e+02 0 1 0 0 14 0 1 0 0 14 2986 VecNorm 228 1.0 3.5723e-02 1.8 5.26e+06 1.1 0.0e+00 0.0e+00 2.3e+02 0 1 0 0 13 0 1 0 0 13 6875 VecScale 99 1.0 7.8892e-0313.7 9.11e+05 1.1 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 5387 VecCopy 114 1.0 2.5488e-03 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSet 483 1.0 1.8528e-03 1.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 243 1.0 6.3856e-03 1.2 6.39e+06 1.1 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 46727 VecAYPX 753 1.0 2.2839e-02 1.3 1.02e+07 1.1 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 20918 VecAXPBYCZ 324 1.0 1.2422e-02 1.1 1.49e+07 1.1 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 2 0 0 0 55982 VecMAXPY 99 1.0 9.3288e-03 2.0 1.08e+07 1.1 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 53838 VecAssemblyBegin 63 1.0 1.4764e-02 2.5 0.00e+00 0.0 1.5e+03 3.1e+03 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAssemblyEnd 63 1.0 3.3262e-04 3.7 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecPointwiseMult 99 1.0 3.4623e-03 1.5 9.11e+05 1.1 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 12275 VecScatterBegin 933 1.0 3.9603e-02 2.8 0.00e+00 0.0 3.9e+05 1.3e+03 0.0e+00 0 0 74 21 0 0 0 74 21 0 0 VecScatterEnd 933 1.0 1.1539e-01 4.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSetRandom 9 1.0 2.5657e-03 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecNormalize 99 1.0 2.0392e-02 2.0 2.73e+06 1.1 0.0e+00 0.0e+00 9.9e+01 0 0 0 0 6 0 0 0 0 6 6252 MatMult 693 1.0 7.2101e-01 1.2 3.68e+08 1.1 3.2e+05 1.4e+03 0.0e+00 3 41 60 17 0 3 41 60 17 0 23339 MatMultAdd 81 1.0 3.0372e-02 1.6 6.32e+06 1.2 2.1e+04 1.5e+02 0.0e+00 0 1 4 0 0 0 1 4 0 0 9453 MatMultTranspose 81 1.0 6.0862e-02 3.8 6.32e+06 1.2 2.1e+04 1.5e+02 0.0e+00 0 1 4 0 0 0 1 4 0 0 4717 MatSolve 27 0.0 7.1588e-05 0.0 1.78e+03 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 25 MatSOR 585 1.0 9.6473e-01 1.1 2.73e+08 1.1 0.0e+00 0.0e+00 0.0e+00 4 31 0 0 0 4 31 0 0 0 13218 MatCholFctrSym 3 1.0 8.9991e-03497.9 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCholFctrNum 3 1.0 6.3463e-031399.7 1.80e+01 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatConvert 9 1.0 3.5641e-02 1.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatScale 27 1.0 2.1565e-02 1.2 5.23e+06 1.1 4.1e+03 1.3e+03 0.0e+00 0 1 1 0 0 0 1 1 0 0 11095 MatResidual 81 1.0 8.1534e-02 1.4 4.08e+07 1.1 3.7e+04 1.3e+03 0.0e+00 0 5 7 2 0 0 5 7 2 0 22889 MatAssemblyBegin 252 1.0 5.7321e-01 1.5 0.00e+00 0.0 1.5e+04 9.3e+04 0.0e+00 2 0 3 55 0 2 0 3 
55 0 0 MatAssemblyEnd 252 1.0 5.3497e-01 1.1 0.00e+00 0.0 4.1e+04 5.6e+02 4.8e+02 2 0 8 1 27 2 0 8 1 27 0 MatGetRow 248445 1.1 2.9776e-02 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetRowIJ 3 0.0 1.2791e-02 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCreateSubMat 12 1.0 2.6623e-02 1.0 0.00e+00 0.0 2.5e+03 3.8e+02 1.9e+02 0 0 0 0 11 0 0 0 0 11 0 MatGetOrdering 3 0.0 1.5566e-02 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCoarsen 9 1.0 4.3990e-02 1.5 0.00e+00 0.0 2.9e+04 2.5e+03 3.6e+01 0 0 5 3 2 0 0 5 3 2 0 MatZeroEntries 18 1.0 5.3911e-02 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatView 21 1.4 6.7606e-02 2.5 0.00e+00 0.0 0.0e+00 0.0e+00 1.5e+01 0 0 0 0 1 0 0 0 0 1 0 MatAXPY 9 1.0 2.6150e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatMatMult 9 1.0 1.2417e-01 1.2 4.53e+06 1.1 2.3e+04 9.4e+02 1.1e+02 1 1 4 1 6 1 1 4 1 6 1670 MatMatMultSym 9 1.0 1.0047e-01 1.2 0.00e+00 0.0 1.9e+04 8.7e+02 1.1e+02 0 0 4 1 6 0 0 4 1 6 0 MatMatMultNum 9 1.0 2.3698e-02 1.0 4.53e+06 1.1 4.1e+03 1.3e+03 0.0e+00 0 1 1 0 0 0 1 1 0 0 8750 MatPtAP 9 1.0 2.8118e-01 1.0 2.87e+07 1.2 3.6e+04 2.7e+03 1.4e+02 1 3 7 4 8 1 3 7 4 8 4543 MatPtAPSymbolic 9 1.0 2.0049e-01 1.0 0.00e+00 0.0 2.2e+04 3.0e+03 6.3e+01 1 0 4 3 4 1 0 4 3 4 0 MatPtAPNumeric 9 1.0 8.0681e-02 1.0 2.87e+07 1.2 1.4e+04 2.2e+03 7.2e+01 0 3 3 1 4 0 3 3 1 4 15832 MatTrnMatMult 9 1.0 1.9582e+00 1.0 1.37e+08 1.2 2.6e+04 6.3e+04 1.4e+02 8 15 5 66 8 8 15 5 66 8 3150 MatTrnMatMultSym 9 1.0 9.0854e-01 1.0 0.00e+00 0.0 1.2e+04 3.5e+04 6.6e+01 4 0 2 17 4 4 0 2 17 4 0 MatTrnMatMultNum 9 1.0 1.0519e+00 1.0 1.37e+08 1.2 1.4e+04 8.7e+04 7.2e+01 5 15 3 49 4 5 15 3 49 4 5864 MatGetLocalMat 36 1.0 5.1365e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetBrAoCol 27 1.0 4.9700e-02 2.0 0.00e+00 0.0 2.8e+04 2.7e+03 0.0e+00 0 0 5 3 0 0 0 5 3 0 0 KSPGMRESOrthog 90 1.0 3.1736e-02 2.3 1.82e+07 1.1 0.0e+00 0.0e+00 9.0e+01 0 2 0 0 5 0 2 0 0 5 26782 KSPSetUp 36 1.0 1.1924e-02 1.9 0.00e+00 0.0 0.0e+00 0.0e+00 2.4e+01 0 0 0 0 1 0 0 0 0 1 0 KSPSolve 3 1.0 1.7241e+00 1.0 6.50e+08 1.1 3.2e+05 1.2e+03 3.7e+02 7 73 60 15 21 7 73 60 15 21 17425 PCGAMGGraph_AGG 9 1.0 3.1316e-01 1.0 4.53e+06 1.1 1.2e+04 8.7e+02 1.1e+02 1 1 2 0 6 1 1 2 0 6 662 PCGAMGCoarse_AGG 9 1.0 2.0449e+00 1.0 1.37e+08 1.2 7.6e+04 2.4e+04 2.1e+02 9 15 14 71 12 9 15 14 71 12 3017 PCGAMGProl_AGG 9 1.0 1.2541e-01 1.1 0.00e+00 0.0 1.5e+04 1.9e+03 1.4e+02 1 0 3 1 8 1 0 3 1 8 0 PCGAMGPOpt_AGG 9 1.0 3.4309e-01 1.0 7.33e+07 1.1 6.4e+04 1.2e+03 3.7e+02 1 8 12 3 21 1 8 12 3 21 9849 GAMG: createProl 9 1.0 2.8587e+00 1.0 2.15e+08 1.2 1.7e+05 1.1e+04 8.3e+02 12 24 31 76 47 12 24 31 76 47 3412 Graph 18 1.0 3.0684e-01 1.0 4.53e+06 1.1 1.2e+04 8.7e+02 1.1e+02 1 1 2 0 6 1 1 2 0 6 676 MIS/Agg 9 1.0 4.4147e-02 1.4 0.00e+00 0.0 2.9e+04 2.5e+03 3.6e+01 0 0 5 3 2 0 0 5 3 2 0 SA: col data 9 1.0 1.2357e-02 1.1 0.00e+00 0.0 8.2e+03 2.8e+03 3.6e+01 0 0 2 1 2 0 0 2 1 2 0 SA: frmProl0 9 1.0 1.0247e-01 1.0 0.00e+00 0.0 6.7e+03 6.5e+02 7.2e+01 0 0 1 0 4 0 0 1 0 4 0 SA: smooth 9 1.0 1.6341e-01 1.1 5.23e+06 1.1 2.3e+04 9.4e+02 1.3e+02 1 1 4 1 7 1 1 4 1 7 1464 GAMG: partLevel 9 1.0 3.2724e-01 1.0 2.87e+07 1.2 3.9e+04 2.5e+03 4.4e+02 1 3 7 4 25 1 3 7 4 25 3903 repartition 6 1.0 2.7110e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.6e+01 0 0 0 0 2 0 0 0 0 2 0 Invert-Sort 6 1.0 7.3382e-03 5.8 0.00e+00 0.0 0.0e+00 0.0e+00 2.4e+01 0 0 0 0 1 0 0 0 0 1 0 Move A 6 1.0 2.5487e-02 1.3 0.00e+00 0.0 1.1e+03 8.5e+02 1.0e+02 0 0 0 0 6 0 0 0 0 6 0 Move 
P 6 1.0 7.7925e-03 1.0 0.00e+00 0.0 1.4e+03 3.2e+01 1.0e+02 0 0 0 0 6 0 0 0 0 6 0 PCSetUp 6 1.0 3.2475e+00 1.0 2.44e+08 1.2 2.0e+05 9.7e+03 1.3e+03 14 27 38 79 75 14 27 38 79 75 3397 PCSetUpOnBlocks 27 1.0 1.6300e-02 1.6 1.80e+01 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 PCApply 27 1.0 1.6327e+00 1.0 6.03e+08 1.1 3.0e+05 1.1e+03 2.9e+02 7 68 57 14 16 7 68 57 14 16 17066 SFSetGraph 9 1.0 7.1678e-032989.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 SFSetUp 9 1.0 1.6463e-02 2.2 0.00e+00 0.0 6.1e+03 1.9e+03 0.0e+00 0 0 1 0 0 0 0 1 0 0 0 SFBcastBegin 54 1.0 3.5743e-03 3.4 0.00e+00 0.0 2.3e+04 2.6e+03 0.0e+00 0 0 4 2 0 0 0 4 2 0 0 SFBcastEnd 54 1.0 3.3646e-03 2.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 ------------------------------------------------------------------------------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory Descendants' Mem. Reports information only for process 0. --- Event Stage 0: Main Stage Vector 510 510 27602496 0. Matrix 348 348 622432056 0. Matrix Coarsen 9 9 6156 0. Index Set 213 213 558288 0. Vec Scatter 75 75 105384 0. Krylov Solver 36 36 314928 0. Preconditioner 27 27 29544 0. Viewer 5 4 3584 0. PetscRandom 18 18 12492 0. Star Forest Graph 9 9 8496 0. ======================================================================================================================== Average time to get PetscTime(): 4.56115e-08 Average time for MPI_Barrier(): 5.97197e-06 Average time for zero size MPI_Send(): 3.71554e-06 #PETSc Option Table entries: --prefix popcorn3d_full_l3_s1 -beta 7.0 -betaest .true. -check .false. -datadt data_distribution_fully_assembled -dm 3 -in_space .true. -ksp_converged_reason -ksp_max_it 500 -ksp_monitor -ksp_norm_type unpreconditioned -ksp_rtol 1.0e-8 -ksp_type cg -ksp_view -l 1 -levelset popcorn -levelsettol 0.0 -log_view -lsdom -0.1 -mg_coarse_sub_pc_factor_mat_ordering_type nd -mg_coarse_sub_pc_type cholesky -mg_levels_esteig_ksp_type cg -n 120 -no_signal_handler -nruns 3 -pc_gamg_agg_nsmooths 1 -pc_gamg_est_ksp_type cg -pc_gamg_process_eq_limit 50 -pc_gamg_square_graph 10 -pc_gamg_type agg -pc_type gamg -petscrc /gpfs/scratch/upc26/upc26229/NEW_STUFF/time_par_cell_agg_ompi.paper/petscrc-0 -tt 1 -uagg .false. -wsolution .false. 
#End of PETSc Option Table entries Compiled without FORTRAN kernels Compiled with full precision matrices (default) sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 8 Configure options: --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 -with-blaslapack-dir=/apps/INTEL/2017.4/mkl --with-debugging=0 --with-x=0 --with-shared-libraries=1 --with-mpi=1 --with-64-bit-indices --download-hypre=../v2.14.0.tar.gz ----------------------------------------- Libraries compiled on 2018-11-07 17:23:07 on login1 Machine characteristics: Linux-4.4.120-92.70-default-x86_64-with-SuSE-12-x86_64 Using PETSc directory: /gpfs/scratch/upc26/upc26229/petsc_cell_agg_openmpi/release/petsc-3.9.0 Using PETSc arch: arch-linux2-c-opt ----------------------------------------- Using C compiler: mpicc -fPIC -wd1572 -g -O3 Using Fortran compiler: mpif90 -fPIC -g -O3 ----------------------------------------- Using include paths: -I/gpfs/scratch/upc26/upc26229/petsc_cell_agg_openmpi/release/petsc-3.9.0/include -I/gpfs/scratch/upc26/upc26229/petsc_cell_agg_openmpi/release/petsc-3.9.0/arch-linux2-c-opt/include ----------------------------------------- Using C linker: mpicc Using Fortran linker: mpif90 Using libraries: -Wl,-rpath,/gpfs/scratch/upc26/upc26229/petsc_cell_agg_openmpi/release/petsc-3.9.0/arch-linux2-c-opt/lib -L/gpfs/scratch/upc26/upc26229/petsc_cell_agg_openmpi/release/petsc-3.9.0/arch-linux2-c-opt/lib -lpetsc -Wl,-rpath,/gpfs/scratch/upc26/upc26229/petsc_cell_agg_openmpi/release/petsc-3.9.0/arch-linux2-c-opt/lib -L/gpfs/scratch/upc26/upc26229/petsc_cell_agg_openmpi/release/petsc-3.9.0/arch-linux2-c-opt/lib -Wl,-rpath,/apps/INTEL/2017.4/mkl/lib/intel64 -L/apps/INTEL/2017.4/mkl/lib/intel64 -Wl,-rpath,/usr/mpi/intel/openmpi-1.10.4-hfi/lib64 -L/usr/mpi/intel/openmpi-1.10.4-hfi/lib64 -Wl,-rpath,/gpfs/apps/MN4/INTEL/2018.0.128/compilers_and_libraries_2018.0.128/linux/compiler/lib/intel64_lin -L/gpfs/apps/MN4/INTEL/2018.0.128/compilers_and_libraries_2018.0.128/linux/compiler/lib/intel64_lin -Wl,-rpath,/usr/lib64/gcc/x86_64-suse-linux/4.8 -L/usr/lib64/gcc/x86_64-suse-linux/4.8 -Wl,-rpath,/usr/x86_64-suse-linux/lib -L/usr/x86_64-suse-linux/lib -lHYPRE -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lstdc++ -ldl -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lifport -lifcoremt_pic -limf -lsvml -lm -lipgo -lirc -lpthread -lgcc_s -lirc_s -lstdc++ -ldl ----------------------------------------- Ending run at vie nov 16 09:37:40 CET 2018 -------------- next part -------------- Linear solve converged due to CONVERGED_RTOL iterations 14 KSP Object: 10368 MPI processes type: cg maximum iterations=500, initial guess is zero tolerances: relative=1e-08, absolute=1e-50, divergence=10000. left preconditioning using UNPRECONDITIONED norm type for convergence test PC Object: 10368 MPI processes type: gamg type is MULTIPLICATIVE, levels=5 cycles=v Cycles per PCApply=1 Using externally compute Galerkin coarse grid matrices GAMG specific options Threshold for dropping small values in graph on each level = 0. 0. 0. Threshold scaling factor for each level not specified = 1. AGG specific options Symmetric graph false Number of levels to square graph 10 Number smoothing steps 1 Coarse grid solver -- level ------------------------------- KSP Object: (mg_coarse_) 10368 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_) 10368 MPI processes type: bjacobi number of blocks = 10368 Local solve is same for all blocks, in the following KSP and PC objects: KSP Object: (mg_coarse_sub_) 1 MPI processes type: preonly maximum iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_sub_) 1 MPI processes type: cholesky out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: nd factor fill ratio given 5., needed 1. Factored matrix follows: Mat Object: 1 MPI processes type: seqsbaij rows=7, cols=7 package used to perform factorization: petsc total: nonzeros=28, allocated nonzeros=28 total number of mallocs used during MatSetValues calls =0 block size is 1 linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=7, cols=7 total: nonzeros=49, allocated nonzeros=49 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 2 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: 10368 MPI processes type: mpiaij rows=7, cols=7 total: nonzeros=49, allocated nonzeros=49 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 2 nodes, limit used is 5 Down solver (pre-smoother) on level 1 ------------------------------- KSP Object: (mg_levels_1_) 10368 MPI processes type: chebyshev eigenvalue estimates used: min = 0.132992, max = 1.46291 eigenvalues estimate via cg min 0.252003, max 1.32992 eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_1_esteig_) 10368 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_1_) 10368 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 10368 MPI processes type: mpiaij rows=598, cols=598 total: nonzeros=35260, allocated nonzeros=35260 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 2 ------------------------------- KSP Object: (mg_levels_2_) 10368 MPI processes type: chebyshev eigenvalue estimates used: min = 0.147257, max = 1.61982 eigenvalues estimate via cg min 0.0457702, max 1.47257 eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_2_esteig_) 10368 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_2_) 10368 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. 
linear system matrix = precond matrix: Mat Object: 10368 MPI processes type: mpiaij rows=72415, cols=72415 total: nonzeros=5807437, allocated nonzeros=5807437 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 3 ------------------------------- KSP Object: (mg_levels_3_) 10368 MPI processes type: chebyshev eigenvalue estimates used: min = 0.138335, max = 1.52169 eigenvalues estimate via cg min 0.0338709, max 1.38335 eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_3_esteig_) 10368 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_3_) 10368 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 10368 MPI processes type: mpiaij rows=5775893, cols=5775893 total: nonzeros=263455933, allocated nonzeros=263455933 total number of mallocs used during MatSetValues calls =0 using nonscalable MatPtAP() implementation not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 4 ------------------------------- KSP Object: (mg_levels_4_) 10368 MPI processes type: chebyshev eigenvalue estimates used: min = 0.136582, max = 1.5024 eigenvalues estimate via cg min 0.0328449, max 1.36582 eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_4_esteig_) 10368 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_4_) 10368 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 10368 MPI processes type: mpiaij rows=278641979, cols=278641979 total: nonzeros=7500100375, allocated nonzeros=55728395800 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) linear system matrix = precond matrix: Mat Object: 10368 MPI processes type: mpiaij rows=278641979, cols=278641979 total: nonzeros=7500100375, allocated nonzeros=55728395800 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines ************************************************************************************************************************ *** WIDEN YOUR WINDOW TO 120 CHARACTERS. 
Use 'enscript -r -fCourier9' to print this document *** ************************************************************************************************************************ ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- /gpfs/scratch/upc26/upc26229/build_rel_fempar_cell_agg_ompi/FEMPAR/bin/par_test_poisson_unfitted on a arch-linux2-c-opt named s08r2b08 with 10368 processors, by upc26229 Fri Nov 16 15:26:59 2018 Using Petsc Release Version 3.9.0, Apr, 07, 2018 Max Max/Min Avg Total Time (sec): 4.915e+01 1.00001 4.915e+01 Objects: 1.658e+03 1.00181 1.655e+03 Flop: 1.198e+09 1.14246 1.183e+09 1.226e+13 Flop/sec: 2.436e+07 1.14246 2.406e+07 2.494e+11 MPI Messages: 1.789e+05 25.18126 2.640e+04 2.738e+08 MPI Message Lengths: 7.153e+07 2.83776 2.085e+03 5.708e+11 MPI Reductions: 2.568e+03 1.00000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flop and VecAXPY() for complex vectors of length N --> 8N flop Summary of Stages: ----- Time ------ ----- Flop ----- --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total counts %Total Avg %Total counts %Total 0: Main Stage: 4.9150e+01 100.0% 1.2260e+13 100.0% 2.738e+08 100.0% 2.085e+03 100.0% 2.554e+03 99.5% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flop: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent Avg. len: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). 
%T - percent time in this phase %F - percent flop in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flop over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ Event Count Time (sec) Flop --- Global --- --- Stage --- Total Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage BuildTwoSided 12 1.0 8.2886e-02 3.2 0.00e+00 0.0 8.4e+05 8.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 BuildTwoSidedF 174 1.0 2.7580e+00 1.9 0.00e+00 0.0 3.7e+06 1.4e+04 0.0e+00 4 0 1 9 0 4 0 1 9 0 0 VecMDot 120 1.0 2.6712e-01 1.3 9.24e+06 1.1 0.0e+00 0.0e+00 1.2e+02 0 1 0 0 5 0 1 0 0 5 351442 VecTDot 336 1.0 9.2127e-01 1.3 8.06e+06 1.1 0.0e+00 0.0e+00 3.4e+02 2 1 0 0 13 2 1 0 0 13 89719 VecNorm 309 1.0 7.2406e-01 1.2 6.12e+06 1.1 0.0e+00 0.0e+00 3.1e+02 1 1 0 0 12 1 1 0 0 12 86499 VecScale 132 1.0 1.2661e-0242.6 9.24e+05 1.1 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 741513 VecCopy 210 1.0 9.9707e-03 3.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSet 843 1.0 6.0338e-03 4.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 336 1.0 8.0444e-0211.5 8.06e+06 1.1 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 1027513 VecAYPX 1491 1.0 3.7490e-02 2.3 1.54e+07 1.1 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 4176676 VecAXPBYCZ 672 1.0 2.7469e-02 1.8 2.35e+07 1.1 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 2 0 0 0 8699658 VecMAXPY 132 1.0 1.9974e-02 4.4 1.09e+07 1.1 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 5554863 VecAssemblyBegin 84 1.0 4.3977e-01 2.0 0.00e+00 0.0 5.0e+05 2.6e+03 0.0e+00 1 0 0 0 0 1 0 0 0 0 0 VecAssemblyEnd 84 1.0 4.2681e-04 4.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecPointwiseMult 132 1.0 4.6056e-03 2.7 9.24e+05 1.1 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 2038447 VecScatterBegin 1731 1.0 6.6298e-02 3.3 0.00e+00 0.0 2.1e+08 1.0e+03 0.0e+00 0 0 76 37 0 0 0 76 37 0 0 VecScatterEnd 1731 1.0 1.1476e+00 1.8 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 2 0 0 0 0 2 0 0 0 0 0 VecSetRandom 12 1.0 8.2220e-03 4.6 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecNormalize 132 1.0 3.1955e-01 1.4 2.77e+06 1.1 0.0e+00 0.0e+00 1.3e+02 1 0 0 0 5 1 0 0 0 5 88139 MatMult 1290 1.0 1.5137e+00 1.4 5.26e+08 1.2 1.7e+08 1.1e+03 0.0e+00 3 44 63 32 0 3 44 63 32 0 3552751 MatMultAdd 168 1.0 4.5490e-01 5.9 9.86e+06 1.2 1.2e+07 1.2e+02 0.0e+00 1 1 4 0 0 1 1 4 0 0 221941 MatMultTranspose 168 1.0 2.4255e-01 7.0 9.86e+06 1.2 1.2e+07 1.2e+02 0.0e+00 0 1 4 0 0 0 1 4 0 0 416246 MatSolve 42 0.0 9.4218e-05 0.0 3.82e+03 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 41 MatSOR 1140 1.0 1.4214e+00 1.2 4.11e+08 1.1 0.0e+00 0.0e+00 0.0e+00 3 34 0 0 0 3 34 0 0 0 2897344 MatCholFctrSym 3 1.0 2.4699e-021549.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCholFctrNum 3 1.0 2.2717e-026755.8 2.10e+01 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatConvert 12 1.0 4.4884e-02 1.7 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatScale 36 1.0 3.6266e-0119.9 5.26e+06 1.2 1.5e+06 1.0e+03 0.0e+00 0 0 1 0 0 0 0 1 0 0 148427 MatResidual 168 1.0 2.3626e-01 1.9 6.38e+07 1.2 2.2e+07 1.0e+03 0.0e+00 0 5 8 4 0 0 5 8 4 0 2762345 MatAssemblyBegin 321 1.0 2.4651e+00 2.1 0.00e+00 0.0 
3.2e+06 1.6e+04 0.0e+00 4 0 1 9 0 4 0 1 9 0 0 MatAssemblyEnd 321 1.0 5.1716e+00 1.0 0.00e+00 0.0 1.6e+07 4.4e+02 6.5e+02 10 0 6 1 25 10 0 6 1 25 0 MatGetRow 251901 1.1 5.1874e-02 2.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetRowIJ 3 0.0 5.2588e-03 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCreateSubMat 18 1.0 2.6091e+00 1.0 0.00e+00 0.0 8.9e+05 3.5e+02 2.9e+02 5 0 0 0 11 5 0 0 0 11 0 MatGetOrdering 3 0.0 1.3637e-02 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCoarsen 12 1.0 3.1793e-01 1.2 0.00e+00 0.0 2.4e+07 1.0e+03 2.0e+02 1 0 9 4 8 1 0 9 4 8 0 MatZeroEntries 18 1.0 1.3054e-03 6.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatView 24 1.3 2.5545e-01 4.5 0.00e+00 0.0 0.0e+00 0.0e+00 1.8e+01 0 0 0 0 1 0 0 0 0 1 0 MatAXPY 12 1.0 1.8095e-01 1.9 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatMatMult 12 1.0 2.0949e+00 1.1 4.56e+06 1.2 8.7e+06 7.4e+02 1.5e+02 4 0 3 1 6 4 0 3 1 6 22252 MatMatMultSym 12 1.0 1.8619e+00 1.0 0.00e+00 0.0 7.1e+06 6.8e+02 1.4e+02 4 0 3 1 6 4 0 3 1 6 0 MatMatMultNum 12 1.0 9.4086e-02 1.4 4.56e+06 1.2 1.5e+06 1.0e+03 0.0e+00 0 0 1 0 0 0 0 1 0 0 495466 MatPtAP 12 1.0 2.7205e+00 1.0 2.90e+07 1.2 1.3e+07 2.2e+03 1.8e+02 6 2 5 5 7 6 2 5 5 7 108153 MatPtAPSymbolic 12 1.0 1.6082e+00 1.1 0.00e+00 0.0 8.4e+06 2.5e+03 8.4e+01 3 0 3 4 3 3 0 3 4 3 0 MatPtAPNumeric 12 1.0 1.1666e+00 1.1 2.90e+07 1.2 5.1e+06 1.8e+03 9.6e+01 2 2 2 2 4 2 2 2 2 4 252212 MatTrnMatMult 12 1.0 5.1786e+00 1.0 1.32e+08 1.2 9.6e+06 2.7e+04 1.9e+02 11 11 4 45 7 11 11 4 45 7 255034 MatTrnMatMultSym 12 1.0 3.7081e+00 1.0 0.00e+00 0.0 7.6e+06 1.7e+04 1.4e+02 8 0 3 23 5 8 0 3 23 5 0 MatTrnMatMultNum 12 1.0 1.4973e+00 1.0 1.32e+08 1.2 2.0e+06 6.3e+04 4.8e+01 3 11 1 23 2 3 11 1 23 2 882036 MatGetLocalMat 54 1.0 7.5682e-02 1.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetBrAoCol 36 1.0 7.0139e-02 4.1 0.00e+00 0.0 1.1e+07 2.2e+03 0.0e+00 0 0 4 4 0 0 0 4 4 0 0 KSPGMRESOrthog 120 1.0 2.7368e-01 1.3 1.85e+07 1.1 0.0e+00 0.0e+00 1.2e+02 0 2 0 0 5 0 2 0 0 5 686067 KSPSetUp 45 1.0 4.9518e-01 1.4 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+01 1 0 0 0 1 1 0 0 0 1 0 KSPSolve 3 1.0 4.3820e+00 1.0 9.63e+08 1.1 1.8e+08 9.4e+02 5.1e+02 9 80 66 30 20 9 80 66 30 20 2246294 PCGAMGGraph_AGG 12 1.0 2.1963e+00 1.0 4.56e+06 1.2 4.6e+06 6.8e+02 1.4e+02 4 0 2 1 6 4 0 2 1 6 21225 PCGAMGCoarse_AGG 12 1.0 6.2537e+00 1.0 1.32e+08 1.2 4.2e+07 7.1e+03 4.4e+02 13 11 15 53 17 13 11 15 53 17 211189 PCGAMGProl_AGG 12 1.0 2.3437e+00 1.0 0.00e+00 0.0 5.6e+06 1.5e+03 1.9e+02 5 0 2 1 7 5 0 2 1 8 0 PCGAMGPOpt_AGG 12 1.0 3.9482e+00 1.0 7.37e+07 1.1 2.4e+07 9.2e+02 5.0e+02 8 6 9 4 19 8 6 9 4 19 191367 GAMG: createProl 12 1.0 1.4742e+01 1.0 2.10e+08 1.2 7.7e+07 4.4e+03 1.3e+03 30 17 28 58 49 30 17 28 58 50 144003 Graph 24 1.0 2.1900e+00 1.0 4.56e+06 1.2 4.6e+06 6.8e+02 1.4e+02 4 0 2 1 6 4 0 2 1 6 21286 MIS/Agg 12 1.0 3.1835e-01 1.2 0.00e+00 0.0 2.4e+07 1.0e+03 2.0e+02 1 0 9 4 8 1 0 9 4 8 0 SA: col data 12 1.0 6.9797e-01 1.1 0.00e+00 0.0 3.4e+06 2.1e+03 4.8e+01 1 0 1 1 2 1 0 1 1 2 0 SA: frmProl0 12 1.0 1.1243e+00 1.0 0.00e+00 0.0 2.2e+06 5.5e+02 9.6e+01 2 0 1 0 4 2 0 1 0 4 0 SA: smooth 12 1.0 2.4933e+00 1.1 5.26e+06 1.2 8.7e+06 7.4e+02 1.7e+02 5 0 3 1 7 5 0 3 1 7 21589 GAMG: partLevel 12 1.0 7.1589e+00 1.0 2.90e+07 1.2 1.4e+07 2.1e+03 6.4e+02 15 2 5 5 25 15 2 5 5 25 41101 repartition 9 1.0 6.6796e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 5.4e+01 1 0 0 0 2 1 0 0 0 2 0 Invert-Sort 9 1.0 5.8634e-01 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 3.6e+01 1 0 0 0 1 1 0 0 0 1 0 
Move A 9 1.0 1.3901e+00 1.0 0.00e+00 0.0 3.3e+05 9.2e+02 1.5e+02 3 0 0 0 6 3 0 0 0 6 0 Move P 9 1.0 1.3170e+00 1.0 0.00e+00 0.0 5.6e+05 2.6e+01 1.5e+02 3 0 0 0 6 3 0 0 0 6 0 PCSetUp 6 1.0 2.2624e+01 1.0 2.39e+08 1.2 9.1e+07 4.0e+03 2.0e+03 46 20 33 64 77 46 20 33 64 77 106836 PCSetUpOnBlocks 42 1.0 2.9893e-0232.3 2.10e+01 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 PCApply 42 1.0 3.4927e+00 1.1 8.91e+08 1.1 1.7e+08 8.9e+02 3.8e+02 7 74 62 26 15 7 74 62 26 15 2600998 SFSetGraph 12 1.0 1.0152e-023939.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 SFSetUp 12 1.0 1.0030e-01 2.5 0.00e+00 0.0 2.5e+06 1.4e+03 0.0e+00 0 0 1 1 0 0 0 1 1 0 0 SFBcastBegin 225 1.0 2.3892e-0224.3 0.00e+00 0.0 2.2e+07 9.9e+02 0.0e+00 0 0 8 4 0 0 0 8 4 0 0 SFBcastEnd 225 1.0 4.7889e-0213.9 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 ------------------------------------------------------------------------------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory Descendants' Mem. Reports information only for process 0. --- Event Stage 0: Main Stage Vector 675 675 28296816 0. Matrix 456 456 597169344 0. Matrix Coarsen 12 12 8208 0. Index Set 294 294 7637280 0. Vec Scatter 102 102 143424 0. Krylov Solver 45 45 415944 0. Preconditioner 33 33 35448 0. Viewer 5 4 3584 0. PetscRandom 24 24 16656 0. Star Forest Graph 12 12 11328 0. ======================================================================================================================== Average time to get PetscTime(): 4.17233e-08 Average time for MPI_Barrier(): 0.00131778 Average time for zero size MPI_Send(): 2.1515e-06 #PETSc Option Table entries: --prefix popcorn3d_full_l3_s6 -beta 7.0 -betaest .true. -check .false. -datadt data_distribution_fully_assembled -dm 3 -in_space .true. -ksp_converged_reason -ksp_max_it 500 -ksp_monitor -ksp_norm_type unpreconditioned -ksp_rtol 1.0e-8 -ksp_type cg -ksp_view -l 1 -levelset popcorn -levelsettol 0.0 -log_view -lsdom -0.1 -mg_coarse_sub_pc_factor_mat_ordering_type nd -mg_coarse_sub_pc_type cholesky -mg_levels_esteig_ksp_type cg -n 720 -no_signal_handler -nruns 3 -pc_gamg_agg_nsmooths 1 -pc_gamg_est_ksp_type cg -pc_gamg_process_eq_limit 50 -pc_gamg_square_graph 10 -pc_gamg_type agg -pc_type gamg -petscrc /gpfs/scratch/upc26/upc26229/NEW_STUFF/time_par_cell_agg_ompi.paper/petscrc-0 -tt 1 -uagg .false. -wsolution .false. 
#End of PETSc Option Table entries Compiled without FORTRAN kernels Compiled with full precision matrices (default) sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 8 Configure options: --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 -with-blaslapack-dir=/apps/INTEL/2017.4/mkl --with-debugging=0 --with-x=0 --with-shared-libraries=1 --with-mpi=1 --with-64-bit-indices --download-hypre=../v2.14.0.tar.gz ----------------------------------------- Libraries compiled on 2018-11-07 17:23:07 on login1 Machine characteristics: Linux-4.4.120-92.70-default-x86_64-with-SuSE-12-x86_64 Using PETSc directory: /gpfs/scratch/upc26/upc26229/petsc_cell_agg_openmpi/release/petsc-3.9.0 Using PETSc arch: arch-linux2-c-opt ----------------------------------------- Using C compiler: mpicc -fPIC -wd1572 -g -O3 Using Fortran compiler: mpif90 -fPIC -g -O3 ----------------------------------------- Using include paths: -I/gpfs/scratch/upc26/upc26229/petsc_cell_agg_openmpi/release/petsc-3.9.0/include -I/gpfs/scratch/upc26/upc26229/petsc_cell_agg_openmpi/release/petsc-3.9.0/arch-linux2-c-opt/include ----------------------------------------- Using C linker: mpicc Using Fortran linker: mpif90 Using libraries: -Wl,-rpath,/gpfs/scratch/upc26/upc26229/petsc_cell_agg_openmpi/release/petsc-3.9.0/arch-linux2-c-opt/lib -L/gpfs/scratch/upc26/upc26229/petsc_cell_agg_openmpi/release/petsc-3.9.0/arch-linux2-c-opt/lib -lpetsc -Wl,-rpath,/gpfs/scratch/upc26/upc26229/petsc_cell_agg_openmpi/release/petsc-3.9.0/arch-linux2-c-opt/lib -L/gpfs/scratch/upc26/upc26229/petsc_cell_agg_openmpi/release/petsc-3.9.0/arch-linux2-c-opt/lib -Wl,-rpath,/apps/INTEL/2017.4/mkl/lib/intel64 -L/apps/INTEL/2017.4/mkl/lib/intel64 -Wl,-rpath,/usr/mpi/intel/openmpi-1.10.4-hfi/lib64 -L/usr/mpi/intel/openmpi-1.10.4-hfi/lib64 -Wl,-rpath,/gpfs/apps/MN4/INTEL/2018.0.128/compilers_and_libraries_2018.0.128/linux/compiler/lib/intel64_lin -L/gpfs/apps/MN4/INTEL/2018.0.128/compilers_and_libraries_2018.0.128/linux/compiler/lib/intel64_lin -Wl,-rpath,/usr/lib64/gcc/x86_64-suse-linux/4.8 -L/usr/lib64/gcc/x86_64-suse-linux/4.8 -Wl,-rpath,/usr/x86_64-suse-linux/lib -L/usr/x86_64-suse-linux/lib -lHYPRE -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lstdc++ -ldl -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lifport -lifcoremt_pic -limf -lsvml -lm -lipgo -lirc -lpthread -lgcc_s -lirc_s -lstdc++ -ldl ----------------------------------------- Ending run at vie nov 16 15:26:59 CET 2018 -------------- next part -------------- Linear solve converged due to CONVERGED_RTOL iterations 15 KSP Object: 16464 MPI processes type: cg maximum iterations=500, initial guess is zero tolerances: relative=1e-08, absolute=1e-50, divergence=10000. left preconditioning using UNPRECONDITIONED norm type for convergence test PC Object: 16464 MPI processes type: gamg type is MULTIPLICATIVE, levels=5 cycles=v Cycles per PCApply=1 Using externally compute Galerkin coarse grid matrices GAMG specific options Threshold for dropping small values in graph on each level = 0. 0. 0. Threshold scaling factor for each level not specified = 1. AGG specific options Symmetric graph false Number of levels to square graph 10 Number smoothing steps 1 Coarse grid solver -- level ------------------------------- KSP Object: (mg_coarse_) 16464 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_) 16464 MPI processes type: bjacobi number of blocks = 16464 Local solve is same for all blocks, in the following KSP and PC objects: KSP Object: (mg_coarse_sub_) 1 MPI processes type: preonly maximum iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_sub_) 1 MPI processes type: cholesky out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: nd factor fill ratio given 5., needed 1. Factored matrix follows: Mat Object: 1 MPI processes type: seqsbaij rows=11, cols=11 package used to perform factorization: petsc total: nonzeros=66, allocated nonzeros=66 total number of mallocs used during MatSetValues calls =0 block size is 1 linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=11, cols=11 total: nonzeros=121, allocated nonzeros=121 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 3 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: 16464 MPI processes type: mpiaij rows=11, cols=11 total: nonzeros=121, allocated nonzeros=121 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 3 nodes, limit used is 5 Down solver (pre-smoother) on level 1 ------------------------------- KSP Object: (mg_levels_1_) 16464 MPI processes type: chebyshev eigenvalue estimates used: min = 0.135989, max = 1.49588 eigenvalues estimate via cg min 0.195543, max 1.35989 eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_1_esteig_) 16464 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_1_) 16464 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 16464 MPI processes type: mpiaij rows=893, cols=893 total: nonzeros=54833, allocated nonzeros=54833 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 2 ------------------------------- KSP Object: (mg_levels_2_) 16464 MPI processes type: chebyshev eigenvalue estimates used: min = 0.146943, max = 1.61637 eigenvalues estimate via cg min 0.037861, max 1.46943 eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_2_esteig_) 16464 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_2_) 16464 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. 
linear system matrix = precond matrix: Mat Object: 16464 MPI processes type: mpiaij rows=114979, cols=114979 total: nonzeros=9348035, allocated nonzeros=9348035 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 3 ------------------------------- KSP Object: (mg_levels_3_) 16464 MPI processes type: chebyshev eigenvalue estimates used: min = 0.138364, max = 1.52201 eigenvalues estimate via cg min 0.0350501, max 1.38364 eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_3_esteig_) 16464 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_3_) 16464 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 16464 MPI processes type: mpiaij rows=9167199, cols=9167199 total: nonzeros=419078313, allocated nonzeros=419078313 total number of mallocs used during MatSetValues calls =0 using nonscalable MatPtAP() implementation not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 4 ------------------------------- KSP Object: (mg_levels_4_) 16464 MPI processes type: chebyshev eigenvalue estimates used: min = 0.136589, max = 1.50248 eigenvalues estimate via cg min 0.0332852, max 1.36589 eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_4_esteig_) 16464 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_4_) 16464 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 16464 MPI processes type: mpiaij rows=442766309, cols=442766309 total: nonzeros=11923049125, allocated nonzeros=88553261800 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) linear system matrix = precond matrix: Mat Object: 16464 MPI processes type: mpiaij rows=442766309, cols=442766309 total: nonzeros=11923049125, allocated nonzeros=88553261800 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines ************************************************************************************************************************ *** WIDEN YOUR WINDOW TO 120 CHARACTERS. 
Use 'enscript -r -fCourier9' to print this document *** ************************************************************************************************************************ ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- /gpfs/scratch/upc26/upc26229/build_rel_fempar_cell_agg_ompi/FEMPAR/bin/par_test_poisson_unfitted on a arch-linux2-c-opt named s01r2b26 with 16464 processors, by upc26229 Fri Nov 16 18:41:14 2018 Using Petsc Release Version 3.9.0, Apr, 07, 2018 Max Max/Min Avg Total Time (sec): 6.644e+01 1.00006 6.644e+01 Objects: 1.652e+03 1.00182 1.649e+03 Flop: 1.273e+09 1.15475 1.244e+09 2.049e+13 Flop/sec: 1.916e+07 1.15475 1.873e+07 3.084e+11 MPI Messages: 2.616e+05 35.23821 2.868e+04 4.721e+08 MPI Message Lengths: 6.746e+07 2.62396 1.960e+03 9.251e+11 MPI Reductions: 2.616e+03 1.00000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flop and VecAXPY() for complex vectors of length N --> 8N flop Summary of Stages: ----- Time ------ ----- Flop ----- --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total counts %Total Avg %Total counts %Total 0: Main Stage: 6.6434e+01 100.0% 2.0489e+13 100.0% 4.721e+08 100.0% 1.960e+03 100.0% 2.602e+03 99.5% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flop: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent Avg. len: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). 
%T - percent time in this phase %F - percent flop in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flop over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ Event Count Time (sec) Flop --- Global --- --- Stage --- Total Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage BuildTwoSided 12 1.0 1.4701e-01 2.7 0.00e+00 0.0 1.4e+06 8.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 BuildTwoSidedF 177 1.0 3.0236e+00 1.6 0.00e+00 0.0 5.8e+06 1.3e+04 0.0e+00 4 0 1 8 0 4 0 1 8 0 0 VecMDot 120 1.0 3.9575e-01 1.6 9.23e+06 1.1 0.0e+00 0.0e+00 1.2e+02 0 1 0 0 5 0 1 0 0 5 376930 VecTDot 342 1.0 1.5654e+00 1.4 8.39e+06 1.1 0.0e+00 0.0e+00 3.4e+02 2 1 0 0 13 2 1 0 0 13 87294 VecNorm 312 1.0 9.9940e-01 1.3 6.29e+06 1.1 0.0e+00 0.0e+00 3.1e+02 1 0 0 0 12 1 0 0 0 12 102238 VecScale 132 1.0 1.4698e-0249.3 9.24e+05 1.1 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1014961 VecCopy 222 1.0 5.5664e-03 1.8 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSet 888 1.0 3.7285e-03 2.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 342 1.0 4.5476e-02 6.2 8.39e+06 1.1 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 3005001 VecAYPX 1590 1.0 4.0597e-02 2.2 1.64e+07 1.1 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 6528161 VecAXPBYCZ 720 1.0 2.9899e-02 1.8 2.52e+07 1.1 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 2 0 0 0 13607494 VecMAXPY 132 1.0 1.0936e-02 2.4 1.09e+07 1.1 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 16120658 VecAssemblyBegin 84 1.0 5.9557e-01 1.4 0.00e+00 0.0 8.1e+05 2.6e+03 0.0e+00 1 0 0 0 0 1 0 0 0 0 0 VecAssemblyEnd 84 1.0 4.3126e-04 4.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecPointwiseMult 132 1.0 8.7574e-03 5.9 9.24e+05 1.1 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1703431 VecScatterBegin 1830 1.0 8.9270e-02 4.4 0.00e+00 0.0 3.6e+08 9.9e+02 0.0e+00 0 0 76 38 0 0 0 76 38 0 0 VecScatterEnd 1830 1.0 1.3435e+00 1.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 2 0 0 0 0 2 0 0 0 0 0 VecSetRandom 12 1.0 4.2692e-03 1.9 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecNormalize 132 1.0 3.9408e-01 1.4 2.77e+06 1.1 0.0e+00 0.0e+00 1.3e+02 1 0 0 0 5 1 0 0 0 5 113564 MatMult 1365 1.0 1.8317e+00 1.5 5.58e+08 1.2 2.9e+08 1.1e+03 0.0e+00 2 44 62 33 0 2 44 62 33 0 4944033 MatMultAdd 180 1.0 7.1299e-0111.3 1.06e+07 1.2 2.1e+07 1.2e+02 0.0e+00 1 1 5 0 0 1 1 5 0 0 241236 MatMultTranspose 180 1.0 4.6874e-0112.9 1.06e+07 1.2 2.1e+07 1.2e+02 0.0e+00 0 1 5 0 0 0 1 5 0 0 366940 MatSolve 45 0.0 1.2516e-04 0.0 1.04e+04 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 83 MatSOR 1212 1.0 1.5348e+00 1.3 4.36e+08 1.1 0.0e+00 0.0e+00 0.0e+00 2 34 0 0 0 2 34 0 0 0 4532804 MatCholFctrSym 3 1.0 1.2814e-02711.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCholFctrNum 3 1.0 9.1130e-032313.2 3.30e+01 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatConvert 12 1.0 4.6102e-02 1.7 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatScale 36 1.0 6.0832e-02 3.4 5.28e+06 1.2 2.5e+06 1.0e+03 0.0e+00 0 0 1 0 0 0 0 1 0 0 1406766 MatResidual 180 1.0 2.6789e-01 1.9 6.87e+07 1.2 3.8e+07 1.0e+03 0.0e+00 0 5 8 4 0 0 5 8 4 0 4149588 MatAssemblyBegin 312 1.0 2.5479e+00 1.8 
0.00e+00 0.0 5.0e+06 1.4e+04 0.0e+00 3 0 1 8 0 3 0 1 8 0 0 MatAssemblyEnd 312 1.0 9.6727e+00 1.0 0.00e+00 0.0 2.5e+07 4.4e+02 6.5e+02 14 0 5 1 25 14 0 5 1 25 0 MatGetRow 251874 1.1 3.8604e-02 1.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetRowIJ 3 0.0 2.7359e-03 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCreateSubMat 18 1.0 5.7689e+00 1.0 0.00e+00 0.0 1.5e+06 3.4e+02 2.9e+02 9 0 0 0 11 9 0 0 0 11 0 MatGetOrdering 3 0.0 7.3662e-03 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCoarsen 12 1.0 5.6386e-01 1.2 0.00e+00 0.0 4.7e+07 8.7e+02 2.3e+02 1 0 10 4 9 1 0 10 4 9 0 MatZeroEntries 15 1.0 1.4088e-03 6.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatView 24 1.3 4.0379e-0110.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.8e+01 0 0 0 0 1 0 0 0 0 1 0 MatAXPY 12 1.0 1.9423e-01 1.6 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatMatMult 12 1.0 3.5538e+00 1.0 4.58e+06 1.2 1.4e+07 7.3e+02 1.5e+02 5 0 3 1 6 5 0 3 1 6 20853 MatMatMultSym 12 1.0 3.3264e+00 1.0 0.00e+00 0.0 1.2e+07 6.7e+02 1.4e+02 5 0 2 1 6 5 0 2 1 6 0 MatMatMultNum 12 1.0 9.7088e-02 1.1 4.58e+06 1.2 2.5e+06 1.0e+03 0.0e+00 0 0 1 0 0 0 0 1 0 0 763319 MatPtAP 12 1.0 4.0859e+00 1.0 2.90e+07 1.2 2.2e+07 2.2e+03 1.9e+02 6 2 5 5 7 6 2 5 5 7 114537 MatPtAPSymbolic 12 1.0 2.4298e+00 1.1 0.00e+00 0.0 1.4e+07 2.4e+03 8.4e+01 4 0 3 4 3 4 0 3 4 3 0 MatPtAPNumeric 12 1.0 1.7467e+00 1.1 2.90e+07 1.2 8.2e+06 1.8e+03 9.6e+01 3 2 2 2 4 3 2 2 2 4 267927 MatTrnMatMult 12 1.0 7.1406e+00 1.0 1.35e+08 1.2 1.6e+07 2.6e+04 1.9e+02 11 10 3 44 7 11 10 3 44 7 294270 MatTrnMatMultSym 12 1.0 5.6756e+00 1.0 0.00e+00 0.0 1.3e+07 1.6e+04 1.6e+02 9 0 3 23 6 9 0 3 23 6 0 MatTrnMatMultNum 12 1.0 1.5121e+00 1.0 1.35e+08 1.2 2.5e+06 7.9e+04 2.4e+01 2 10 1 21 1 2 10 1 21 1 1389611 MatGetLocalMat 57 1.0 7.5516e-02 1.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetBrAoCol 36 1.0 6.7780e-02 4.0 0.00e+00 0.0 1.8e+07 2.1e+03 0.0e+00 0 0 4 4 0 0 0 4 4 0 0 KSPGMRESOrthog 120 1.0 4.0225e-01 1.6 1.85e+07 1.1 0.0e+00 0.0e+00 1.2e+02 0 1 0 0 5 0 1 0 0 5 741700 KSPSetUp 45 1.0 8.8595e-01 1.5 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+01 1 0 0 0 1 1 0 0 0 1 0 KSPSolve 3 1.0 5.6979e+00 1.1 1.03e+09 1.1 3.1e+08 9.3e+02 5.2e+02 8 81 66 31 20 8 81 66 31 20 2921121 PCGAMGGraph_AGG 12 1.0 3.3467e+00 1.0 4.58e+06 1.2 7.6e+06 6.7e+02 1.4e+02 5 0 2 1 6 5 0 2 1 6 22144 PCGAMGCoarse_AGG 12 1.0 8.9895e+00 1.0 1.35e+08 1.2 7.7e+07 6.2e+03 4.7e+02 14 10 16 51 18 14 10 16 51 18 233748 PCGAMGProl_AGG 12 1.0 3.9328e+00 1.0 0.00e+00 0.0 9.1e+06 1.5e+03 1.9e+02 6 0 2 1 7 6 0 2 1 7 0 PCGAMGPOpt_AGG 12 1.0 6.3192e+00 1.0 7.43e+07 1.1 3.9e+07 9.0e+02 5.0e+02 9 6 8 4 19 9 6 8 4 19 190048 GAMG: createProl 12 1.0 2.2585e+01 1.0 2.13e+08 1.2 1.3e+08 4.0e+03 1.3e+03 34 16 28 57 50 34 16 28 57 50 149493 Graph 24 1.0 3.3388e+00 1.0 4.58e+06 1.2 7.6e+06 6.7e+02 1.4e+02 5 0 2 1 6 5 0 2 1 6 22196 MIS/Agg 12 1.0 5.6411e-01 1.2 0.00e+00 0.0 4.7e+07 8.7e+02 2.3e+02 1 0 10 4 9 1 0 10 4 9 0 SA: col data 12 1.0 1.2982e+00 1.1 0.00e+00 0.0 5.6e+06 2.0e+03 4.8e+01 2 0 1 1 2 2 0 1 1 2 0 SA: frmProl0 12 1.0 1.6284e+00 1.0 0.00e+00 0.0 3.6e+06 5.5e+02 9.6e+01 2 0 1 0 4 2 0 1 0 4 0 SA: smooth 12 1.0 4.1778e+00 1.0 5.28e+06 1.2 1.4e+07 7.3e+02 1.7e+02 6 0 3 1 7 6 0 3 1 7 20483 GAMG: partLevel 12 1.0 1.3577e+01 1.0 2.90e+07 1.2 2.3e+07 2.1e+03 6.4e+02 20 2 5 5 25 20 2 5 5 25 34470 repartition 9 1.0 1.5048e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 5.4e+01 2 0 0 0 2 2 0 0 0 2 0 Invert-Sort 9 1.0 1.2282e+00 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 3.6e+01 2 0 
0 0 1 2 0 0 0 1 0 Move A 9 1.0 2.8930e+00 1.0 0.00e+00 0.0 5.7e+05 8.4e+02 1.5e+02 4 0 0 0 6 4 0 0 0 6 0 Move P 9 1.0 3.0317e+00 1.0 0.00e+00 0.0 9.4e+05 2.5e+01 1.5e+02 5 0 0 0 6 5 0 0 0 6 0 PCSetUp 6 1.0 3.7378e+01 1.0 2.41e+08 1.2 1.6e+08 3.7e+03 2.0e+03 56 19 33 62 77 56 19 33 62 77 102849 PCSetUpOnBlocks 45 1.0 3.1359e-02137.6 3.30e+01 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 PCApply 45 1.0 4.5105e+00 1.1 9.54e+08 1.1 2.9e+08 8.7e+02 3.8e+02 7 75 62 28 15 7 75 62 28 15 3403589 SFSetGraph 12 1.0 8.7668e-033502.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 SFSetUp 12 1.0 1.6941e-01 2.3 0.00e+00 0.0 4.2e+06 1.4e+03 0.0e+00 0 0 1 1 0 0 0 1 1 0 0 SFBcastBegin 255 1.0 2.5270e-0229.0 0.00e+00 0.0 4.3e+07 8.3e+02 0.0e+00 0 0 9 4 0 0 0 9 4 0 0 SFBcastEnd 255 1.0 1.0079e-0128.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 ------------------------------------------------------------------------------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory Descendants' Mem. Reports information only for process 0. --- Event Stage 0: Main Stage Vector 675 675 28316856 0. Matrix 450 450 596794296 0. Matrix Coarsen 12 12 8208 0. Index Set 294 294 11758608 0. Vec Scatter 102 102 143424 0. Krylov Solver 45 45 415944 0. Preconditioner 33 33 35448 0. Viewer 5 4 3584 0. PetscRandom 24 24 16656 0. Star Forest Graph 12 12 11328 0. ======================================================================================================================== Average time to get PetscTime(): 4.15836e-08 Average time for MPI_Barrier(): 3.40383e-05 Average time for zero size MPI_Send(): 1.88055e-06 #PETSc Option Table entries: --prefix popcorn3d_full_l3_s7 -beta 7.0 -betaest .true. -check .false. -datadt data_distribution_fully_assembled -dm 3 -in_space .true. -ksp_converged_reason -ksp_max_it 500 -ksp_monitor -ksp_norm_type unpreconditioned -ksp_rtol 1.0e-8 -ksp_type cg -ksp_view -l 1 -levelset popcorn -levelsettol 0.0 -log_view -lsdom -0.1 -mg_coarse_sub_pc_factor_mat_ordering_type nd -mg_coarse_sub_pc_type cholesky -mg_levels_esteig_ksp_type cg -n 840 -no_signal_handler -nruns 3 -pc_gamg_agg_nsmooths 1 -pc_gamg_est_ksp_type cg -pc_gamg_process_eq_limit 50 -pc_gamg_square_graph 10 -pc_gamg_type agg -pc_type gamg -petscrc /gpfs/scratch/upc26/upc26229/NEW_STUFF/time_par_cell_agg_ompi.paper/petscrc-0 -tt 1 -uagg .false. -wsolution .false. 
#End of PETSc Option Table entries
[Compiler, configure-option and library information identical to the first log above.]
Ending run at vie nov 16 18:41:15 CET 2018
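For reference, the solver configuration recorded in the two option tables above (-ksp_type cg -ksp_norm_type unpreconditioned -ksp_rtol 1.0e-8 -ksp_max_it 500 -pc_type gamg -pc_gamg_type agg -pc_gamg_agg_nsmooths 1 -pc_gamg_square_graph 10 -pc_gamg_process_eq_limit 50) can also be set up through the C API. The following is only a minimal sketch against the PETSc 3.9 API used in these runs: the toy 1D Laplacian is a hypothetical stand-in for the actual FEMPAR discretization, which is not shown in this thread, and the remaining options (eigenvalue-estimator and coarse-solver settings) are assumed to stay on the command line.

#include <petscksp.h>

int main(int argc,char **argv)
{
  Mat            A;
  Vec            x,b;
  KSP            ksp;
  PC             pc;
  PetscInt       i,n = 1000,Istart,Iend,ncols,cols[3];
  PetscScalar    vals[3];
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc,&argv,NULL,NULL);if (ierr) return ierr;

  /* Toy 1D Laplacian; hypothetical stand-in for the real FEM operator */
  ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
  ierr = MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,n,n);CHKERRQ(ierr);
  ierr = MatSetFromOptions(A);CHKERRQ(ierr);
  ierr = MatSetUp(A);CHKERRQ(ierr);
  ierr = MatGetOwnershipRange(A,&Istart,&Iend);CHKERRQ(ierr);
  for (i=Istart; i<Iend; i++) {
    ncols = 0;
    if (i > 0)   {cols[ncols] = i-1; vals[ncols++] = -1.0;}
    cols[ncols] = i; vals[ncols++] = 2.0;
    if (i < n-1) {cols[ncols] = i+1; vals[ncols++] = -1.0;}
    ierr = MatSetValues(A,1,&i,ncols,cols,vals,INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatCreateVecs(A,&x,&b);CHKERRQ(ierr);
  ierr = VecSet(b,1.0);CHKERRQ(ierr);

  /* Mirror the option table: CG with unpreconditioned norm, GAMG(agg) */
  ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);
  ierr = KSPSetType(ksp,KSPCG);CHKERRQ(ierr);                         /* -ksp_type cg */
  ierr = KSPSetNormType(ksp,KSP_NORM_UNPRECONDITIONED);CHKERRQ(ierr); /* -ksp_norm_type unpreconditioned */
  ierr = KSPSetTolerances(ksp,1.0e-8,PETSC_DEFAULT,PETSC_DEFAULT,500);CHKERRQ(ierr); /* -ksp_rtol, -ksp_max_it */
  ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
  ierr = PCSetType(pc,PCGAMG);CHKERRQ(ierr);                          /* -pc_type gamg */
  ierr = PCGAMGSetType(pc,PCGAMGAGG);CHKERRQ(ierr);                   /* -pc_gamg_type agg */
  ierr = PCGAMGSetNSmooths(pc,1);CHKERRQ(ierr);                       /* -pc_gamg_agg_nsmooths 1 */
  ierr = PCGAMGSetSquareGraph(pc,10);CHKERRQ(ierr);                   /* -pc_gamg_square_graph 10 */
  ierr = PCGAMGSetProcEqLim(pc,50);CHKERRQ(ierr);                     /* -pc_gamg_process_eq_limit 50 */
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr); /* keep -ksp_monitor, -log_view, etc. available */

  ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);

  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&b);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

Calling KSPSetFromOptions() last keeps every option in the tables above overridable from the command line, so the same runs can still be driven with mpiexec and the original petscrc file.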
-------------- next part --------------
[MPID_nem_tmi_pending_ssend_dequeue]: ERROR: can not find matching ssend. context=e8, rank=127, tag=0
[MPID_nem_tmi_pending_ssend_dequeue]: pending ssend list:
(context=e8, rank=127, tag=7ffffd8d, sreq=0x2d580b0)
MPID_nem_tmi_handle_ssend_ack: &pad1=0x7f9560ff2640, &ssend_ack_recv_buffer=0x7f9560ff2648, &pad2=0x7f9560ff2658
MPID_nem_tmi_handle_ssend_ack: pad1=0, pad2=0 (both should be 0)
[The same "can not find matching ssend" errors, pending-ssend lists and ssend_ack pad diagnostics repeat, interleaved, in the output of several hundred other ranks (e.g. 97, 211, 355, 499, 571, 643, 958, 1003, 1051, ...).]
context=e8, rank=418, tag=0 [MPID_nem_tmi_pending_ssend_dequeue]: pending ssend list: (context=e8, rank=418, tag=7ffffd8d, sreq=0x4540ab0) (context=e8, rank=421, tag=7ffffd8d, sreq=0x453fd30) (context=e8, rank=565, tag=7ffffd8d, sreq=0x45491b0) MPID_nem_tmi_handle_ssend_ack: &pad1=0x7f1c32ffc640, &ssend_ack_recv_buffer=0x7f1c32ffc648, &pad2=0x7f1c32ffc658 MPID_nem_tmi_handle_ssend_ack: pad1=0, pad2=0 (both should be 0) [MPID_nem_tmi_pending_ssend_dequeue]: pending ssend list: (context=e8, rank=688, tag=7ffffd8d, sreq=0x53f1740)[MPID_nem_tmi_pending_ssend_dequeue]: pending ssend list:MPID_nem_tmi_handle_ssend_ack: &pad1=0x7f494cd94640, &ssend_ack_recv_buffer=0x7f494cd94648, &pad2=0x7f494cd94658 (context=e8, rank=916, tag=7ffffd8d, sreq=0x5d697a0) (context=e8, rank=691, tag=7ffffd8d, sreq=0x53f2940)[MPID_nem_tmi_pending_ssend_dequeue]: ERROR: can not find matching ssend. context=e8, rank=271, tag=0 [MPID_nem_tmi_pending_ssend_dequeue]: pending ssend list: (context=e8, rank=271, tag=7ffffd8d, sreq=0x30225b0) (context=e8, rank=277, tag=7ffffd8d, sreq=0x302b130) MPID_nem_tmi_handle_ssend_ack: &pad1=0x7f6a3257c640, &ssend_ack_recv_buffer=0x7f6a3257c648, &pad2=0x7f6a3257c658 MPID_nem_tmi_handle_ssend_ack: pad1=0, pad2=0 (both should be 0) MPID_nem_tmi_handle_ssend_ack: pad1=0, pad2=0 (both should be 0) (context=e8, rank=919, tag=7ffffd8d, sreq=0x5d68120)[MPID_nem_tmi_pending_ssend_dequeue]: ERROR: can not find matching ssend. context=e8, rank=532, tag=0 [MPID_nem_tmi_pending_ssend_dequeue]: pending ssend list: (context=e8, rank=532, tag=7ffffd8d, sreq=0x6a31720) MPID_nem_tmi_handle_ssend_ack: &pad1=0x7f887ff24640, &ssend_ack_recv_buffer=0x7f887ff24648, &pad2=0x7f887ff24658 MPID_nem_tmi_handle_ssend_ack: pad1=0, pad2=0 (both should be 0) (context=e8, rank=268, tag=7ffffd8d, sreq=0x2e88ab0) (context=e8, rank=271, tag=7ffffd8d, sreq=0x2e8aeb0) (context=e8, rank=274, tag=7ffffd8d, sreq=0x2e80cb0) MPID_nem_tmi_handle_ssend_ack: &pad1=0x7fc286885640, &ssend_ack_recv_buffer=0x7fc286885648, &pad2=0x7fc286885658 MPID_nem_tmi_handle_ssend_ack: pad1=0, pad2=0 (both should be 0) [MPID_nem_tmi_pending_ssend_dequeue]: ERROR: can not find matching ssend. 
context=e8, rank=1117, tag=0 MPID_nem_tmi_handle_ssend_ack: &pad1=0x7efeef587640, &ssend_ack_recv_buffer=0x7efeef587648, &pad2=0x7efeef587658 MPID_nem_tmi_handle_ssend_ack: &pad1=0x7f722c9cf640, &ssend_ack_recv_buffer=0x7f722c9cf648, &pad2=0x7f722c9cf658 MPID_nem_tmi_handle_ssend_ack: pad1=0, pad2=0 (both should be 0) MPID_nem_tmi_handle_ssend_ack: pad1=0, pad2=0 (both should be 0) [MPID_nem_tmi_pending_ssend_dequeue]: pending ssend list: (context=e8, rank=976, tag=7ffffd8d, sreq=0x67744a0) (context=e8, rank=1117, tag=7ffffd8d, sreq=0x6775b20) (context=e8, rank=1123, tag=7ffffd8d, sreq=0x676eaa0) MPID_nem_tmi_handle_ssend_ack: &pad1=0x7f8181246640, &ssend_ack_recv_buffer=0x7f8181246648, &pad2=0x7f8181246658 MPID_nem_tmi_handle_ssend_ack: pad1=0, pad2=0 (both should be 0) =================================================================================== = BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES = PID 455075 RUNNING AT s08r1b31 = EXIT CODE: 9 = CLEANING UP REMAINING PROCESSES = YOU CAN IGNORE THE BELOW CLEANUP MESSAGES =================================================================================== Intel(R) MPI Library troubleshooting guide: https://software.intel.com/node/561764 =================================================================================== -------------- next part -------------- [424]PETSC ERROR: ------------------------------------------------------------------------ [424]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range [424]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [424]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind [424]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [726]PETSC ERROR: [424]PETSC ERROR: likely location of problem given in stack below [424]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [424]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [424]PETSC ERROR: INSTEAD the line number of the start of the function [424]PETSC ERROR: is given. [424]PETSC ERROR: [424] PetscCommBuildTwoSidedFReq_Ibarrier line 367 /gpfs/scratch/upc26/upc26229/petsc_cell_agg/debug/petsc-3.9.0/src/sys/utils/mpits.c [424]PETSC ERROR: [424] PetscCommBuildTwoSidedFReq line 548 /gpfs/scratch/upc26/upc26229/petsc_cell_agg/debug/petsc-3.9.0/src/sys/utils/mpits.c [726]PETSC ERROR: [424]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [424]PETSC ERROR: Signal received [424]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. 
[424]PETSC ERROR: Petsc Release Version 3.9.0, Apr, 07, 2018
[424]PETSC ERROR: [1693]PETSC ERROR: ------------------------------------------------------------------------
[424]PETSC ERROR: Configure options --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpiifort -with-blaslapack-dir=/apps/INTEL/2017.4/mkl --with-debugging --with-x=0 --with-shared-libraries=1 --with-mpi=1 --with-64-bit-indices --CFLAGS=-traceback --FFLAGS=-traceback
[424]PETSC ERROR: #1 User provided function() line 0 in unknown file
[726]PETSC ERROR: application called MPI_Abort(MPI_COMM_WORLD, 59) - process 424
forrtl: severe (174): SIGSEGV, segmentation fault occurred
Image              PC                Routine            Line     Source
libpetsc.so.3.9.0  00007F4608388A9C  for__signal_handl  Unknown  Unknown
libpthread-2.22.s  00007F460DC5DB10  Unknown            Unknown  Unknown
libmpi.so.12.0     00007F460E3D1CFF  Unknown            Unknown  Unknown
libmpi.so.12.0     00007F460E736050  Unknown            Unknown  Unknown
libmpi.so.12.0     00007F460E5EABD0  Unknown            Unknown  Unknown
libmpi.so.12       00007F460E3AEBC8  PMPIDI_CH3I_Progr  Unknown  Unknown
libmpi.so.12       00007F460E72B90C  MPI_Testall        Unknown  Unknown
libpetsc.so.3.9.0  00007F4607391FFE  PetscCommBuildTwo  Unknown  Unknown
libpetsc.so.3.9.0  00007F460793A613  Unknown            Unknown  Unknown
libpetsc.so.3.9.0  00007F4607936EC4  Unknown            Unknown  Unknown
libpetsc.so.3.9.0  00007F46078AEC47  MatAssemblyBegin_  Unknown  Unknown
libpetsc.so.3.9.0  00007F4607587E98  MatAssemblyBegin   Unknown  Unknown
libpetsc.so.3.9.0  00007F4607921A9D  Unknown            Unknown  Unknown
libpetsc.so.3.9.0  00007F460791A557  Unknown            Unknown  Unknown
libpetsc.so.3.9.0  00007F4607598DA8  MatTransposeMatMu  Unknown  Unknown
libpetsc.so.3.9.0  00007F4607DC7AE0  Unknown            Unknown  Unknown
libpetsc.so.3.9.0  00007F4607DBFFC8  PCSetUp_GAMG       Unknown  Unknown
libpetsc.so.3.9.0  00007F4607E36A74  PCSetUp            Unknown  Unknown
libpetsc.so.3.9.0  00007F4607EEAA7E  KSPSetUp           Unknown  Unknown
libpetsc.so.3.9.0  00007F4607EFF40D  kspsetup_          Unknown  Unknown
par_test_poisson_  0000000000440FC6  Unknown            Unknown  Unknown
par_test_poisson_  000000000045EDCE  Unknown            Unknown  Unknown
par_test_poisson_  000000000040E4F5  Unknown            Unknown  Unknown
par_test_poisson_  00000000004055AE  Unknown            Unknown  Unknown
libc-2.22.so       00007F46034FB6E5  __libc_start_main  Unknown  Unknown
par_test_poisson_  00000000004054B9  Unknown            Unknown  Unknown

From knepley at gmail.com  Mon Nov 19 05:26:00 2018
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 19 Nov 2018 06:26:00 -0500
Subject: [petsc-users] PETSc (3.9.0) GAMG weak scaling test issue
In-Reply-To: <5BF295DC.30801@cimne.upc.edu>
References: <5BE30C87.10204@cimne.upc.edu> <5BE420E4.8020800@cimne.upc.edu> <5BE42729.3090307@cimne.upc.edu> <5BF295DC.30801@cimne.upc.edu>
Message-ID: 

On Mon, Nov 19, 2018 at 5:52 AM "Alberto F. Martín" wrote:

> Dear Mark, Dear Matthew,
>
> in order to discard load imbalance as the cause of the reported weak
> scaling issue in the GAMG preconditioner set-up stage (as you said, we
> were feeding GAMG with a suboptimal mesh distribution, having empty
> processors, among others), we simplified the weak scaling test by
> considering the standard body-fitted trilinear (Q1) FE discretization
> of the 3D Poisson problem on a unit cube discretized with a *uniform,
> structured hexahedral mesh, partitioned optimally (by hand) among
> processors*, with a fixed load of 30**3 hexahedra/core. Thus, all
> processors have the same load (up to strong Dirichlet boundary
> conditions on the subdomains touching the global boundary), and the
> edge-cut is minimum.
> We used the following GAMG preconditioner options:
>
> -pc_type gamg
> -pc_gamg_type agg
> -pc_gamg_est_ksp_type cg
> -mg_levels_esteig_ksp_type cg
> -mg_coarse_sub_pc_type cholesky
> -mg_coarse_sub_pc_factor_mat_ordering_type nd
> -pc_gamg_process_eq_limit 50
> -pc_gamg_square_graph 10
> -pc_gamg_agg_nsmooths 1
>
> The results that we obtained for 48 (4x4x3 subdomains), 10,368 (24x24x18
> subdomains), and 16,464 (28x28x21 subdomains) CPU cores are as follows:
>
> **preconditioner set up**
> [0.9844961860, *7.017674042*, *12.10154881*]
>
> **PCG stage**
> [0.5849160422, 1.515251888, 1.859617710]
>
> **number of PCG iterations**
> [9,14,15]
>
> As you can observe, *there is still a significant time increase when
> scaling the problem from 48 to 10K/16K MPI tasks for the preconditioner
> setup stage.* This time increase is not as significant for the PCG stage.

Actually, it's almost perfect for PCG, given that

14/9 * 0.98 = 1.52 (Eff 100%)
15/14 * 1.52 = 1.63 (Eff 88%)

Mark would have better comments on the scalability of the setup stage.

> Please find attached the combined output of -ksp_view and -log_view for
> these three points of the weak scaling curve.
>
> Given these results, I am starting to suspect that something within the
> underlying software + hardware stack might be responsible for this. I am
> using OpenMPI 1.10.7 + Intel compilers version 18.0. The underlying
> supercomputer is MN-IV at BSC
> (https://www.bsc.es/marenostrum/marenostrum/technical-information).
> Have you ever conducted a weak scaling test of GAMG with OpenMPI on a
> similar computer architecture? Can you share your experience with us?
> (versions tested, outcome, etc.)
>
> We also tried an alternative MPI library, Intel(R) MPI Library for Linux*
> OS, Version 2018 Update 4 Build 20180823 (id: 18555), *without success.*
> For this MPI library, the preconditioner set-up stage crashes (find
> attached stack frames, and internal MPI library errors) for the largest
> two core counts (it did not crash for the 48 CPU cores case), while it
> did not crash with OpenMPI 1.10.7. Have you ever experienced errors like
> the ones attached? Is there any way to set up PETSc such that the
> subroutine that crashes is replaced by an alternative implementation of
> the same concept? (this would be just a workaround).

It certainly feels like an MPI (or driver) bug:

libmpi.so.12       00007F460E3AEBC8  PMPIDI_CH3I_Progr  Unknown  Unknown
libmpi.so.12       00007F460E72B90C  MPI_Testall        Unknown  Unknown
libpetsc.so.3.9.0  00007F4607391FFE  PetscCommBuildTwo  Unknown  Unknown

You can try another variant using -build_twosided. I think ibarrier is
currently the default if supported, but -help should tell you.

Thanks,

Matt

> It might be a BUG in the Intel MPI library, although I cannot confirm it.
> We also got these errors with the unfitted FEM + space-filling curves
> version of our code.
>
> Thanks a lot for your help and valuable feedback!
> Best regards,
> Alberto.
>
> On 08/11/18 17:29, Mark Adams wrote:
>
>> I did not configure PETSc with ParMetis support. Should I?
>>
>> I figured it out when I tried to use "-pc_gamg_repartition". PETSc
>> complained that it was not compiled with ParMetis support.
>
> You need ParMetis, or some parallel mesh partitioner, configured to use
> repartitioning. I would guess that "-pc_gamg_repartition" would not help
> and might hurt, because it just does the coarse grids, not the fine grid.
> But it is worth a try.
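For reference, a sketch of what such a try might look like on the command
line (./your_app stands in for the actual executable, the process count is
illustrative, and the option names are as in PETSc 3.9):

   mpiexec -n 10368 ./your_app -pc_type gamg -pc_gamg_repartition true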
Just configure with --download-parmetis

> The problem is that you are using space filling curves on the background
> grid and are getting empty processors. Right? The mesh setup phase is not
> super optimized, but your times
>
> And you said in your attachment that you added the near null space, but
> just the constant vector. I trust you mean the three translational rigid
> body modes. That is the default and so you should not see any difference.
> If you added one vector of all 1s then that would be bad. You also want
> the rotational rigid body modes. Now, you are converging pretty well and
> if your solution does not have much rotation in it the rotational modes
> are not needed, but they are required for optimality in general.

--
Alberto F. Martín-Huertas
Senior Researcher, PhD. Computational Science
Centre Internacional de Mètodes Numèrics a l'Enginyeria (CIMNE)
Parc Mediterrani de la Tecnologia, UPC
Esteve Terradas 5, Building C3, Office 215,
08860 Castelldefels (Barcelona, Spain)
Tel.: (+34) 9341 34223
e-mail: amartin at cimne.upc.edu

FEMPAR project co-founder
web: http://www.fempar.org

________________
IMPORTANT NOTICE
All personal data contained on this mail will be processed confidentially
and registered in a file property of CIMNE in order to manage corporate
communications. You may exercise the rights of access, rectification,
erasure and object by letter sent to Ed. C1 Campus Norte UPC. Gran Capitán
s/n Barcelona.

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From ivan.voznyuk.work at gmail.com  Mon Nov 19 06:24:21 2018
From: ivan.voznyuk.work at gmail.com (Ivan Voznyuk)
Date: Mon, 19 Nov 2018 13:24:21 +0100
Subject: [petsc-users] petsc4py help with parallel execution
In-Reply-To: References: Message-ID: 

Hi guys,

So, I tried to configure PETSc with the advised options; make all and make
tests went well.

However, at the python3 ./simple_test.py stage I obtained error after error
related to the import of the MKL libraries. After solving them one by one,
I finished with a critical violation error and abandoned that route.

Instead, I fully reinstalled the system and did the following:
1. Installed the system's OpenMPI
2. Installed the system's OpenBLAS
3. Configured PETSc with the MPI compilers, downloading MUMPS and
ScaLAPACK, with complex values
4. Made sure that nothing in the parameters was sequential
And it worked!!

So, thank you, colleagues, for
1. Helping me with the definition of the problem
2. Your advice on solving it.

In the near future I will try to do the same but with Intel MKL, because
apparently it may be much faster! And thus it represents a real interest!

So I'll try to do it and share the results with you!

Thanks,
Ivan

On Fri, Nov 16, 2018, 19:18 Dave May

>
> On Fri, 16 Nov 2018 at 19:02, Ivan Voznyuk
> wrote:
>
>> Hi Satish,
>> Thanks for your reply.
>>
>> Bad news... I tested 2 solutions that you proposed, none has worked.
>>
>
> You don't still have
> OMP_NUM_THREADS=1
> set in your environment do you?
>
> Can you print the value of this env variable from within your python code
> and confirm it's not 1
>
>>
>> 1. --with-blaslapack-dir=/opt/intel/mkl
>> --with-mkl_pardiso-dir=/opt/intel/mkl installed well, without any problems.
>> However, the code is still turning in sequential way.
>> 2.
When I changed -lmkl_sequential to -lmkl_intel_thread -liomp, he at >> first did not find the liomp, so I had to create a symbolic link of libiomp5.so >> to /lib. >> At the launching of the .py code I had to go with: >> export >> LD_PRELOAD=/opt/intel/mkl/lib/intel64/libmkl_core.so:/opt/intel/mkl/lib/intel64/libmkl_sequential.so >> and >> export LD_LIBRARY_PATH=/opt/petsc/petsc1/arch-linux2-c-debug/lib/ >> >> But still it does not solve the given problem and code is still running >> sequentially... >> >> May be you have some other ideas? >> >> Thanks, >> Ivan >> >> >> >> >> On Fri, Nov 16, 2018 at 6:11 PM Balay, Satish wrote: >> >>> Yes PETSc prefers sequential MKL - as MPI handles parallelism. >>> >>> One way to trick petsc configure to use threaded MKL is to enable >>> pardiso. i.e: >>> >>> --with-blaslapack-dir=/opt/intel/mkl >>> --with-mkl_pardiso-dir=/opt/intel/mkl >>> >>> >>> http://ftp.mcs.anl.gov/pub/petsc/nightlylogs/archive/2018/11/15/configure_master_arch-pardiso_grind.log >>> >>> BLAS/LAPACK: -Wl,-rpath,/soft/com/packages/intel/16/u3/mkl/lib/intel64 >>> -L/soft/com/packages/intel/16/u3/mkl/lib/intel64 -lmkl_intel_lp64 >>> -lmkl_core -lmkl_intel_thread -liomp5 -ldl -lpthread >>> >>> Or you can manually specify the correct MKL library list [with >>> threading] via --with-blaslapack-lib option. >>> >>> Satish >>> >>> On Fri, 16 Nov 2018, Ivan Voznyuk via petsc-users wrote: >>> >>> > Hi, >>> > You were totally right: no miracle, parallelization does come from >>> > multithreading. We checked Option 1/: played with OMP_NUM_THREADS=1 it >>> > changed computational time. >>> > >>> > So, I reinstalled everything (starting with Ubuntu ending with petsc) >>> and >>> > configured the following things: >>> > >>> > - installed system's ompenmpi >>> > - installed Intel MKL Blas / Lapack >>> > - configured PETSC as ./configure --with-cc=mpicc --with-fc=mpif90 >>> > --with-cxx=mpicxx --with-blas-lapack-dir=/opt/intel/mkl/lib/intel64 >>> > --download-scalapack --download-mumps --with-hwloc --with-shared >>> > --with-openmp=1 --with-pthread=1 --with-scalar-type=complex >>> > hoping that it would take into account blas multithreading >>> > - installed petsc4py >>> > >>> > However, I do not get any parallelization... >>> > What I tried to do so far unsuccessfully : >>> > - play with OMP_NUM_THREADS >>> > - reinstall the system >>> > - ldd PETSc.cpython-35m-x86_64-linux-gnu.so yields lld_result.txt >>> (here >>> > attached) >>> > I noted that libmkl_sequential.so library there. Do you think this is >>> > normal? >>> > - I found a similar problem reported here: >>> > https://lists.mcs.anl.gov/pipermail/petsc-users/2016-March/028803.html >>> To >>> > solve this problem, developers recommended to replace -lmkl_sequential >>> to >>> > -lmkl_intel_thread options in PETSC_ARCH/lib/conf/petscvariables. >>> However, >>> > I did not find something that would be named like this (it might be a >>> > change of version) >>> > - Anyway, I replaced lmkl_sequential to lmkl_intel_thread in every >>> file of >>> > PETSC, but it changed nothing. >>> > >>> > As a result, in the new make.log (here attached ) I have a parameter >>> > #define PETSC_HAVE_LIBMKL_SEQUENTIAL 1 and option -lmkl_sequential >>> > >>> > Do you have any idea of what I should change in the initial options in >>> > order to obtain the blas multithreding parallelization? >>> > >>> > Thanks a lot for your help! 
>>> > >>> > Ivan >>> > >>> > >>> > >>> > >>> > >>> > >>> > On Fri, Nov 16, 2018 at 1:25 AM Dave May >>> wrote: >>> > >>> > > >>> > > >>> > > On Thu, 15 Nov 2018 at 17:44, Ivan via petsc-users < >>> > > petsc-users at mcs.anl.gov> wrote: >>> > > >>> > >> Hi Stefano, >>> > >> >>> > >> In fact, yes, we look at the htop output (and the resulting >>> computational >>> > >> time ofc). >>> > >> >>> > >> In our code we use MUMPS, which indeed depends on blas / lapack. So >>> I >>> > >> think this might be it! >>> > >> >>> > >> I will definetely check it (I mean the difference between our MUMPS, >>> > >> blas, lapack). >>> > >> >>> > >> If you have an idea of how we can verify on his PC that the source >>> of his >>> > >> parallelization does come from BLAS, please do not hesitate to tell >>> me! >>> > >> >>> > > >>> > > Option 1/ >>> > > * Set this environment variable >>> > > export OMP_NUM_THREADS=1 >>> > > * Re-run your "parallel" test. >>> > > * If the performance differs (job runs slower) compared with your >>> previous >>> > > run where you inferred parallelism was being employed, you can safely >>> > > assume that the parallelism observed comes from threads >>> > > >>> > > Option 2/ >>> > > * Re-configure PETSc to use a known BLAS implementation which does >>> not >>> > > support threads >>> > > * Re-compile PETSc >>> > > * Re-run your parallel test >>> > > * If the performance differs (job runs slower) compared with your >>> previous >>> > > run where you inferred parallelism was being employed, you can safely >>> > > assume that the parallelism observed comes from threads >>> > > >>> > > Option 3/ >>> > > * Use a PC which does not depend on BLAS at all, >>> > > e.g. -pc_type jacobi -pc_type bjacobi >>> > > * If the performance differs (job runs slower) compared with your >>> previous >>> > > run where you inferred parallelism was being employed, you can safely >>> > > assume that the parallelism observed comes from BLAS + threads >>> > > >>> > > >>> > > >>> > >> Thanks! >>> > >> >>> > >> Ivan >>> > >> On 15/11/2018 18:24, Stefano Zampini wrote: >>> > >> >>> > >> If you say your program is parallel by just looking at the output >>> from >>> > >> the top command, you are probably linking against a multithreaded >>> blas >>> > >> library >>> > >> >>> > >> Il giorno Gio 15 Nov 2018, 20:09 Matthew Knepley via petsc-users < >>> > >> petsc-users at mcs.anl.gov> ha scritto: >>> > >> >>> > >>> On Thu, Nov 15, 2018 at 11:59 AM Ivan Voznyuk < >>> > >>> ivan.voznyuk.work at gmail.com> wrote: >>> > >>> >>> > >>>> Hi Matthew, >>> > >>>> >>> > >>>> Does it mean that by using just command python3 simple_code.py >>> (without >>> > >>>> mpiexec) you *cannot* obtain a parallel execution? >>> > >>>> >>> > >>> >>> > >>> As I wrote before, its not impossible. You could be directly >>> calling >>> > >>> PMI, but I do not think you are doing that. >>> > >>> >>> > >>> >>> > >>>> It s been 5 days we are trying to understand with my colleague >>> how he >>> > >>>> managed to do so. >>> > >>>> It means that by using simply python3 simple_code.py he gets 8 >>> > >>>> processors workiing. >>> > >>>> By the way, we wrote in his code few lines: >>> > >>>> rank = PETSc.COMM_WORLD.Get_rank() >>> > >>>> size = PETSc.COMM_WORLD.Get_size() >>> > >>>> and we got rank = 0, size = 1 >>> > >>>> >>> > >>> >>> > >>> This is MPI telling you that you are only running on 1 processes. >>> > >>> >>> > >>> >>> > >>>> However, we compilator arrives to KSP.solve(), somehow it turns >>> on 8 >>> > >>>> processors. 
>>> > >>>> >>> > >>> >>> > >>> Why do you think its running on 8 processes? >>> > >>> >>> > >>> >>> > >>>> This problem is solved on his PC in 5-8 sec (in parallel, using >>> *python3 >>> > >>>> simple_code.py*), on mine it takes 70-90 secs (in sequantial, but >>> with >>> > >>>> the same command *python3 simple_code.py*) >>> > >>>> >>> > >>> >>> > >>> I think its much more likely that there are differences in the >>> solver >>> > >>> (use -ksp_view to see exactly what solver was used), then >>> > >>> to think it is parallelism. Moreover, you would never ever ever >>> see that >>> > >>> much speedup on a laptop since all these computations >>> > >>> are bandwidth limited. >>> > >>> >>> > >>> Thanks, >>> > >>> >>> > >>> Matt >>> > >>> >>> > >>> >>> > >>>> So, conclusion is that on his computer this code works in the >>> same way >>> > >>>> as scipy: all the code is executed in sequantial mode, but when >>> it comes to >>> > >>>> solution of system of linear equations, it runs on all available >>> > >>>> processors. All this with just running python3 my_code.py >>> (without any >>> > >>>> mpi-smth) >>> > >>>> >>> > >>>> Is it an exception / abnormal behavior? I mean, is it something >>> > >>>> irregular that you, developers, have never seen? >>> > >>>> >>> > >>>> Thanks and have a good evening! >>> > >>>> Ivan >>> > >>>> >>> > >>>> P.S. I don't think I know the answer regarding Scipy... >>> > >>>> >>> > >>>> >>> > >>>> On Thu, Nov 15, 2018 at 2:39 PM Matthew Knepley < >>> knepley at gmail.com> >>> > >>>> wrote: >>> > >>>> >>> > >>>>> On Thu, Nov 15, 2018 at 8:07 AM Ivan Voznyuk < >>> > >>>>> ivan.voznyuk.work at gmail.com> wrote: >>> > >>>>> >>> > >>>>>> Hi Matthew, >>> > >>>>>> Thanks for your reply! >>> > >>>>>> >>> > >>>>>> Let me precise what I mean by defining few questions: >>> > >>>>>> >>> > >>>>>> 1. In order to obtain a parallel execution of simple_code.py, >>> do I >>> > >>>>>> need to go with mpiexec python3 simple_code.py, or I can just >>> launch >>> > >>>>>> python3 simple_code.py? >>> > >>>>>> >>> > >>>>> >>> > >>>>> mpiexec -n 2 python3 simple_code.py >>> > >>>>> >>> > >>>>> >>> > >>>>>> 2. This simple_code.py consists of 2 parts: a) preparation of >>> matrix >>> > >>>>>> b) solving the system of linear equations with PETSc. If I >>> launch mpirun >>> > >>>>>> (or mpiexec) -np 8 python3 simple_code.py, I suppose that I >>> will basically >>> > >>>>>> obtain 8 matrices and 8 systems to solve. However, I need to >>> prepare only >>> > >>>>>> one matrix, but launch this code in parallel on 8 processors. >>> > >>>>>> >>> > >>>>> >>> > >>>>> When you create the Mat object, you give it a communicator (here >>> > >>>>> PETSC_COMM_WORLD). That allows us to distribute the data. This >>> is all >>> > >>>>> covered extensively in the manual and the online tutorials, as >>> well as the >>> > >>>>> example code. >>> > >>>>> >>> > >>>>> >>> > >>>>>> In fact, here attached you will find a similar code >>> (scipy_code.py) >>> > >>>>>> with only one difference: the system of linear equations is >>> solved with >>> > >>>>>> scipy. So when I solve it, I can clearly see that the solution >>> is obtained >>> > >>>>>> in a parallel way. However, I do not use the command mpirun (or >>> mpiexec). I >>> > >>>>>> just go with python3 scipy_code.py. >>> > >>>>>> >>> > >>>>> >>> > >>>>> Why do you think its running in parallel? 
>>> > >>>>> >>> > >>>>> Thanks, >>> > >>>>> >>> > >>>>> Matt >>> > >>>>> >>> > >>>>> >>> > >>>>>> In this case, the first part (creation of the sparse matrix) is >>> not >>> > >>>>>> parallel, whereas the solution of system is found in a parallel >>> way. >>> > >>>>>> So my question is, Do you think that it s possible to have the >>> same >>> > >>>>>> behavior with PETSC? And what do I need for this? >>> > >>>>>> >>> > >>>>>> I am asking this because for my colleague it worked! It means >>> that he >>> > >>>>>> launches the simple_code.py on his computer using the command >>> python3 >>> > >>>>>> simple_code.py (and not mpi-smth python3 simple_code.py) and he >>> obtains a >>> > >>>>>> parallel execution of the same code. >>> > >>>>>> >>> > >>>>>> Thanks for your help! >>> > >>>>>> Ivan >>> > >>>>>> >>> > >>>>>> >>> > >>>>>> On Thu, Nov 15, 2018 at 11:54 AM Matthew Knepley < >>> knepley at gmail.com> >>> > >>>>>> wrote: >>> > >>>>>> >>> > >>>>>>> On Thu, Nov 15, 2018 at 4:53 AM Ivan Voznyuk via petsc-users < >>> > >>>>>>> petsc-users at mcs.anl.gov> wrote: >>> > >>>>>>> >>> > >>>>>>>> Dear PETSC community, >>> > >>>>>>>> >>> > >>>>>>>> I have a question regarding the parallel execution of >>> petsc4py. >>> > >>>>>>>> >>> > >>>>>>>> I have a simple code (here attached simple_code.py) which >>> solves a >>> > >>>>>>>> system of linear equations Ax=b using petsc4py. To execute >>> it, I use the >>> > >>>>>>>> command python3 simple_code.py which yields a sequential >>> performance. With >>> > >>>>>>>> a colleague of my, we launched this code on his computer, and >>> this time the >>> > >>>>>>>> execution was in parallel. Although, he used the same command >>> python3 >>> > >>>>>>>> simple_code.py (without mpirun, neither mpiexec). >>> > >>>>>>>> >>> > >>>>>>> I am not sure what you mean. To run MPI programs in parallel, >>> you >>> > >>>>>>> need a launcher like mpiexec or mpirun. There are Python >>> programs (like >>> > >>>>>>> nemesis) that use the launcher API directly (called PMI), but >>> that is not >>> > >>>>>>> part of petsc4py. >>> > >>>>>>> >>> > >>>>>>> Thanks, >>> > >>>>>>> >>> > >>>>>>> Matt >>> > >>>>>>> >>> > >>>>>>>> My configuration: Ubuntu x86_64 Ubuntu 16.04, Intel Core i7, >>> PETSc >>> > >>>>>>>> 3.10.2, PETSC_ARCH=arch-linux2-c-debug, petsc4py 3.10.0 in >>> virtualenv >>> > >>>>>>>> >>> > >>>>>>>> In order to parallelize it, I have already tried: >>> > >>>>>>>> - use 2 different PCs >>> > >>>>>>>> - use Ubuntu 16.04, 18.04 >>> > >>>>>>>> - use different architectures (arch-linux2-c-debug, >>> > >>>>>>>> linux-gnu-c-debug, etc) >>> > >>>>>>>> - ofc use different configurations (my present config can be >>> found >>> > >>>>>>>> in make.log that I attached here) >>> > >>>>>>>> - mpi from mpich, openmpi >>> > >>>>>>>> >>> > >>>>>>>> Nothing worked. >>> > >>>>>>>> >>> > >>>>>>>> Do you have any ideas? >>> > >>>>>>>> >>> > >>>>>>>> Thanks and have a good day, >>> > >>>>>>>> Ivan >>> > >>>>>>>> >>> > >>>>>>>> -- >>> > >>>>>>>> Ivan VOZNYUK >>> > >>>>>>>> PhD in Computational Electromagnetics >>> > >>>>>>>> >>> > >>>>>>> >>> > >>>>>>> >>> > >>>>>>> -- >>> > >>>>>>> What most experimenters take for granted before they begin >>> their >>> > >>>>>>> experiments is infinitely more interesting than any results to >>> which their >>> > >>>>>>> experiments lead. 
>>> > >>>>>>> -- Norbert Wiener >>> > >>>>>>> >>> > >>>>>>> https://www.cse.buffalo.edu/~knepley/ >>> > >>>>>>> >>> > >>>>>>> >>> > >>>>>> >>> > >>>>>> >>> > >>>>>> -- >>> > >>>>>> Ivan VOZNYUK >>> > >>>>>> PhD in Computational Electromagnetics >>> > >>>>>> +33 (0)6.95.87.04.55 >>> > >>>>>> My webpage >>> > >>>>>> My LinkedIn >>> > >>>>>> >>> > >>>>> >>> > >>>>> >>> > >>>>> -- >>> > >>>>> What most experimenters take for granted before they begin their >>> > >>>>> experiments is infinitely more interesting than any results to >>> which their >>> > >>>>> experiments lead. >>> > >>>>> -- Norbert Wiener >>> > >>>>> >>> > >>>>> https://www.cse.buffalo.edu/~knepley/ >>> > >>>>> >>> > >>>>> >>> > >>>> >>> > >>>> >>> > >>>> -- >>> > >>>> Ivan VOZNYUK >>> > >>>> PhD in Computational Electromagnetics >>> > >>>> +33 (0)6.95.87.04.55 >>> > >>>> My webpage >>> > >>>> My LinkedIn >>> > >>>> >>> > >>> >>> > >>> >>> > >>> -- >>> > >>> What most experimenters take for granted before they begin their >>> > >>> experiments is infinitely more interesting than any results to >>> which their >>> > >>> experiments lead. >>> > >>> -- Norbert Wiener >>> > >>> >>> > >>> https://www.cse.buffalo.edu/~knepley/ >>> > >>> >>> > >>> >>> > >> >>> > >>> > >>> >>> >> >> -- >> Ivan VOZNYUK >> PhD in Computational Electromagnetics >> +33 (0)6.95.87.04.55 >> My webpage >> My LinkedIn >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Mon Nov 19 06:30:19 2018 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 19 Nov 2018 07:30:19 -0500 Subject: [petsc-users] petsc4py help with parallel execution In-Reply-To: References: Message-ID: On Mon, Nov 19, 2018 at 7:25 AM Ivan Voznyuk via petsc-users < petsc-users at mcs.anl.gov> wrote: > Hi guys, > > So, I tried to configure petsc with the advised options, make all and make > tests went well. > > However at the phase of python3 ./simple_test.py I obtained error after > error related to the import of MKL libraries. Solving them one after > another one, I finished with a critical violation error and abondeneed. > > Instead, I reinstalled fully the system and done the following: > 1. Install system's openmpi > 2. Installed system's OpenBLAS > 3. Configured petsc with mpi compilators, download mumps, scalapack, with > complex values > 4. Made sure that in parameters there is nothing sequential > And it worked!! > > So, thank you colleagues for > 1. Helping me with definition of the problem > 2. Your advices for solving this problem. > > In close future I will try to do the same but with Intel MKL. Cuz > apperently, it may be much faster! And thus it represents a real interest! > I would be careful with this conclusion. All our tests, and everything I have ever read, indicates that it is not faster than just actually running in parallel, mpiexec -n

Matt > So I ll try to do it and share with you! > > Thanks, > Ivan > > On Fri, Nov 16, 2018, 19:18 Dave May >> >> >> On Fri, 16 Nov 2018 at 19:02, Ivan Voznyuk >> wrote: >> >>> Hi Satish, >>> Thanks for your reply. >>> >>> Bad news... I tested 2 solutions that you proposed, none has worked. >>> >> >> You don't still have >> OMP_NUM_THREADS=1 >> set in your environment do you? >> >> Can you print the value of this env variable from within your python code >> and confirm it's not 1 >> >> >> >>> >>> 1. --with-blaslapack-dir=/opt/intel/mkl >>> --with-mkl_pardiso-dir=/opt/intel/mkl installed well, without any problems. >>> However, the code is still turning in sequential way. >>> 2. When I changed -lmkl_sequential to -lmkl_intel_thread -liomp, he at >>> first did not find the liomp, so I had to create a symbolic link of libiomp5.so >>> to /lib. >>> At the launching of the .py code I had to go with: >>> export >>> LD_PRELOAD=/opt/intel/mkl/lib/intel64/libmkl_core.so:/opt/intel/mkl/lib/intel64/libmkl_sequential.so >>> and >>> export LD_LIBRARY_PATH=/opt/petsc/petsc1/arch-linux2-c-debug/lib/ >>> >>> But still it does not solve the given problem and code is still running >>> sequentially... >>> >>> May be you have some other ideas? >>> >>> Thanks, >>> Ivan >>> >>> >>> >>> >>> On Fri, Nov 16, 2018 at 6:11 PM Balay, Satish wrote: >>> >>>> Yes PETSc prefers sequential MKL - as MPI handles parallelism. >>>> >>>> One way to trick petsc configure to use threaded MKL is to enable >>>> pardiso. i.e: >>>> >>>> --with-blaslapack-dir=/opt/intel/mkl >>>> --with-mkl_pardiso-dir=/opt/intel/mkl >>>> >>>> >>>> http://ftp.mcs.anl.gov/pub/petsc/nightlylogs/archive/2018/11/15/configure_master_arch-pardiso_grind.log >>>> >>>> BLAS/LAPACK: -Wl,-rpath,/soft/com/packages/intel/16/u3/mkl/lib/intel64 >>>> -L/soft/com/packages/intel/16/u3/mkl/lib/intel64 -lmkl_intel_lp64 >>>> -lmkl_core -lmkl_intel_thread -liomp5 -ldl -lpthread >>>> >>>> Or you can manually specify the correct MKL library list [with >>>> threading] via --with-blaslapack-lib option. >>>> >>>> Satish >>>> >>>> On Fri, 16 Nov 2018, Ivan Voznyuk via petsc-users wrote: >>>> >>>> > Hi, >>>> > You were totally right: no miracle, parallelization does come from >>>> > multithreading. We checked Option 1/: played with OMP_NUM_THREADS=1 it >>>> > changed computational time. >>>> > >>>> > So, I reinstalled everything (starting with Ubuntu ending with petsc) >>>> and >>>> > configured the following things: >>>> > >>>> > - installed system's ompenmpi >>>> > - installed Intel MKL Blas / Lapack >>>> > - configured PETSC as ./configure --with-cc=mpicc --with-fc=mpif90 >>>> > --with-cxx=mpicxx --with-blas-lapack-dir=/opt/intel/mkl/lib/intel64 >>>> > --download-scalapack --download-mumps --with-hwloc --with-shared >>>> > --with-openmp=1 --with-pthread=1 --with-scalar-type=complex >>>> > hoping that it would take into account blas multithreading >>>> > - installed petsc4py >>>> > >>>> > However, I do not get any parallelization... >>>> > What I tried to do so far unsuccessfully : >>>> > - play with OMP_NUM_THREADS >>>> > - reinstall the system >>>> > - ldd PETSc.cpython-35m-x86_64-linux-gnu.so yields lld_result.txt >>>> (here >>>> > attached) >>>> > I noted that libmkl_sequential.so library there. Do you think this is >>>> > normal? 
>>>> > - I found a similar problem reported here: >>>> > >>>> https://lists.mcs.anl.gov/pipermail/petsc-users/2016-March/028803.html >>>> To >>>> > solve this problem, developers recommended to replace >>>> -lmkl_sequential to >>>> > -lmkl_intel_thread options in PETSC_ARCH/lib/conf/petscvariables. >>>> However, >>>> > I did not find something that would be named like this (it might be a >>>> > change of version) >>>> > - Anyway, I replaced lmkl_sequential to lmkl_intel_thread in every >>>> file of >>>> > PETSC, but it changed nothing. >>>> > >>>> > As a result, in the new make.log (here attached ) I have a parameter >>>> > #define PETSC_HAVE_LIBMKL_SEQUENTIAL 1 and option -lmkl_sequential >>>> > >>>> > Do you have any idea of what I should change in the initial options in >>>> > order to obtain the blas multithreding parallelization? >>>> > >>>> > Thanks a lot for your help! >>>> > >>>> > Ivan >>>> > >>>> > >>>> > >>>> > >>>> > >>>> > >>>> > On Fri, Nov 16, 2018 at 1:25 AM Dave May >>>> wrote: >>>> > >>>> > > >>>> > > >>>> > > On Thu, 15 Nov 2018 at 17:44, Ivan via petsc-users < >>>> > > petsc-users at mcs.anl.gov> wrote: >>>> > > >>>> > >> Hi Stefano, >>>> > >> >>>> > >> In fact, yes, we look at the htop output (and the resulting >>>> computational >>>> > >> time ofc). >>>> > >> >>>> > >> In our code we use MUMPS, which indeed depends on blas / lapack. >>>> So I >>>> > >> think this might be it! >>>> > >> >>>> > >> I will definetely check it (I mean the difference between our >>>> MUMPS, >>>> > >> blas, lapack). >>>> > >> >>>> > >> If you have an idea of how we can verify on his PC that the source >>>> of his >>>> > >> parallelization does come from BLAS, please do not hesitate to >>>> tell me! >>>> > >> >>>> > > >>>> > > Option 1/ >>>> > > * Set this environment variable >>>> > > export OMP_NUM_THREADS=1 >>>> > > * Re-run your "parallel" test. >>>> > > * If the performance differs (job runs slower) compared with your >>>> previous >>>> > > run where you inferred parallelism was being employed, you can >>>> safely >>>> > > assume that the parallelism observed comes from threads >>>> > > >>>> > > Option 2/ >>>> > > * Re-configure PETSc to use a known BLAS implementation which does >>>> not >>>> > > support threads >>>> > > * Re-compile PETSc >>>> > > * Re-run your parallel test >>>> > > * If the performance differs (job runs slower) compared with your >>>> previous >>>> > > run where you inferred parallelism was being employed, you can >>>> safely >>>> > > assume that the parallelism observed comes from threads >>>> > > >>>> > > Option 3/ >>>> > > * Use a PC which does not depend on BLAS at all, >>>> > > e.g. -pc_type jacobi -pc_type bjacobi >>>> > > * If the performance differs (job runs slower) compared with your >>>> previous >>>> > > run where you inferred parallelism was being employed, you can >>>> safely >>>> > > assume that the parallelism observed comes from BLAS + threads >>>> > > >>>> > > >>>> > > >>>> > >> Thanks! 
>>>> > >> >>>> > >> Ivan >>>> > >> On 15/11/2018 18:24, Stefano Zampini wrote: >>>> > >> >>>> > >> If you say your program is parallel by just looking at the output >>>> from >>>> > >> the top command, you are probably linking against a multithreaded >>>> blas >>>> > >> library >>>> > >> >>>> > >> Il giorno Gio 15 Nov 2018, 20:09 Matthew Knepley via petsc-users < >>>> > >> petsc-users at mcs.anl.gov> ha scritto: >>>> > >> >>>> > >>> On Thu, Nov 15, 2018 at 11:59 AM Ivan Voznyuk < >>>> > >>> ivan.voznyuk.work at gmail.com> wrote: >>>> > >>> >>>> > >>>> Hi Matthew, >>>> > >>>> >>>> > >>>> Does it mean that by using just command python3 simple_code.py >>>> (without >>>> > >>>> mpiexec) you *cannot* obtain a parallel execution? >>>> > >>>> >>>> > >>> >>>> > >>> As I wrote before, its not impossible. You could be directly >>>> calling >>>> > >>> PMI, but I do not think you are doing that. >>>> > >>> >>>> > >>> >>>> > >>>> It s been 5 days we are trying to understand with my colleague >>>> how he >>>> > >>>> managed to do so. >>>> > >>>> It means that by using simply python3 simple_code.py he gets 8 >>>> > >>>> processors workiing. >>>> > >>>> By the way, we wrote in his code few lines: >>>> > >>>> rank = PETSc.COMM_WORLD.Get_rank() >>>> > >>>> size = PETSc.COMM_WORLD.Get_size() >>>> > >>>> and we got rank = 0, size = 1 >>>> > >>>> >>>> > >>> >>>> > >>> This is MPI telling you that you are only running on 1 processes. >>>> > >>> >>>> > >>> >>>> > >>>> However, we compilator arrives to KSP.solve(), somehow it turns >>>> on 8 >>>> > >>>> processors. >>>> > >>>> >>>> > >>> >>>> > >>> Why do you think its running on 8 processes? >>>> > >>> >>>> > >>> >>>> > >>>> This problem is solved on his PC in 5-8 sec (in parallel, using >>>> *python3 >>>> > >>>> simple_code.py*), on mine it takes 70-90 secs (in sequantial, >>>> but with >>>> > >>>> the same command *python3 simple_code.py*) >>>> > >>>> >>>> > >>> >>>> > >>> I think its much more likely that there are differences in the >>>> solver >>>> > >>> (use -ksp_view to see exactly what solver was used), then >>>> > >>> to think it is parallelism. Moreover, you would never ever ever >>>> see that >>>> > >>> much speedup on a laptop since all these computations >>>> > >>> are bandwidth limited. >>>> > >>> >>>> > >>> Thanks, >>>> > >>> >>>> > >>> Matt >>>> > >>> >>>> > >>> >>>> > >>>> So, conclusion is that on his computer this code works in the >>>> same way >>>> > >>>> as scipy: all the code is executed in sequantial mode, but when >>>> it comes to >>>> > >>>> solution of system of linear equations, it runs on all available >>>> > >>>> processors. All this with just running python3 my_code.py >>>> (without any >>>> > >>>> mpi-smth) >>>> > >>>> >>>> > >>>> Is it an exception / abnormal behavior? I mean, is it something >>>> > >>>> irregular that you, developers, have never seen? >>>> > >>>> >>>> > >>>> Thanks and have a good evening! >>>> > >>>> Ivan >>>> > >>>> >>>> > >>>> P.S. I don't think I know the answer regarding Scipy... >>>> > >>>> >>>> > >>>> >>>> > >>>> On Thu, Nov 15, 2018 at 2:39 PM Matthew Knepley < >>>> knepley at gmail.com> >>>> > >>>> wrote: >>>> > >>>> >>>> > >>>>> On Thu, Nov 15, 2018 at 8:07 AM Ivan Voznyuk < >>>> > >>>>> ivan.voznyuk.work at gmail.com> wrote: >>>> > >>>>> >>>> > >>>>>> Hi Matthew, >>>> > >>>>>> Thanks for your reply! >>>> > >>>>>> >>>> > >>>>>> Let me precise what I mean by defining few questions: >>>> > >>>>>> >>>> > >>>>>> 1. 
In order to obtain a parallel execution of simple_code.py, >>>> do I >>>> > >>>>>> need to go with mpiexec python3 simple_code.py, or I can just >>>> launch >>>> > >>>>>> python3 simple_code.py? >>>> > >>>>>> >>>> > >>>>> >>>> > >>>>> mpiexec -n 2 python3 simple_code.py >>>> > >>>>> >>>> > >>>>> >>>> > >>>>>> 2. This simple_code.py consists of 2 parts: a) preparation of >>>> matrix >>>> > >>>>>> b) solving the system of linear equations with PETSc. If I >>>> launch mpirun >>>> > >>>>>> (or mpiexec) -np 8 python3 simple_code.py, I suppose that I >>>> will basically >>>> > >>>>>> obtain 8 matrices and 8 systems to solve. However, I need to >>>> prepare only >>>> > >>>>>> one matrix, but launch this code in parallel on 8 processors. >>>> > >>>>>> >>>> > >>>>> >>>> > >>>>> When you create the Mat object, you give it a communicator (here >>>> > >>>>> PETSC_COMM_WORLD). That allows us to distribute the data. This >>>> is all >>>> > >>>>> covered extensively in the manual and the online tutorials, as >>>> well as the >>>> > >>>>> example code. >>>> > >>>>> >>>> > >>>>> >>>> > >>>>>> In fact, here attached you will find a similar code >>>> (scipy_code.py) >>>> > >>>>>> with only one difference: the system of linear equations is >>>> solved with >>>> > >>>>>> scipy. So when I solve it, I can clearly see that the solution >>>> is obtained >>>> > >>>>>> in a parallel way. However, I do not use the command mpirun >>>> (or mpiexec). I >>>> > >>>>>> just go with python3 scipy_code.py. >>>> > >>>>>> >>>> > >>>>> >>>> > >>>>> Why do you think its running in parallel? >>>> > >>>>> >>>> > >>>>> Thanks, >>>> > >>>>> >>>> > >>>>> Matt >>>> > >>>>> >>>> > >>>>> >>>> > >>>>>> In this case, the first part (creation of the sparse matrix) >>>> is not >>>> > >>>>>> parallel, whereas the solution of system is found in a >>>> parallel way. >>>> > >>>>>> So my question is, Do you think that it s possible to have the >>>> same >>>> > >>>>>> behavior with PETSC? And what do I need for this? >>>> > >>>>>> >>>> > >>>>>> I am asking this because for my colleague it worked! It means >>>> that he >>>> > >>>>>> launches the simple_code.py on his computer using the command >>>> python3 >>>> > >>>>>> simple_code.py (and not mpi-smth python3 simple_code.py) and >>>> he obtains a >>>> > >>>>>> parallel execution of the same code. >>>> > >>>>>> >>>> > >>>>>> Thanks for your help! >>>> > >>>>>> Ivan >>>> > >>>>>> >>>> > >>>>>> >>>> > >>>>>> On Thu, Nov 15, 2018 at 11:54 AM Matthew Knepley < >>>> knepley at gmail.com> >>>> > >>>>>> wrote: >>>> > >>>>>> >>>> > >>>>>>> On Thu, Nov 15, 2018 at 4:53 AM Ivan Voznyuk via petsc-users < >>>> > >>>>>>> petsc-users at mcs.anl.gov> wrote: >>>> > >>>>>>> >>>> > >>>>>>>> Dear PETSC community, >>>> > >>>>>>>> >>>> > >>>>>>>> I have a question regarding the parallel execution of >>>> petsc4py. >>>> > >>>>>>>> >>>> > >>>>>>>> I have a simple code (here attached simple_code.py) which >>>> solves a >>>> > >>>>>>>> system of linear equations Ax=b using petsc4py. To execute >>>> it, I use the >>>> > >>>>>>>> command python3 simple_code.py which yields a sequential >>>> performance. With >>>> > >>>>>>>> a colleague of my, we launched this code on his computer, >>>> and this time the >>>> > >>>>>>>> execution was in parallel. Although, he used the same >>>> command python3 >>>> > >>>>>>>> simple_code.py (without mpirun, neither mpiexec). >>>> > >>>>>>>> >>>> > >>>>>>> I am not sure what you mean. To run MPI programs in parallel, >>>> you >>>> > >>>>>>> need a launcher like mpiexec or mpirun. 
There are Python >>>> programs (like >>>> > >>>>>>> nemesis) that use the launcher API directly (called PMI), but >>>> that is not >>>> > >>>>>>> part of petsc4py. >>>> > >>>>>>> >>>> > >>>>>>> Thanks, >>>> > >>>>>>> >>>> > >>>>>>> Matt >>>> > >>>>>>> >>>> > >>>>>>>> My configuration: Ubuntu x86_64 Ubuntu 16.04, Intel Core i7, >>>> PETSc >>>> > >>>>>>>> 3.10.2, PETSC_ARCH=arch-linux2-c-debug, petsc4py 3.10.0 in >>>> virtualenv >>>> > >>>>>>>> >>>> > >>>>>>>> In order to parallelize it, I have already tried: >>>> > >>>>>>>> - use 2 different PCs >>>> > >>>>>>>> - use Ubuntu 16.04, 18.04 >>>> > >>>>>>>> - use different architectures (arch-linux2-c-debug, >>>> > >>>>>>>> linux-gnu-c-debug, etc) >>>> > >>>>>>>> - ofc use different configurations (my present config can be >>>> found >>>> > >>>>>>>> in make.log that I attached here) >>>> > >>>>>>>> - mpi from mpich, openmpi >>>> > >>>>>>>> >>>> > >>>>>>>> Nothing worked. >>>> > >>>>>>>> >>>> > >>>>>>>> Do you have any ideas? >>>> > >>>>>>>> >>>> > >>>>>>>> Thanks and have a good day, >>>> > >>>>>>>> Ivan >>>> > >>>>>>>> >>>> > >>>>>>>> -- >>>> > >>>>>>>> Ivan VOZNYUK >>>> > >>>>>>>> PhD in Computational Electromagnetics >>>> > >>>>>>>> >>>> > >>>>>>> >>>> > >>>>>>> >>>> > >>>>>>> -- >>>> > >>>>>>> What most experimenters take for granted before they begin >>>> their >>>> > >>>>>>> experiments is infinitely more interesting than any results >>>> to which their >>>> > >>>>>>> experiments lead. >>>> > >>>>>>> -- Norbert Wiener >>>> > >>>>>>> >>>> > >>>>>>> https://www.cse.buffalo.edu/~knepley/ >>>> > >>>>>>> >>>> > >>>>>>> >>>> > >>>>>> >>>> > >>>>>> >>>> > >>>>>> -- >>>> > >>>>>> Ivan VOZNYUK >>>> > >>>>>> PhD in Computational Electromagnetics >>>> > >>>>>> +33 (0)6.95.87.04.55 >>>> > >>>>>> My webpage >>>> > >>>>>> My LinkedIn >>>> > >>>>>> >>>> > >>>>> >>>> > >>>>> >>>> > >>>>> -- >>>> > >>>>> What most experimenters take for granted before they begin their >>>> > >>>>> experiments is infinitely more interesting than any results to >>>> which their >>>> > >>>>> experiments lead. >>>> > >>>>> -- Norbert Wiener >>>> > >>>>> >>>> > >>>>> https://www.cse.buffalo.edu/~knepley/ >>>> > >>>>> >>>> > >>>>> >>>> > >>>> >>>> > >>>> >>>> > >>>> -- >>>> > >>>> Ivan VOZNYUK >>>> > >>>> PhD in Computational Electromagnetics >>>> > >>>> +33 (0)6.95.87.04.55 >>>> > >>>> My webpage >>>> > >>>> My LinkedIn >>>> > >>>> >>>> > >>> >>>> > >>> >>>> > >>> -- >>>> > >>> What most experimenters take for granted before they begin their >>>> > >>> experiments is infinitely more interesting than any results to >>>> which their >>>> > >>> experiments lead. >>>> > >>> -- Norbert Wiener >>>> > >>> >>>> > >>> https://www.cse.buffalo.edu/~knepley/ >>>> > >>> >>>> > >>> >>>> > >> >>>> > >>>> > >>>> >>>> >>> >>> -- >>> Ivan VOZNYUK >>> PhD in Computational Electromagnetics >>> +33 (0)6.95.87.04.55 >>> My webpage >>> My LinkedIn >>> >> -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: 

From amartin at cimne.upc.edu  Mon Nov 19 07:40:20 2018
From: amartin at cimne.upc.edu (=?UTF-8?B?IkFsYmVydG8gRi4gTWFydMOtbiI=?=)
Date: Mon, 19 Nov 2018 14:40:20 +0100
Subject: Re: [petsc-users] PETSc (3.9.0) GAMG weak scaling test issue
In-Reply-To: References: <5BE30C87.10204@cimne.upc.edu> <5BE420E4.8020800@cimne.upc.edu> <5BE42729.3090307@cimne.upc.edu> <5BF295DC.30801@cimne.upc.edu>
Message-ID: <5BF2BD44.7010602@cimne.upc.edu>

Dear Matthew,

using either "-build_twosided allreduce" or "-build_twosided redscatter"
works around the Intel MPI internal error!

Besides, with Intel MPI, the weak scaling figures look like:

**preconditioner set up**
[0.8865120411, 2.598261833, 3.780511856]

**PCG stage**
[0.5701429844, 1.771002054, 2.292135954]

**number of PCG iterations**
[9,14,15]

which I now consider much more reasonable, in particular for the
preconditioner set-up stage, compared to the ones that I obtained with
OpenMPI 1.10.7 and "-build_twosided ibarrier", below.

Best regards,
Alberto.

On 19/11/18 12:26, Matthew Knepley wrote:
> On Mon, Nov 19, 2018 at 5:52 AM "Alberto F. Martín" wrote:
>
> Dear Mark, Dear Matthew,
>
> in order to discard load imbalance as the cause of the reported weak
> scaling issue in the GAMG preconditioner set-up stage (as you said, we
> were feeding GAMG with a suboptimal mesh distribution, having empty
> processors, among others), we simplified the weak scaling test by
> considering the standard body-fitted trilinear (Q1) FE discretization
> of the 3D Poisson problem on a unit cube discretized with a *uniform,
> structured hexahedral mesh, partitioned optimally (by hand) among
> processors*, with a fixed load of 30**3 hexahedra/core. Thus, all
> processors have the same load (up to strong Dirichlet boundary
> conditions on the subdomains touching the global boundary), and the
> edge-cut is minimum.
>
> We used the following GAMG preconditioner options:
>
> -pc_type gamg
> -pc_gamg_type agg
> -pc_gamg_est_ksp_type cg
> -mg_levels_esteig_ksp_type cg
> -mg_coarse_sub_pc_type cholesky
> -mg_coarse_sub_pc_factor_mat_ordering_type nd
> -pc_gamg_process_eq_limit 50
> -pc_gamg_square_graph 10
> -pc_gamg_agg_nsmooths 1
>
> The results that we obtained for 48 (4x4x3 subdomains), 10,368
> (24x24x18 subdomains), and 16,464 (28x28x21 subdomains) CPU cores are
> as follows:
>
> **preconditioner set up**
> [0.9844961860, *7.017674042*, *12.10154881*]
>
> **PCG stage**
> [0.5849160422, 1.515251888, 1.859617710]
>
> **number of PCG iterations**
> [9,14,15]
>
> As you can observe, *there is still a significant time increase when
> scaling the problem from 48 to 10K/16K MPI tasks for the
> preconditioner setup stage.* This time increase is not as significant
> for the PCG stage.
>
> Actually, it's almost perfect for PCG, given that
>
> 14/9 * 0.98 = 1.52 (Eff 100%)
> 15/14 * 1.52 = 1.63 (Eff 88%)
>
> Mark would have better comments on the scalability of the setup stage.
>
> Please find attached the combined output of -ksp_view and -log_view
> for these three points of the weak scaling curve.
>
> Given these results, I am starting to suspect that something within
> the underlying software + hardware stack might be responsible for
> this. I am using OpenMPI 1.10.7 + Intel compilers version 18.0. The
> underlying supercomputer is MN-IV at BSC
> (https://www.bsc.es/marenostrum/marenostrum/technical-information).
> Have you ever conducted a weak scaling test of GAMG with OpenMPI on a
> similar computer architecture?
>> Can you share your experience with us? (versions tested, outcome, etc.)
>>
>> We also tried an alternative MPI library, Intel(R) MPI Library for Linux* OS, Version 2018 Update 4 Build 20180823 (id: 18555), without success. For this MPI library, the preconditioner set-up stage crashes (find attached stack frames and internal MPI library errors) for the largest two core counts (it did not crash for the 48 CPU cores case), while it did not crash with OpenMPI 1.10.7. Have you ever experienced errors like the ones attached? Is there any way to set up PETSc such that the subroutine that crashes is replaced by an alternative implementation of the same concept? (this would be just a workaround)
>
> It certainly feels like an MPI (or driver) bug:
>
> libmpi.so.12      00007F460E3AEBC8  PMPIDI_CH3I_Progr  Unknown  Unknown
> libmpi.so.12      00007F460E72B90C  MPI_Testall        Unknown  Unknown
> libpetsc.so.3.9.0 00007F4607391FFE  PetscCommBuildTwo  Unknown  Unknown
>
> You can try another variant using
>
> -build_twosided
>
> I think ibarrier is currently the default if supported, but -help should tell you.
>
> Thanks,
>
> Matt

>> It might be a BUG in the Intel MPI library, although I cannot confirm it. We also got these errors with the unfitted FEM + space-filling curves version of our code.
>>
>> Thanks a lot for your help and valuable feedback!
>> Best regards,
>> Alberto.
>>
>> On 08/11/18 17:29, Mark Adams wrote:
>>>> I did not configure PETSc with ParMetis support. Should I?
>>>>
>>>> I figured it out when I tried to use "-pc_gamg_repartition". PETSc complained that it was not compiled with ParMetis support.
>>>
>>> You need ParMetis, or some parallel mesh partitioner, configured to use repartitioning. I would guess that "-pc_gamg_repartition" would not help and might hurt, because it just does the coarse grids, not the fine grid. But it is worth a try. Just configure with --download-parmetis
>>>
>>> The problem is that you are using space-filling curves on the background grid and are getting empty processors. Right? The mesh setup phase is not super optimized, but your times
>>>
>>> And you said in your attachment that you added the near null space, but just the constant vector. I trust you mean the three translational rigid body modes. That is the default and so you should not see any difference. If you added one vector of all 1s then that would be bad. You also want the rotational rigid body modes. Now, you are converging pretty well, and if your solution does not have much rotation in it then the rotational modes are not needed, but they are required for optimality in general.
--
Alberto F. Martín-Huertas
Senior Researcher, PhD. Computational Science
Centre Internacional de Mètodes Numèrics a l'Enginyeria (CIMNE)
Parc Mediterrani de la Tecnologia, UPC
Esteve Terradas 5, Building C3, Office 215,
08860 Castelldefels (Barcelona, Spain)
Tel.: (+34) 9341 34223
e-mail: amartin at cimne.upc.edu

FEMPAR project co-founder
web: http://www.fempar.org

From mfadams at lbl.gov Mon Nov 19 08:04:35 2018
From: mfadams at lbl.gov (Mark Adams)
Date: Mon, 19 Nov 2018 09:04:35 -0500
Subject: [petsc-users] PETSc (3.9.0) GAMG weak scaling test issue
References: <5BE30C87.10204@cimne.upc.edu> <5BE420E4.8020800@cimne.upc.edu> <5BE42729.3090307@cimne.upc.edu> <5BF295DC.30801@cimne.upc.edu>

> Mark would have better comments on the scalability of the setup stage.

The first thing to verify is that the algorithm is scaling. If you coarsen too slowly then the coarse grids get large, with many non-zeros per row, and the cost of the matrix triple product can explode. You can check this by running with -info, grepping for GAMG, and then looking for the line about grid "Complexity". It should be well below 1.5. With a new version of PETSc you can see this with -ksp_view. You can also look at the total number of flops in matrix-matrix kernels like MatPtAPNumeric below. That should increase slowly as you scale the problem size up.
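(For reference, this "Complexity" is presumably the usual AMG operator complexity; the thread does not define it explicitly. Under that assumption, with A_0 the fine-grid operator and A_\ell the coarse-grid operators, in LaTeX notation:

    \text{complexity} = \frac{\sum_{\ell=0}^{L} \operatorname{nnz}(A_\ell)}{\operatorname{nnz}(A_0)}

so a value well below 1.5 means that all coarse levels together add less than half of the fine-grid nonzeros.)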
Next, the setup has two basic types of processes: 1) custom graph processing and other kernels, and 2) matrix-matrix products (like the matrix triple product). (2) are normal, but complex, numerical kernels, and you can use standard performance debugging techniques to figure out where the time is going. There are normal numerical loops and MPI communication. (1) is made up of custom methods that do a lot of different types of processing, but parallel graph processing is a big chunk of this. These processes are very unstructured and it is hard to predict what performance to expect from them. You really just have to dig in and look for where the time is spent.

I have added timers to help with this:

PCGAMGGraph_AGG     12 1.0 3.3467e+00 1.0 4.58e+06 1.2 7.6e+06 6.7e+02 1.4e+02  5  0  2  1  6   5  0  2  1  6   22144
PCGAMGCoarse_AGG    12 1.0 8.9895e+00 1.0 1.35e+08 1.2 7.7e+07 6.2e+03 4.7e+02 14 10 16 51 18  14 10 16 51 18  233748
PCGAMGProl_AGG      12 1.0 3.9328e+00 1.0 0.00e+00 0.0 9.1e+06 1.5e+03 1.9e+02  6  0  2  1  7   6  0  2  1  7       0
PCGAMGPOpt_AGG      12 1.0 6.3192e+00 1.0 7.43e+07 1.1 3.9e+07 9.0e+02 5.0e+02  9  6  8  4 19   9  6  8  4 19  190048
GAMG: createProl    12 1.0 2.2585e+01 1.0 2.13e+08 1.2 1.3e+08 4.0e+03 1.3e+03 34 16 28 57 50  34 16 28 57 50  149493
  Graph             24 1.0 3.3388e+00 1.0 4.58e+06 1.2 7.6e+06 6.7e+02 1.4e+02  5  0  2  1  6   5  0  2  1  6   22196
  MIS/Agg           12 1.0 5.6411e-01 1.2 0.00e+00 0.0 4.7e+07 8.7e+02 2.3e+02  1  0 10  4  9   1  0 10  4  9       0
  SA: col data      12 1.0 1.2982e+00 1.1 0.00e+00 0.0 5.6e+06 2.0e+03 4.8e+01  2  0  1  1  2   2  0  1  1  2       0
  SA: frmProl0      12 1.0 1.6284e+00 1.0 0.00e+00 0.0 3.6e+06 5.5e+02 9.6e+01  2  0  1  0  4   2  0  1  0  4       0
  SA: smooth        12 1.0 4.1778e+00 1.0 5.28e+06 1.2 1.4e+07 7.3e+02 1.7e+02  6  0  3  1  7   6  0  3  1  7   20483
GAMG: partLevel     12 1.0 1.3577e+01 1.0 2.90e+07 1.2 2.3e+07 2.1e+03 6.4e+02 20  2  5  5 25  20  2  5  5 25   34470
  repartition        9 1.0 1.5048e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 5.4e+01  2  0  0  0  2   2  0  0  0  2       0
  Invert-Sort        9 1.0 1.2282e+00 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 3.6e+01  2  0  0  0  1   2  0  0  0  1       0
  Move A             9 1.0 2.8930e+00 1.0 0.00e+00 0.0 5.7e+05 8.4e+02 1.5e+02  4  0  0  0  6   4  0  0  0  6       0
  Move P             9 1.0 3.0317e+00 1.0 0.00e+00 0.0 9.4e+05 2.5e+01 1.5e+02  5  0  0  0  6   5  0  0  0  6       0

There is nothing that pops out here, but you could look at the scaling of the parts. There are no alternative implementations of this, so there is not much that can be done about it. I have not done a deep dive into the performance of this (1) stuff for a very long time. Note, most of this work is amortized because you do it just once for each mesh. So this cost will decrease relatively as you do more solves, like in a real production run.

The matrix-matrix (2) stuff is here:

MatMatMult          12 1.0 3.5538e+00 1.0 4.58e+06 1.2 1.4e+07 7.3e+02 1.5e+02  5  0  3  1  6   5  0  3  1  6   20853
MatMatMultSym       12 1.0 3.3264e+00 1.0 0.00e+00 0.0 1.2e+07 6.7e+02 1.4e+02  5  0  2  1  6   5  0  2  1  6       0
MatMatMultNum       12 1.0 9.7088e-02 1.1 4.58e+06 1.2 2.5e+06 1.0e+03 0.0e+00  0  0  1  0  0   0  0  1  0  0  763319
MatPtAP             12 1.0 4.0859e+00 1.0 2.90e+07 1.2 2.2e+07 2.2e+03 1.9e+02  6  2  5  5  7   6  2  5  5  7  114537
MatPtAPSymbolic     12 1.0 2.4298e+00 1.1 0.00e+00 0.0 1.4e+07 2.4e+03 8.4e+01  4  0  3  4  3   4  0  3  4  3       0
MatPtAPNumeric      12 1.0 1.7467e+00 1.1 2.90e+07 1.2 8.2e+06 1.8e+03 9.6e+01  3  2  2  2  4   3  2  2  2  4  267927
MatTrnMatMult       12 1.0 7.1406e+00 1.0 1.35e+08 1.2 1.6e+07 2.6e+04 1.9e+02 11 10  3 44  7  11 10  3 44  7  294270
MatTrnMatMultSym    12 1.0 5.6756e+00 1.0 0.00e+00 0.0 1.3e+07 1.6e+04 1.6e+02  9  0  3 23  6   9  0  3 23  6       0
MatTrnMatMultNum    12 1.0 1.5121e+00 1.0 1.35e+08 1.2 2.5e+06 7.9e+04 2.4e+01  2 10  1 21  1   2 10  1 21  1 1389611

Note, the symbolic ("Sym") times (like (1) above) are pretty large, but again they are amortized away and tend to be slow compared to nice numerical kernel loops.

Mark

From huq2090 at gmail.com Mon Nov 19 12:46:43 2018
From: huq2090 at gmail.com (Fazlul Huq)
Date: Mon, 19 Nov 2018 12:46:43 -0600
Subject: [petsc-users] Solving problem using multigrid

Thanks.

So, if I run the same code using -pc_type gamg, will it use algebraic multigrid to solve the problem? I have hypre configured with --download-hypre.
Again, if I want to use geometric multigrid, should I go with domain management (DM)? Now I am planning to go through examples with DM. Will that be the right approach to start?

Thanks again.
Sincerely,
Huq

On Sun, Nov 18, 2018 at 4:23 PM Mark Adams wrote:
> The manual is pretty clear on this, but your code is purely algebraic, that is, you do not use a DM, and so you will need to use algebraic multigrid (AMG). So you want to look at AMG preconditioners (-pc_type gamg [or hypre if you configured with --download-hypre]).
>
> Mark
>
> On Sun, Nov 18, 2018 at 1:28 PM Fazlul Huq via petsc-users wrote:
>> Hello PETSc developers,
>>
>> I have solved a problem using ksp. Please find the problem in attachment 1 and the PETSc code for the solution in attachment 2. Now I need to solve the same problem using multigrid. I have never solved a problem using multigrid. Can you please guide me about the steps that I should follow to solve this problem using multigrid?
>>
>> Thanks.
>> Sincerely,
>> Huq

From knepley at gmail.com Mon Nov 19 14:13:21 2018
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 19 Nov 2018 15:13:21 -0500
Subject: [petsc-users] Solving problem using multigrid

On Mon, Nov 19, 2018 at 1:48 PM Fazlul Huq via petsc-users wrote:
> Thanks.
> So, if I run the same code using -pc_type gamg, will it use algebraic multigrid to solve the problem?

Yes.

> I have hypre configured with --download-hypre.

Then you can use -pc_type hypre -pc_hypre_type boomeramg

> Again, if I want to use geometric multigrid, should I go with domain management (DM)?

Yes, or you can manage all the grids yourself and talk to PCMG directly.

> Now I am planning to go through examples with DM. Will that be the right approach to start?

Possibly. Note that GMG is much more involved than AMG.

Thanks,

Matt
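(To make the runtime switch above concrete, here is a minimal sketch, not the poster's actual code; it assumes a Mat A and Vecs b, x have already been assembled. Because the solver is configured through KSPSetFromOptions(), the preconditioner is picked on the command line, e.g. -pc_type gamg or -pc_type hypre -pc_hypre_type boomeramg:)

    KSP            ksp;
    PetscErrorCode ierr;

    ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr); /* reads -ksp_type, -pc_type, -pc_hypre_type, ... */
    ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
    ierr = KSPDestroy(&ksp);CHKERRQ(ierr);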
From t.appel17 at imperial.ac.uk Mon Nov 19 15:26:37 2018
From: t.appel17 at imperial.ac.uk (Appel, Thibaut)
Date: Mon, 19 Nov 2018 21:26:37 +0000
Subject: [petsc-users] On unknown ordering
References: <3BDDA21B-2493-4706-8F25-738E83C8FF8C@imperial.ac.uk>
Message-ID: <4C2DFEA2-E332-4DFB-996F-9C985062477A@ic.ac.uk>

Hi Barry,

> Le 15 nov. 2018 à 18:16, Smith, Barry F. a écrit :
>
>> On Nov 15, 2018, at 4:48 AM, Appel, Thibaut via petsc-users wrote:
>>
>> Good morning,
>>
>> I would like to ask about the importance of the initial choice of ordering the unknowns when feeding a matrix to PETSc.
>>
>> I have a regular grid, using high-order finite differences, and I simply divide the rows of the matrix with PetscSplitOwnership using vertex-major, natural ordering for the parallelism (not using DMDA).
>
> So each process is getting a slice of the domain? To minimize communication it is best to use "square-ish" subdomains instead of slices; this is why the DMDA tries to use "square-ish" subdomains. I don't know the relationship between convergence rate and the shapes of the subdomains; it will depend on the operator and possibly "flow direction" etc.

Yes, absolutely, that's what it is. The MPI ownership follows the rows of the matrix ordered w.r.t. something like

row = idof + i*ndof + j*nx*ndof

I'm implementing a DMDA interface and will see the effect.

>> My understanding is that when using LU-MUMPS, this does not matter because either serial or parallel analysis is performed and all the rows are reordered "optimally" before the LU factorization. Quality of reordering might suffer from parallel analysis.
>>
>> But if I use the default block Jacobi with ILU with one block per processor, the initial ordering seems to have an influence because some tightly coupled degrees of freedom might lie on different processes and the ILU becomes less powerful. You can change the ordering on each block but this won't necessarily make things better.
>>
>> Are my observations accurate? Is there a recommended ordering type for a block Jacobi approach in my case? Could I expect natural improvements in fill-in or better GMRES robustness opting for the parallelism offered by DMDA?
>
> You might consider using -pc_type asm (additive Schwarz method) instead of block Jacobi. This "reintroduces" some of the tight coupling that is discarded when slicing up the domain for block Jacobi.
>
> Barry

>> Thank you,
>>
>> Thibaut

Thibaut
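(As an aside, a minimal sketch of the vertex-major natural ordering described above; this is a hypothetical helper, not code from the thread. For an nx-wide grid with ndof unknowns per vertex, the dof index varies fastest, then i, then j:)

    /* hypothetical helper: global row of unknown idof at vertex (i,j) */
    PetscInt natural_row(PetscInt idof, PetscInt i, PetscInt j, PetscInt ndof, PetscInt nx)
    {
      return idof + ndof*(i + nx*j); /* == idof + i*ndof + j*nx*ndof */
    }

(Consecutive rows then keep all unknowns of a vertex together, and a contiguous row split such as the one PetscSplitOwnership produces hands each process a slab of j-planes, i.e. the "slices" Barry refers to.)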
From sajidsyed2021 at u.northwestern.edu Mon Nov 19 15:30:01 2018
From: sajidsyed2021 at u.northwestern.edu (Sajid Ali)
Date: Mon, 19 Nov 2018 15:30:01 -0600
Subject: [petsc-users] Question about DMDAGetElements
In-Reply-To: <8D3D22BC-EBBC-4BFD-BA74-EB5A78D35027@anl.gov>
References: <87in11inru.fsf@jedbrown.org> <87sh04heqm.fsf@jedbrown.org> <8D3D22BC-EBBC-4BFD-BA74-EB5A78D35027@anl.gov>

I think what confused me was the fact that using DMCreateMatrix(da,&A) created a 12x12 matrix with 144 elements, but summing up nel*nen from each rank gives only 2*2+3*2+3*2+3*2=20 elements. So this means that DMDAGetElements returns the elements for the vector created on the mesh, and these happen to be used for indexing the matrix (created using the same DM object) via the local indices.

Luckily, since this code just creates a tridiagonal matrix, this works here (since if I wanted to create a dense matrix I'd want to have access to all the indices, which would happen if the addressable indices at each rank (for the submatrix stored locally) added up to the size of the total matrix).

Is my understanding correct?

Thanks for the help!

--
Sajid Ali
Applied Physics
Northwestern University

From bsmith at mcs.anl.gov Mon Nov 19 15:38:12 2018
From: bsmith at mcs.anl.gov (Smith, Barry F.)
Date: Mon, 19 Nov 2018 21:38:12 +0000
Subject: [petsc-users] Question about DMDAGetElements
References: <87in11inru.fsf@jedbrown.org> <87sh04heqm.fsf@jedbrown.org> <8D3D22BC-EBBC-4BFD-BA74-EB5A78D35027@anl.gov>

> On Nov 19, 2018, at 3:30 PM, Sajid Ali wrote:
>
> I think what confused me was the fact that using DMCreateMatrix(da,&A) created a 12x12 matrix with 144 elements, but summing up nel*nen from each rank gives only 2*2+3*2+3*2+3*2=20 elements.

I don't understand. This code would not work for a dense matrix where each process can set all the values via a local index. It only works when each process sets values for its local and ghost points (where the ghost points are defined by the stencil). DMDA is not intended for problems with complete coupling of unknowns (that is, dense matrices). It is meant for matrices arising from finite differences or finite elements.

Barry

From sajidsyed2021 at u.northwestern.edu Mon Nov 19 15:43:33 2018
From: sajidsyed2021 at u.northwestern.edu (Sajid Ali)
Date: Mon, 19 Nov 2018 15:43:33 -0600
Subject: [petsc-users] Question about DMDAGetElements

So, DMDA is used for sparse matrices arising from FD/FE, and MatCreateMPIAIJ can be used for dense matrices (though it is strongly discouraged).
My confusion stemmed from DMDAGetElements giving the element indices for the 1D mesh/vector of size N (when DMDACreate1d is used). But these indices were then used as local indices to set the values for a 2D matrix (via MatSetValuesLocal) created using the same DM (now we have NxN elements).

From mfadams at lbl.gov Mon Nov 19 15:57:25 2018
From: mfadams at lbl.gov (Mark Adams)
Date: Mon, 19 Nov 2018 16:57:25 -0500
Subject: [petsc-users] Question about DMDAGetElements

You seem to be confusing the degree of the mesh and the "degree" of the matrix and vector. A matrix is always N x M (2D if you like), a vector is always N (or 1 x N, or 1D if you like). The mesh in a DM or DA can be 1, 2 or 3D.

On Mon, Nov 19, 2018 at 4:44 PM Sajid Ali via petsc-users wrote:
> So, DMDA is used for sparse matrices arising from FD/FE, and MatCreateMPIAIJ can be used for dense matrices (though it is strongly discouraged).

From sajidsyed2021 at u.northwestern.edu Mon Nov 19 16:15:49 2018
From: sajidsyed2021 at u.northwestern.edu (Sajid Ali)
Date: Mon, 19 Nov 2018 16:15:49 -0600
Subject: [petsc-users] Question about DMDAGetElements

Bingo!

So, DMDAGetElements gives the indices of the mesh, right?

Thank you!
> > On Mon, Nov 19, 2018 at 5:16 PM Sajid Ali < > sajidsyed2021 at u.northwestern.edu> wrote: > >> Bingo! >> >> So, DMDAGetElements gives the indices of the mesh, right ? >> >> >> Thank you ! >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From k_burkart at yahoo.com Tue Nov 20 06:12:08 2018 From: k_burkart at yahoo.com (Klaus Burkart) Date: Tue, 20 Nov 2018 12:12:08 +0000 (UTC) Subject: [petsc-users] Can't initialize petsc from within application code References: <983177586.161893.1542715928403.ref@mail.yahoo.com> Message-ID: <983177586.161893.1542715928403@mail.yahoo.com> Hello, I am trying to initialize petsc from within application code but I keep getting the following error. symbol lookup error: /home/klaus/OpenFOAM/klaus-5.0/platforms/linux64GccDPInt32Opt/lib/libmyPCG2.so: undefined symbol: PetscInitialize The code is just: ??? PetscInitialize(0,0,PETSC_NULL,PETSC_NULL); ??? //PetscPrintf(PETSC_COMM_WORLD,"Hello World\n"); ??? PetscFinalize(); includes are: #include #include There are no compile errors. .bashrc petsc related content is: export PETSC_DIR=/home/klaus/OpenFOAM/ThirdParty-5.0/petsc-3.9.3 export PETSC_ARCH=/arch-linux2-c-debug export PETSC_LIBDIR=$PETSC_DIR/arch-linux2-c-debug/lib I am also linking: ??? -I$(PETSC_DIR)/include \ ??? -I$(PETSC_DIR)/arch-linux2-c-debug/include and ??? -L$(PETSC_LIBDIR) -lpetsc I assume something else needs to be linked but can't work out what's missing. Any idea? Klaus -------------- next part -------------- An HTML attachment was scrubbed... URL: From stefano.zampini at gmail.com Tue Nov 20 06:20:09 2018 From: stefano.zampini at gmail.com (Stefano Zampini) Date: Tue, 20 Nov 2018 15:20:09 +0300 Subject: [petsc-users] Can't initialize petsc from within application code In-Reply-To: <983177586.161893.1542715928403@mail.yahoo.com> References: <983177586.161893.1542715928403.ref@mail.yahoo.com> <983177586.161893.1542715928403@mail.yahoo.com> Message-ID: Depending on your preferences, you may want to use rpath https://en.wikipedia.org/wiki/Rpath to link libpetsc.so, or add the location of the PETSc libraries (i.e. where the make variable PETSC_LIBDIR points to) to LD_LIBRARY_FLAG. I would suggest the first Il giorno mar 20 nov 2018 alle ore 15:12 Klaus Burkart via petsc-users < petsc-users at mcs.anl.gov> ha scritto: > Hello, > > I am trying to initialize petsc from within application code but I keep > getting the following error. > > symbol lookup error: > /home/klaus/OpenFOAM/klaus-5.0/platforms/linux64GccDPInt32Opt/lib/libmyPCG2.so: > undefined symbol: PetscInitialize > > The code is just: > > PetscInitialize(0,0,PETSC_NULL,PETSC_NULL); > //PetscPrintf(PETSC_COMM_WORLD,"Hello World\n"); > PetscFinalize(); > > includes are: > > #include > #include > > There are no compile errors. > > .bashrc petsc related content is: > export PETSC_DIR=/home/klaus/OpenFOAM/ThirdParty-5.0/petsc-3.9.3 > export PETSC_ARCH=/arch-linux2-c-debug > export PETSC_LIBDIR=$PETSC_DIR/arch-linux2-c-debug/lib > > > I am also linking: > -I$(PETSC_DIR)/include \ > -I$(PETSC_DIR)/arch-linux2-c-debug/include > and > -L$(PETSC_LIBDIR) -lpetsc > > I assume something else needs to be linked but can't work out what's > missing. > > Any idea? > > > Klaus > -- Stefano -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From jcrean01 at gmail.com Tue Nov 20 10:40:45 2018 From: jcrean01 at gmail.com (Jared Crean) Date: Tue, 20 Nov 2018 11:40:45 -0500 Subject: [petsc-users] Mat Off Proc Entries Message-ID: <3c0e824c-ac85-c601-1daf-35591cae781c@gmail.com> Hello, ? What is the difference between MAG_IGNORE_OFF_PROC_ENTRIES and MAT_NO_OFF_PROC_ENTRIES?? The docs describe the first one as dropping off proc entries, and the second one talks about avoiding reductions in MatAssembly routines.? If I have a matrix where each process only assembles into the local part, should I set one or both of the flags? ??? Jared Crean From knepley at gmail.com Tue Nov 20 10:53:20 2018 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 20 Nov 2018 11:53:20 -0500 Subject: [petsc-users] Mat Off Proc Entries In-Reply-To: <3c0e824c-ac85-c601-1daf-35591cae781c@gmail.com> References: <3c0e824c-ac85-c601-1daf-35591cae781c@gmail.com> Message-ID: On Tue, Nov 20, 2018 at 11:41 AM Jared Crean via petsc-users < petsc-users at mcs.anl.gov> wrote: > Hello, > > What is the difference between MAG_IGNORE_OFF_PROC_ENTRIES and > MAT_NO_OFF_PROC_ENTRIES? The docs describe the first one as dropping > off proc entries, and the second one talks about avoiding reductions in > MatAssembly routines. If I have a matrix where each process only > assembles into the local part, should I set one or both of the flags? > I believe they are operating in different places. IGNORE_OFF_PROC_ENTRIES is used in MatSetValues(), where any values not owned by this process is ignored instead of put in the stash. However, MatAssemblyBegin() still does a reduction even if stashes are empty. NO_OFF_PROC_ENTRIES tells AssemblyBegin() to skip the reduction and do no communication at all. If you have a purely local matrix, then both make sense. Thanks, Matt > Jared Crean -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From avent at brasserieardennaise.be Tue Nov 20 16:42:16 2018 From: avent at brasserieardennaise.be (Nina) Date: Tue, 20 Nov 2018 14:42:16 -0800 Subject: [petsc-users] I want to experiment in bed with you. 
From jcrean01 at gmail.com Tue Nov 20 10:40:45 2018
From: jcrean01 at gmail.com (Jared Crean)
Date: Tue, 20 Nov 2018 11:40:45 -0500
Subject: [petsc-users] Mat Off Proc Entries
Message-ID: <3c0e824c-ac85-c601-1daf-35591cae781c@gmail.com>

Hello,

What is the difference between MAT_IGNORE_OFF_PROC_ENTRIES and MAT_NO_OFF_PROC_ENTRIES? The docs describe the first one as dropping off-process entries, and the second one talks about avoiding reductions in the MatAssembly routines. If I have a matrix where each process only assembles into the local part, should I set one or both of the flags?

Jared Crean

From knepley at gmail.com Tue Nov 20 10:53:20 2018
From: knepley at gmail.com (Matthew Knepley)
Date: Tue, 20 Nov 2018 11:53:20 -0500
Subject: [petsc-users] Mat Off Proc Entries
In-Reply-To: <3c0e824c-ac85-c601-1daf-35591cae781c@gmail.com>
References: <3c0e824c-ac85-c601-1daf-35591cae781c@gmail.com>

On Tue, Nov 20, 2018 at 11:41 AM Jared Crean via petsc-users wrote:
> What is the difference between MAT_IGNORE_OFF_PROC_ENTRIES and MAT_NO_OFF_PROC_ENTRIES? If I have a matrix where each process only assembles into the local part, should I set one or both of the flags?

I believe they are operating in different places. IGNORE_OFF_PROC_ENTRIES is used in MatSetValues(), where any value not owned by this process is ignored instead of put in the stash. However, MatAssemblyBegin() still does a reduction even if the stashes are empty. NO_OFF_PROC_ENTRIES tells MatAssemblyBegin() to skip the reduction and do no communication at all. If you have a purely local matrix, then both make sense.

Thanks,

Matt
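(A minimal sketch of how the two options combine in the purely local case; hypothetical snippet, assuming a Mat A has already been created and only locally owned rows are set:)

    MatSetOption(A, MAT_IGNORE_OFF_PROC_ENTRIES, PETSC_TRUE); /* MatSetValues() drops any off-process entry instead of stashing it */
    MatSetOption(A, MAT_NO_OFF_PROC_ENTRIES, PETSC_TRUE);     /* MatAssemblyBegin() skips the reduction and communicates nothing */
    /* ... MatSetValues() on locally owned rows only ... */
    MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
    MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);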
eginoswz http://bekuv.info/?bekuv bekuv http://bglno.info/?bglno bglno http://hkoptxz.com/?hkoptxz hkoptxz https://aikqz.biz/?aikqz aikqz https://oqvxy.net/?oqvxy oqvxy http://kruwy.net/?kruwy kruwy http://fhlqty.net/?fhlqty fhlqty http://chpuvyz.ru/?chpuvyz chpuvyz https://aciloqt.ua/?aciloqt aciloqt http://cklsw.info/?cklsw cklsw https://fghkmvy.info/?fghkmvy fghkmvy http://belmpvz.biz/?belmpvz belmpvz http://eimpxz.ru/?eimpxz eimpxz http://cdjstuz.info/?cdjstuz cdjstuz https://adivy.com/?adivy adivy https://alnrty.ru/?alnrty alnrty https://hkmntz.info/?hkmntz hkmntz https://dkpqr.com/?dkpqr dkpqr https://bfntv.ru/?bfntv bfntv https://inuvwyz.ru/?inuvwyz inuvwyz https://jkmrvyz.ua/?jkmrvyz jkmrvyz http://bfipruw.biz/?bfipruw bfipruw http://cjklos.ua/?cjklos cjklos http://afglru.ru/?afglru afglru http://defrt.ua/?defrt defrt https://afghsvy.ua/?afghsvy afghsvy http://cehiou.info/?cehiou cehiou https://mnrvz.com/?mnrvz mnrvz http://fgtuxz.biz/?fgtuxz fgtuxz http://egiqtuy.ua/?egiqtuy egiqtuy https://dilnv.ru/?dilnv dilnv http://abeny.biz/?abeny abeny http://aijvxz.biz/?aijvxz aijvxz https://abdhrsx.net/?abdhrsx abdhrsx http://bfgmrxy.ru/?bfgmrxy bfgmrxy https://fjkow.info/?fjkow fjkow https://fkruvx.biz/?fkruvx fkruvx http://efhjsw.biz/?efhjsw efhjsw http://djkmops.info/?djkmops djkmops https://cequyz.ru/?cequyz cequyz https://hmswz.com/?hmswz hmswz http://lnotu.info/?lnotu lnotu http://bdjqu.info/?bdjqu bdjqu http://adejrxy.ua/?adejrxy adejrxy http://cdgikmps.ru/?cdgikmps cdgikmps http://dhmostwy.net/?dhmostwy dhmostwy https://cdhlquz.ua/?cdhlquz cdhlquz https://bilnor.ua/?bilnor bilnor https://fnqxz.com/?fnqxz fnqxz https://bijnuwz.com/?bijnuwz bijnuwz http://abcgikwy.ua/?abcgikwy abcgikwy https://begimrv.ua/?begimrv begimrv https://jlprv.net/?jlprv jlprv http://efhsz.ua/?efhsz efhsz https://dgiqs.ua/?dgiqs dgiqs http://bgiprx.net/?bgiprx bgiprx https://cdklmqv.biz/?cdklmqv cdklmqv https://adgnowz.ua/?adgnowz adgnowz http://cfjnorty.ru/?cfjnorty cfjnorty https://bcfhpqvy.ru/?bcfhpqvy bcfhpqvy http://beiqv.biz/?beiqv beiqv https://bjmnu.com/?bjmnu bjmnu https://fgjoquvz.ru/?fgjoquvz fgjoquvz http://fkpwx.info/?fkpwx fkpwx http://dfghty.com/?dfghty dfghty https://dhlmtvxy.biz/?dhlmtvxy dhlmtvxy https://bejuy.com/?bejuy bejuy http://cdikry.com/?cdikry cdikry https://bfhkrty.info/?bfhkrty bfhkrty https://cekpst.com/?cekpst cekpst https://ceknpx.ru/?ceknpx ceknpx https://gjpwy.com/?gjpwy gjpwy https://ceouwxyz.ru/?ceouwxyz ceouwxyz https://ijkopstu.ua/?ijkopstu ijkopstu https://acegjlmo.net/?acegjlmo acegjlmo http://bflnpv.info/?bflnpv bflnpv http://fijkotz.ua/?fijkotz fijkotz http://ajkptwyz.com/?ajkptwyz ajkptwyz http://adjpvw.net/?adjpvw adjpvw https://dpqrsv.biz/?dpqrsv dpqrsv https://acehklps.ua/?acehklps acehklps http://aghsuvyz.biz/?aghsuvyz aghsuvyz https://akotu.ru/?akotu akotu https://dkqrt.biz/?dkqrt dkqrt http://dlrtuxz.info/?dlrtuxz dlrtuxz http://fstvx.ua/?fstvx fstvx http://ejmsv.ua/?ejmsv ejmsv http://abfmyz.com/?abfmyz abfmyz http://cdiqrsvx.info/?cdiqrsvx cdiqrsvx https://fqsuz.info/?fqsuz fqsuz https://cilsu.info/?cilsu cilsu https://bjklnsxz.net/?bjklnsxz bjklnsxz https://demnpst.biz/?demnpst demnpst https://bghikms.biz/?bghikms bghikms http://achjtu.biz/?achjtu achjtu http://npqrwz.net/?npqrwz npqrwz http://adehilpq.net/?adehilpq adehilpq http://bnortwy.com/?bnortwy bnortwy http://cegpsuz.ru/?cegpsuz cegpsuz http://aksyz.ua/?aksyz aksyz http://egkmosux.biz/?egkmosux egkmosux https://dimoruy.com/?dimoruy dimoruy https://ehopyz.net/?ehopyz ehopyz 
https://ehijptuy.com/?ehijptuy ehijptuy http://jklmprtv.ua/?jklmprtv jklmprtv http://abmnrw.ru/?abmnrw abmnrw http://abcfqu.ru/?abcfqu abcfqu https://degijyz.ru/?degijyz degijyz https://cikortuy.net/?cikortuy cikortuy http://abgnrw.ua/?abgnrw abgnrw http://bfgknpqt.com/?bfgknpqt bfgknpqt https://afmqtv.ru/?afmqtv afmqtv https://bertuvwz.info/?bertuvwz bertuvwz https://eipuvwxy.com/?eipuvwxy eipuvwxy http://ahjory.info/?ahjory ahjory http://acefvx.com/?acefvx acefvx http://acdlo.com/?acdlo acdlo http://cdjqs.biz/?cdjqs cdjqs http://bdexy.net/?bdexy bdexy http://bktxz.info/?bktxz bktxz http://adegmovy.biz/?adegmovy adegmovy http://fkxyz.ua/?fkxyz fkxyz http://bdikny.biz/?bdikny bdikny https://cistv.biz/?cistv cistv https://behknqtz.biz/?behknqtz behknqtz http://efhjquwx.biz/?efhjquwx efhjquwx https://djmnosz.com/?djmnosz djmnosz http://adino.com/?adino adino http://djlmouv.ua/?djlmouv djlmouv http://dehrvy.biz/?dehrvy dehrvy http://cefijlu.com/?cefijlu cefijlu http://bfgitz.ua/?bfgitz bfgitz http://ilmqsz.ua/?ilmqsz ilmqsz http://aejnrwy.ru/?aejnrwy aejnrwy https://lnuwz.ru/?lnuwz lnuwz http://jlnrv.com/?jlnrv jlnrv https://bjstxz.ru/?bjstxz bjstxz https://fhjnw.com/?fhjnw fhjnw https://ajmnuz.com/?ajmnuz ajmnuz https://ghptxy.net/?ghptxy ghptxy https://aceirsxy.ua/?aceirsxy aceirsxy http://bcemosuw.info/?bcemosuw bcemosuw https://cgklpxy.ua/?cgklpxy cgklpxy https://deghijrs.net/?deghijrs deghijrs http://cgjknvz.ru/?cgjknvz cgjknvz https://bcikmswz.net/?bcikmswz bcikmswz https://cginosuy.com/?cginosuy cginosuy http://bestz.ua/?bestz bestz https://dkmpry.net/?dkmpry dkmpry https://bfmpyz.ru/?bfmpyz bfmpyz https://dfglmp.ru/?dfglmp dfglmp https://cfgmnpru.ua/?cfgmnpru cfgmnpru https://cefhlns.ru/?cefhlns cefhlns http://bcelnosz.ua/?bcelnosz bcelnosz https://cehnsy.net/?cehnsy cehnsy https://ipuxyz.info/?ipuxyz ipuxyz https://dfghintv.net/?dfghintv dfghintv https://abdhkuxy.com/?abdhkuxy abdhkuxy https://abrsu.ua/?abrsu abrsu http://fimrstw.ru/?fimrstw fimrstw https://bhkmpvz.ua/?bhkmpvz bhkmpvz http://cefkqrx.info/?cefkqrx cefkqrx http://bdjkstvz.biz/?bdjkstvz bdjkstvz https://abegks.ua/?abegks abegks https://cfhostz.ru/?cfhostz cfhostz http://jnqsv.ua/?jnqsv jnqsv http://abcfjmps.net/?abcfjmps abcfjmps https://jlnptu.com/?jlnptu jlnptu https://bdghklns.ua/?bdghklns bdghklns https://bdikpu.ua/?bdikpu bdikpu http://hklpqs.ru/?hklpqs hklpqs https://bgjmpsy.ua/?bgjmpsy bgjmpsy http://cemnsvz.net/?cemnsvz cemnsvz http://cfhknu.info/?cfhknu cfhknu http://bcjmqsuz.ru/?bcjmqsuz bcjmqsuz http://fghjloqu.com/?fghjloqu fghjloqu https://lmryz.net/?lmryz lmryz http://imvwyz.net/?imvwyz imvwyz http://dhklntw.com/?dhklntw dhklntw http://adijnouy.info/?adijnouy adijnouy https://cdervx.info/?cdervx cdervx https://ajkqrtuw.ru/?ajkqrtuw ajkqrtuw http://diknqtu.biz/?diknqtu diknqtu http://hjnrux.ru/?hjnrux hjnrux https://ilmouxy.info/?ilmouxy ilmouxy http://adfijm.ua/?adfijm adfijm http://efgptx.ru/?efgptx efgptx https://dmprtvwy.com/?dmprtvwy dmprtvwy http://bgksuv.ua/?bgksuv bgksuv https://cfhsv.com/?cfhsv cfhsv http://adgoqr.biz/?adgoqr adgoqr http://efilmx.net/?efilmx efilmx http://eglpqrx.ua/?eglpqrx eglpqrx http://acequxz.com/?acequxz acequxz http://lmsvz.ua/?lmsvz lmsvz http://aefhprtw.com/?aefhprtw aefhprtw https://bglnq.info/?bglnq bglnq http://befhirv.ua/?befhirv befhirv https://achkryz.com/?achkryz achkryz https://bmpsvy.info/?bmpsvy bmpsvy http://ejloqy.net/?ejloqy ejloqy https://biopqz.net/?biopqz biopqz https://bfhklotx.biz/?bfhklotx bfhklotx https://dgjkqwy.ua/?dgjkqwy dgjkqwy 
https://afnpqrxz.net/?afnpqrxz afnpqrxz https://hjnpqrxz.com/?hjnpqrxz hjnpqrxz https://cimpuyz.net/?cimpuyz cimpuyz http://cruwz.biz/?cruwz cruwz https://bknovz.net/?bknovz bknovz http://dhilrtv.ua/?dhilrtv dhilrtv http://chjpqvyz.info/?chjpqvyz chjpqvyz https://achijmr.com/?achijmr achijmr https://chknosty.ru/?chknosty chknosty https://acgqv.com/?acgqv acgqv http://defklosy.info/?defklosy defklosy http://kmqrw.com/?kmqrw kmqrw https://npqtvz.ru/?npqtvz npqtvz http://cefkrux.ua/?cefkrux cefkrux https://aefhivx.ua/?aefhivx aefhivx https://dijlrtuy.ua/?dijlrtuy dijlrtuy https://aoqtw.com/?aoqtw aoqtw https://cehisuwz.net/?cehisuwz cehisuwz http://bfghijmr.biz/?bfghijmr bfghijmr http://beimquxy.ua/?beimquxy beimquxy http://dmsuw.info/?dmsuw dmsuw http://imnty.net/?imnty imnty https://detwz.net/?detwz detwz http://glnvx.net/?glnvx glnvx https://himnopw.ru/?himnopw himnopw https://aklnw.com/?aklnw aklnw http://bjnrwx.net/?bjnrwx bjnrwx http://defjlqvy.com/?defjlqvy defjlqvy http://dfimopvz.ua/?dfimopvz dfimopvz http://hmoquv.ru/?hmoquv hmoquv http://afkmq.net/?afkmq afkmq https://blnrwyz.info/?blnrwyz blnrwyz http://fklnrtvx.info/?fklnrtvx fklnrtvx https://fghnruz.biz/?fghnruz fghnruz https://bijqsu.ru/?bijqsu bijqsu http://ghoruxyz.ru/?ghoruxyz ghoruxyz http://acejknry.ru/?acejknry acejknry http://cehquvxy.net/?cehquvxy cehquvxy https://bfgkmoqs.com/?bfgkmoqs bfgkmoqs https://bchjqx.net/?bchjqx bchjqx http://bmoqux.biz/?bmoqux bmoqux https://aijklm.com/?aijklm aijklm https://clmptx.info/?clmptx clmptx https://dghmpqtz.net/?dghmpqtz dghmpqtz https://bfkrt.net/?bfkrt bfkrt http://egkqrtuw.biz/?egkqrtuw egkqrtuw http://hjlstvx.com/?hjlstvx hjlstvx http://jkoprv.com/?jkoprv jkoprv https://chnpstv.com/?chnpstv chnpstv https://eknptuw.ru/?eknptuw eknptuw https://jkmrst.net/?jkmrst jkmrst http://bfmnqrvz.net/?bfmnqrvz bfmnqrvz http://bcdhnrwy.ua/?bcdhnrwy bcdhnrwy https://bcdfgkty.ru/?bcdfgkty bcdfgkty http://cmpqv.info/?cmpqv cmpqv https://dhilsz.ua/?dhilsz dhilsz https://acekpsy.ru/?acekpsy acekpsy http://gmnpswx.biz/?gmnpswx gmnpswx https://cdinpr.ua/?cdinpr cdinpr http://cdhjnorz.info/?cdhjnorz cdhjnorz https://behjk.com/?behjk behjk https://lmpstv.com/?lmpstv lmpstv https://ilmory.ua/?ilmory ilmory https://abiortz.ua/?abiortz abiortz http://bceotz.biz/?bceotz bceotz https://ghjlqt.info/?ghjlqt ghjlqt https://hmosyz.ua/?hmosyz hmosyz https://behmoqru.net/?behmoqru behmoqru https://chinpx.info/?chinpx chinpx https://fkntuvyz.info/?fkntuvyz fkntuvyz http://cfghimtw.ua/?cfghimtw cfghimtw https://fqstwz.com/?fqstwz fqstwz http://gimvw.biz/?gimvw gimvw https://degixy.ua/?degixy degixy http://fnvwy.com/?fnvwy fnvwy https://fikmstyz.ru/?fikmstyz fikmstyz http://chmopuwy.info/?chmopuwy chmopuwy http://cdfovy.ua/?cdfovy cdfovy http://acdfhjkr.ua/?acdfhjkr acdfhjkr https://bcfktw.biz/?bcfktw bcfktw https://fkmoqux.biz/?fkmoqux fkmoqux http://adghjox.ru/?adghjox adghjox https://dlnortv.info/?dlnortv dlnortv http://ijlpsuw.ru/?ijlpsuw ijlpsuw https://cgijqruw.info/?cgijqruw cgijqruw https://eisxy.ru/?eisxy eisxy http://dhjkmqvw.net/?dhjkmqvw dhjkmqvw https://bfgklnu.info/?bfgklnu bfgklnu http://bcfil.info/?bcfil bcfil https://degitv.com/?degitv degitv https://ghjlmr.net/?ghjlmr ghjlmr https://ceilv.ru/?ceilv ceilv http://gjknrtw.ua/?gjknrtw gjknrtw http://fotvwyz.net/?fotvwyz fotvwyz http://dgklquv.ua/?dgklquv dgklquv http://cghijnv.info/?cghijnv cghijnv https://aelmoy.com/?aelmoy aelmoy http://begilmnx.com/?begilmnx begilmnx http://bcegrtxz.ua/?bcegrtxz bcegrtxz https://bhkqs.com/?bhkqs bhkqs 
https://cefgpstu.ua/?cefgpstu cefgpstu http://gmntv.com/?gmntv gmntv http://afjnstuy.ua/?afjnstuy afjnstuy https://aegqwy.biz/?aegqwy aegqwy http://ioswz.biz/?ioswz ioswz https://gorvw.info/?gorvw gorvw http://cgnouwy.ua/?cgnouwy cgnouwy http://bchkpvy.info/?bchkpvy bchkpvy http://egijl.ua/?egijl egijl https://gorstuxz.net/?gorstuxz gorstuxz https://cdhjnopu.biz/?cdhjnopu cdhjnopu http://bceijoy.ru/?bceijoy bceijoy https://abfgjqw.ua/?abfgjqw abfgjqw http://bdeltw.net/?bdeltw bdeltw http://jktwz.net/?jktwz jktwz https://bgikloqy.info/?bgikloqy bgikloqy https://eimqrsuw.biz/?eimqrsuw eimqrsuw http://cfgkpu.net/?cfgkpu cfgkpu http://efmqrtv.com/?efmqrtv efmqrtv http://cegikmru.info/?cegikmru cegikmru http://hiprsv.ua/?hiprsv hiprsv https://efiorw.com/?efiorw efiorw https://ehkpy.ru/?ehkpy ehkpy http://acdjps.biz/?acdjps acdjps http://lmnopvxy.net/?lmnopvxy lmnopvxy http://abeptuy.net/?abeptuy abeptuy http://fimnoq.ua/?fimnoq fimnoq http://adefky.com/?adefky adefky https://cdfijlrz.ru/?cdfijlrz cdfijlrz https://dimnv.ru/?dimnv dimnv http://bcehz.com/?bcehz bcehz http://ahkpruvw.ru/?ahkpruvw ahkpruvw https://afnrw.com/?afnrw afnrw https://bclnpqtv.biz/?bclnpqtv bclnpqtv https://hmptuy.ua/?hmptuy hmptuy https://adfkx.com/?adfkx adfkx https://cdfgikp.com/?cdfgikp cdfgikp https://hjovwxz.net/?hjovwxz hjovwxz http://enoprsxz.ru/?enoprsxz enoprsxz https://dmryz.info/?dmryz dmryz http://ijopuwx.info/?ijopuwx ijopuwx https://dghoryz.ua/?dghoryz dghoryz https://gnpqsvy.biz/?gnpqsvy gnpqsvy https://jlmtwxy.ua/?jlmtwxy jlmtwxy http://clmpuvz.com/?clmpuvz clmpuvz https://gprtvx.ua/?gprtvx gprtvx http://bklptuxz.net/?bklptuxz bklptuxz https://bfknrvz.com/?bfknrvz bfknrvz http://dgkqrv.ua/?dgkqrv dgkqrv https://adkprstv.net/?adkprstv adkprstv https://afiqx.com/?afiqx afiqx https://bchiknpt.com/?bchiknpt bchiknpt http://hjklprx.biz/?hjklprx hjklprx http://bejoruw.net/?bejoruw bejoruw http://cegmnvz.biz/?cegmnvz cegmnvz http://gmnotyz.ua/?gmnotyz gmnotyz https://dfhikntw.net/?dfhikntw dfhikntw https://dfouv.biz/?dfouv dfouv https://cjlnotz.info/?cjlnotz cjlnotz https://denqtvwz.ua/?denqtvwz denqtvwz http://behlrwy.com/?behlrwy behlrwy http://drsuvxz.biz/?drsuvxz drsuvxz https://esuwyz.biz/?esuwyz esuwyz http://fhjntu.biz/?fhjntu fhjntu https://gqrswyz.biz/?gqrswyz gqrswyz https://mpswxz.biz/?mpswxz mpswxz http://dhjpsuv.net/?dhjpsuv dhjpsuv http://dfopruy.ru/?dfopruy dfopruy https://ehkloqsz.info/?ehkloqsz ehkloqsz https://adgjqru.biz/?adgjqru adgjqru https://ejmtvwz.biz/?ejmtvwz ejmtvwz http://fgoquw.net/?fgoquw fgoquw https://fhikqry.com/?fhikqry fhikqry https://cemqy.com/?cemqy cemqy http://otuxz.com/?otuxz otuxz http://fijlmnoq.info/?fijlmnoq fijlmnoq http://cgiouw.ru/?cgiouw cgiouw https://cdgijlnw.biz/?cdgijlnw cdgijlnw https://gmuvw.biz/?gmuvw gmuvw https://abdefgiu.info/?abdefgiu abdefgiu https://gkmqwy.info/?gkmqwy gkmqwy http://bfgkqstw.com/?bfgkqstw bfgkqstw https://bfgmnovx.net/?bfgmnovx bfgmnovx https://bkmnrsy.com/?bkmnrsy bkmnrsy http://chiuwx.ua/?chiuwx chiuwx http://chlvw.ua/?chlvw chlvw https://acfgilp.ru/?acfgilp acfgilp http://bghosv.com/?bghosv bghosv https://befgkpst.net/?befgkpst befgkpst http://jklmnwz.net/?jklmnwz jklmnwz https://dktxz.com/?dktxz dktxz http://aeknuwyz.ru/?aeknuwyz aeknuwyz https://aqtuv.ua/?aqtuv aqtuv http://bfjorw.info/?bfjorw bfjorw http://cfijuz.biz/?cfijuz cfijuz https://efjlnuv.net/?efjlnuv efjlnuv https://hkruvy.net/?hkruvy hkruvy http://defnswxy.biz/?defnswxy defnswxy https://fgjmqxy.biz/?fgjmqxy fgjmqxy https://deflmtu.ru/?deflmtu deflmtu 
https://cgmnr.info/?cgmnr cgmnr https://dgilnsvy.ua/?dgilnsvy dgilnsvy https://chjmosxy.biz/?chjmosxy chjmosxy https://abcdhln.net/?abcdhln abcdhln https://elstuyz.biz/?elstuyz elstuyz https://bjoptw.com/?bjoptw bjoptw https://bcdgquv.ru/?bcdgquv bcdgquv http://afiklst.com/?afiklst afiklst https://agnvwz.info/?agnvwz agnvwz http://bcfgkrvx.biz/?bcfgkrvx bcfgkrvx https://bdjmnuz.biz/?bdjmnuz bdjmnuz https://emqrwz.com/?emqrwz emqrwz https://egklqt.biz/?egklqt egklqt http://bimqr.com/?bimqr bimqr https://dhklnsv.net/?dhklnsv dhklnsv https://dvwxz.com/?dvwxz dvwxz https://dfghimu.com/?dfghimu dfghimu https://fostx.ru/?fostx fostx https://fjklnqux.ua/?fjklnqux fjklnqux https://cfhijnqu.info/?cfhijnqu cfhijnqu http://nptvy.ua/?nptvy nptvy http://bclnu.com/?bclnu bclnu http://eflntvyz.ru/?eflntvyz eflntvyz https://abcdpsx.ru/?abcdpsx abcdpsx https://cfipqx.info/?cfipqx cfipqx http://adforst.biz/?adforst adforst https://bfgmpqtv.biz/?bfgmpqtv bfgmpqtv http://cfkmnorv.ua/?cfkmnorv cfkmnorv http://dhjkuv.info/?dhjkuv dhjkuv https://bcejlsyz.info/?bcejlsyz bcejlsyz https://efinrsy.ua/?efinrsy efinrsy https://kmosw.net/?kmosw kmosw http://jmoxy.net/?jmoxy jmoxy http://agjloqy.biz/?agjloqy agjloqy https://dglmnpqy.com/?dglmnpqy dglmnpqy https://dijnqrsw.ua/?dijnqrsw dijnqrsw http://ijmnr.biz/?ijmnr ijmnr https://gjouwxy.ru/?gjouwxy gjouwxy https://klpuz.biz/?klpuz klpuz http://hijoux.net/?hijoux hijoux http://bikmn.net/?bikmn bikmn http://abcovwy.ua/?abcovwy abcovwy https://adfgioqx.ua/?adfgioqx adfgioqx http://beijknpq.info/?beijknpq beijknpq https://equvx.com/?equvx equvx https://cgmpr.net/?cgmpr cgmpr http://dfknor.com/?dfknor dfknor https://gopsvyz.ru/?gopsvyz gopsvyz http://chijkqsu.net/?chijkqsu chijkqsu https://bfmtx.biz/?bfmtx bfmtx http://cgijqrux.ua/?cgijqrux cgijqrux https://fklnovy.biz/?fklnovy fklnovy http://chkmosux.net/?chkmosux chkmosux https://cenprvw.info/?cenprvw cenprvw https://cfuxyz.ua/?cfuxyz cfuxyz https://bdgopy.ua/?bdgopy bdgopy https://ampwx.info/?ampwx ampwx http://cefmoqtx.info/?cefmoqtx cefmoqtx https://bcdflny.ua/?bcdflny bcdflny https://abqtux.ua/?abqtux abqtux https://bpstw.com/?bpstw bpstw http://cegmoqry.info/?cegmoqry cegmoqry http://aeghjlw.net/?aeghjlw aeghjlw https://egjswz.ru/?egjswz egjswz https://agiluv.info/?agiluv agiluv https://befimsu.biz/?befimsu befimsu https://abcdeiry.info/?abcdeiry abcdeiry http://bcqrxz.info/?bcqrxz bcqrxz https://eilntvxz.net/?eilntvxz eilntvxz https://abcjpt.ua/?abcjpt abcjpt https://adglox.ru/?adglox adglox https://adjnptw.ru/?adjnptw adjnptw http://aglrxz.biz/?aglrxz aglrxz https://abeqru.info/?abeqru abeqru https://cnoprsw.biz/?cnoprsw cnoprsw http://akoqx.com/?akoqx akoqx https://ijlmnptz.net/?ijlmnptz ijlmnptz https://ghimnuyz.com/?ghimnuyz ghimnuyz https://ghprs.ua/?ghprs ghprs https://lmntwxz.biz/?lmntwxz lmntwxz https://blopqxz.ua/?blopqxz blopqxz https://cqrxz.ua/?cqrxz cqrxz http://befmvwz.net/?befmvwz befmvwz https://abjlmnpr.com/?abjlmnpr abjlmnpr https://filotz.net/?filotz filotz http://hinsw.biz/?hinsw hinsw http://abfjklsy.net/?abfjklsy abfjklsy https://bhimnrwx.biz/?bhimnrwx bhimnrwx http://dejnotz.ru/?dejnotz dejnotz https://bcflsxz.info/?bcflsxz bcflsxz http://hklmsux.net/?hklmsux hklmsux https://ahknouz.ru/?ahknouz ahknouz https://cfknxy.ru/?cfknxy cfknxy http://acfntvx.net/?acfntvx acfntvx https://cdfksuvy.info/?cdfksuvy cdfksuvy https://abjkorx.ru/?abjkorx abjkorx https://afiklqvw.com/?afiklqvw afiklqvw http://adnovx.biz/?adnovx adnovx https://fglstyz.info/?fglstyz fglstyz https://ghikruy.biz/?ghikruy ghikruy 
http://ghnopsvz.biz/?ghnopsvz ghnopsvz http://abcgjnrw.com/?abcgjnrw abcgjnrw https://bijlnyz.ua/?bijlnyz bijlnyz http://ahiqrv.ru/?ahiqrv ahiqrv https://ajotw.biz/?ajotw ajotw http://bfhipuw.info/?bfhipuw bfhipuw http://acjmqry.ua/?acjmqry acjmqry http://efjnoqrz.info/?efjnoqrz efjnoqrz https://achkqst.info/?achkqst achkqst http://bcfpruyz.ua/?bcfpruyz bcfpruyz http://bcelv.com/?bcelv bcelv https://dfkoqsu.ua/?dfkoqsu dfkoqsu http://ekopq.ru/?ekopq ekopq https://abfkrsz.ua/?abfkrsz abfkrsz https://abenxy.net/?abenxy abenxy http://agiotu.biz/?agiotu agiotu https://bkpqr.info/?bkpqr bkpqr http://ahijq.ua/?ahijq ahijq http://eimps.ru/?eimps eimps https://abdstwz.net/?abdstwz abdstwz https://cdhjlu.biz/?cdhjlu cdhjlu http://gmrstyz.info/?gmrstyz gmrstyz https://djsvwyz.ua/?djsvwyz djsvwyz http://begjmorv.net/?begjmorv begjmorv http://ahlmoptz.biz/?ahlmoptz ahlmoptz http://dekoqx.ru/?dekoqx dekoqx http://fiprs.net/?fiprs fiprs http://cjklmr.biz/?cjklmr cjklmr https://aefgpquv.biz/?aefgpquv aefgpquv https://jlmopswx.net/?jlmopswx jlmopswx https://bfhjlsw.biz/?bfhjlsw bfhjlsw https://iknortux.com/?iknortux iknortux http://abhkoprv.biz/?abhkoprv abhkoprv https://cjnrwyz.ua/?cjnrwyz cjnrwyz http://djmnrvwz.info/?djmnrvwz djmnrvwz http://adgkrtyz.info/?adgkrtyz adgkrtyz https://bdhjsuv.ua/?bdhjsuv bdhjsuv http://acfgjor.biz/?acfgjor acfgjor http://cfilrtw.info/?cfilrtw cfilrtw https://fhijz.com/?fhijz fhijz http://hkmntxy.ru/?hkmntxy hkmntxy http://fhilsuy.ua/?fhilsuy fhilsuy http://jmnpwx.ru/?jmnpwx jmnpwx http://gikltvwy.ru/?gikltvwy gikltvwy http://jopqy.biz/?jopqy jopqy http://bdlqry.info/?bdlqry bdlqry https://ejlvwz.com/?ejlvwz ejlvwz http://ceijvz.net/?ceijvz ceijvz http://cfijntyz.info/?cfijntyz cfijntyz http://efilotuv.com/?efilotuv efilotuv http://cfmpyz.com/?cfmpyz cfmpyz http://dhpqz.com/?dhpqz dhpqz http://dlpxz.com/?dlpxz dlpxz http://aefjlyz.net/?aefjlyz aefjlyz https://ceijorz.info/?ceijorz ceijorz http://dkmnuvx.info/?dkmnuvx dkmnuvx https://kprswx.net/?kprswx kprswx https://dgjpwz.ru/?dgjpwz dgjpwz https://abfjlnt.ru/?abfjlnt abfjlnt http://efgips.com/?efgips efgips http://cehnpu.ru/?cehnpu cehnpu https://fstuw.net/?fstuw fstuw http://dmorz.net/?dmorz dmorz https://celmrwz.info/?celmrwz celmrwz https://jprvx.com/?jprvx jprvx https://cefjkxz.biz/?cefjkxz cefjkxz http://behjlv.biz/?behjlv behjlv http://bdiku.net/?bdiku bdiku https://ghnpxz.ru/?ghnpxz ghnpxz https://bcpqvwz.ua/?bcpqvwz bcpqvwz https://gijlnqrs.biz/?gijlnqrs gijlnqrs http://adfjlrsz.info/?adfjlrsz adfjlrsz https://afiptvz.biz/?afiptvz afiptvz http://acikors.com/?acikors acikors https://dfmpq.ru/?dfmpq dfmpq http://abhjmsyz.ru/?abhjmsyz abhjmsyz https://dejkq.com/?dejkq dejkq http://abcltx.com/?abcltx abcltx https://bdfjopu.net/?bdfjopu bdfjopu http://bipvy.com/?bipvy bipvy http://dlqsx.biz/?dlqsx dlqsx http://inqrstv.info/?inqrstv inqrstv https://dknwz.ru/?dknwz dknwz http://cdgimqsu.ru/?cdgimqsu cdgimqsu https://akmotuv.info/?akmotuv akmotuv http://kmntvyz.ru/?kmntvyz kmntvyz https://bimxz.net/?bimxz bimxz http://afjoqyz.ru/?afjoqyz afjoqyz https://eikuwy.ua/?eikuwy eikuwy https://bmptw.biz/?bmptw bmptw http://fgltwx.com/?fgltwx fgltwx http://chlnpqrs.com/?chlnpqrs chlnpqrs https://abhux.net/?abhux abhux http://adfhmw.info/?adfhmw adfhmw http://bgilt.com/?bgilt bgilt https://cefotu.biz/?cefotu cefotu https://adfruvwy.biz/?adfruvwy adfruvwy https://fgoqstu.info/?fgoqstu fgoqstu https://cdilnqvy.info/?cdilnqvy cdilnqvy https://adghruw.ua/?adghruw adghruw https://ehlnrsuy.biz/?ehlnrsuy ehlnrsuy 
http://abfijlnr.info/?abfijlnr abfijlnr https://jkpty.biz/?jkpty jkpty https://cdlpv.biz/?cdlpv cdlpv https://ginoxy.ua/?ginoxy ginoxy https://ciknpquy.net/?ciknpquy ciknpquy https://egijstw.biz/?egijstw egijstw http://cdikopvz.info/?cdikopvz cdikopvz https://egjklnvy.info/?egjklnvy egjklnvy https://fjkmvw.info/?fjkmvw fjkmvw http://dfkmv.com/?dfkmv dfkmv https://efgloptu.ru/?efgloptu efgloptu https://fgotz.ua/?fgotz fgotz http://efhklv.info/?efhklv efhklv https://befmrwy.info/?befmrwy befmrwy https://bklnsvy.biz/?bklnsvy bklnsvy http://flmoz.ru/?flmoz flmoz http://aiopqrtu.biz/?aiopqrtu aiopqrtu https://fjrsty.info/?fjrsty fjrsty http://behimuwy.ru/?behimuwy behimuwy https://iknqvz.com/?iknqvz iknqvz http://cdemp.ru/?cdemp cdemp http://eglpqsvz.net/?eglpqsvz eglpqsvz https://dluvz.net/?dluvz dluvz http://beghsuy.net/?beghsuy beghsuy http://ekmqsvz.net/?ekmqsvz ekmqsvz http://ehinz.net/?ehinz ehinz http://gipvz.com/?gipvz gipvz http://efhlwxyz.biz/?efhlwxyz efhlwxyz http://blors.ru/?blors blors http://dnqry.info/?dnqry dnqry https://cghnrux.biz/?cghnrux cghnrux http://dfinqtwx.ru/?dfinqtwx dfinqtwx http://efotxz.info/?efotxz efotxz http://aceglpqu.net/?aceglpqu aceglpqu https://abilr.ua/?abilr abilr http://bcgqw.ru/?bcgqw bcgqw http://cdinqv.biz/?cdinqv cdinqv https://cdfkp.com/?cdfkp cdfkp https://aelouvz.ua/?aelouvz aelouvz http://dklstuz.net/?dklstuz dklstuz https://cejqw.ru/?cejqw cejqw https://bdgkuy.com/?bdgkuy bdgkuy https://hkoprvxy.ru/?hkoprvxy hkoprvxy https://filntx.net/?filntx filntx http://cfgsty.ua/?cfgsty cfgsty https://bfgptx.net/?bfgptx bfgptx http://belpsu.biz/?belpsu belpsu http://deilmoy.biz/?deilmoy deilmoy https://kqruv.info/?kqruv kqruv https://acdst.net/?acdst acdst http://glmvy.ru/?glmvy glmvy https://abcmnu.net/?abcmnu abcmnu http://adehinxy.ru/?adehinxy adehinxy http://ahmnwy.biz/?ahmnwy ahmnwy http://morst.ru/?morst morst http://bfils.ua/?bfils bfils http://abcmnvw.biz/?abcmnvw abcmnvw https://bchlqsux.net/?bchlqsux bchlqsux http://fhikpqvy.net/?fhikpqvy fhikpqvy https://bejkqu.info/?bejkqu bejkqu https://cdehnsxy.ru/?cdehnsxy cdehnsxy http://bcdhnopx.info/?bcdhnopx bcdhnopx https://elpuz.ua/?elpuz elpuz https://bdeftvw.info/?bdeftvw bdeftvw http://dgmnou.ru/?dgmnou dgmnou https://aiklnvx.com/?aiklnvx aiklnvx http://ijoqrsvw.com/?ijoqrsvw ijoqrsvw http://cdehmquy.biz/?cdehmquy cdehmquy https://gjoqx.net/?gjoqx gjoqx https://bhlnpqvw.net/?bhlnpqvw bhlnpqvw http://bdfiuwyz.net/?bdfiuwyz bdfiuwyz https://abesty.info/?abesty abesty https://gjlmtuv.net/?gjlmtuv gjlmtuv http://ivxyz.info/?ivxyz ivxyz http://aeops.info/?aeops aeops http://acdlv.ru/?acdlv acdlv http://fgimqw.net/?fgimqw fgimqw https://fgkrxz.info/?fgkrxz fgkrxz http://abfhnvxy.ua/?abfhnvxy abfhnvxy https://bcdeipsy.ru/?bcdeipsy bcdeipsy http://bdimvwy.ru/?bdimvwy bdimvwy http://fnpqry.info/?fnpqry fnpqry https://aklstwx.ru/?aklstwx aklstwx http://bjloq.com/?bjloq bjloq http://dkmsuv.info/?dkmsuv dkmsuv http://afinptuw.ru/?afinptuw afinptuw https://bijnqvx.biz/?bijnqvx bijnqvx https://behlxz.info/?behlxz behlxz http://bgjkquyz.biz/?bgjkquyz bgjkquyz http://aehnpqw.net/?aehnpqw aehnpqw http://cghjmpqu.com/?cghjmpqu cghjmpqu http://aboruy.com/?aboruy aboruy http://fjryz.ru/?fjryz fjryz https://cdnqy.info/?cdnqy cdnqy http://bfgjmost.net/?bfgjmost bfgjmost https://bhnou.ua/?bhnou bhnou https://dhilm.com/?dhilm dhilm http://abijmrtx.net/?abijmrtx abijmrtx http://acefnv.com/?acefnv acefnv http://gilmtu.com/?gilmtu gilmtu https://egjltuz.biz/?egjltuz egjltuz http://afghpsv.net/?afghpsv afghpsv 
http://chkltvx.ua/?chkltvx chkltvx http://befkyz.com/?befkyz befkyz https://bcfhmprw.com/?bcfhmprw bcfhmprw https://bjklpqs.info/?bjklpqs bjklpqs http://cegpt.ua/?cegpt cegpt https://klntvw.biz/?klntvw klntvw https://ejkmq.ua/?ejkmq ejkmq https://bgqvx.info/?bgqvx bgqvx https://dqrsuvw.com/?dqrsuvw dqrsuvw http://bcekrsu.net/?bcekrsu bcekrsu https://dlmqwy.com/?dlmqwy dlmqwy http://abhnsvz.ru/?abhnsvz abhnsvz https://bdfhjoqr.info/?bdfhjoqr bdfhjoqr http://ceijlprw.net/?ceijlprw ceijlprw https://ijlqsw.net/?ijlqsw ijlqsw https://einrvxz.net/?einrvxz einrvxz https://ikmtxz.net/?ikmtxz ikmtxz http://adkmqsz.biz/?adkmqsz adkmqsz http://bchnox.com/?bchnox bchnox https://ckptvw.com/?ckptvw ckptvw http://dgops.com/?dgops dgops https://bfnrtxy.info/?bfnrtxy bfnrtxy http://bdnpstu.info/?bdnpstu bdnpstu http://dlnprsx.ua/?dlnprsx dlnprsx https://gjnrx.com/?gjnrx gjnrx http://abcsvz.com/?abcsvz abcsvz http://gknrsv.ru/?gknrsv gknrsv https://fkmouxz.com/?fkmouxz fkmouxz http://bcpsx.com/?bcpsx bcpsx http://bckoqx.info/?bckoqx bckoqx https://bcdepq.ru/?bcdepq bcdepq http://kqswy.net/?kqswy kqswy https://bdgjnuy.ru/?bdgjnuy bdgjnuy https://cgiklsvz.info/?cgiklsvz cgiklsvz https://bfgnprwy.net/?bfgnprwy bfgnprwy https://bejmnqrw.com/?bejmnqrw bejmnqrw http://gilopqwy.ua/?gilopqwy gilopqwy https://fjopry.ru/?fjopry fjopry http://kmrvwy.biz/?kmrvwy kmrvwy https://eglotuv.net/?eglotuv eglotuv http://cdhnsvwx.com/?cdhnsvwx cdhnsvwx http://ahmqv.com/?ahmqv ahmqv https://fkqxy.ua/?fkqxy fkqxy https://dfiqwxy.biz/?dfiqwxy dfiqwxy http://dimprw.net/?dimprw dimprw https://imqtwxyz.info/?imqtwxyz imqtwxyz http://bdhkmnst.ua/?bdhkmnst bdhkmnst http://ipqsu.info/?ipqsu ipqsu http://fhijkoqr.net/?fhijkoqr fhijkoqr https://acfiprs.ua/?acfiprs acfiprs https://bfgjmnps.info/?bfgjmnps bfgjmnps https://afsuvz.net/?afsuvz afsuvz http://bkltuwxy.net/?bkltuwxy bkltuwxy https://agilop.ru/?agilop agilop http://adinqtuw.ru/?adinqtuw adinqtuw http://cemnptvz.biz/?cemnptvz cemnptvz http://chjmoty.ua/?chjmoty chjmoty http://ajnrvx.info/?ajnrvx ajnrvx http://aghlnoux.info/?aghlnoux aghlnoux http://alpvw.biz/?alpvw alpvw http://bdklt.biz/?bdklt bdklt https://gmtwz.net/?gmtwz gmtwz http://bfiktu.net/?bfiktu bfiktu http://cfilqyz.info/?cfilqyz cfilqyz https://bfgkn.ru/?bfgkn bfgkn http://adglo.net/?adglo adglo https://acfhmrz.ua/?acfhmrz acfhmrz http://cdhil.ua/?cdhil cdhil https://afilorz.biz/?afilorz afilorz http://acqsvy.ru/?acqsvy acqsvy http://begjst.info/?begjst begjst http://bcjmqsuz.ru/?bcjmqsuz bcjmqsuz http://fghjloqu.com/?fghjloqu fghjloqu https://lmryz.net/?lmryz lmryz http://imvwyz.net/?imvwyz imvwyz http://dhklntw.com/?dhklntw dhklntw http://adijnouy.info/?adijnouy adijnouy https://cdervx.info/?cdervx cdervx https://ajkqrtuw.ru/?ajkqrtuw ajkqrtuw http://diknqtu.biz/?diknqtu diknqtu http://hjnrux.ru/?hjnrux hjnrux https://ilmouxy.info/?ilmouxy ilmouxy http://adfijm.ua/?adfijm adfijm http://efgptx.ru/?efgptx efgptx https://dmprtvwy.com/?dmprtvwy dmprtvwy http://bgksuv.ua/?bgksuv bgksuv https://cfhsv.com/?cfhsv cfhsv http://adgoqr.biz/?adgoqr adgoqr http://efilmx.net/?efilmx efilmx http://eglpqrx.ua/?eglpqrx eglpqrx http://acequxz.com/?acequxz acequxz http://lmsvz.ua/?lmsvz lmsvz http://aefhprtw.com/?aefhprtw aefhprtw https://bglnq.info/?bglnq bglnq http://befhirv.ua/?befhirv befhirv https://achkryz.com/?achkryz achkryz https://bmpsvy.info/?bmpsvy bmpsvy http://ejloqy.net/?ejloqy ejloqy https://biopqz.net/?biopqz biopqz https://bfhklotx.biz/?bfhklotx bfhklotx https://dgjkqwy.ua/?dgjkqwy dgjkqwy 
https://afnpqrxz.net/?afnpqrxz afnpqrxz https://hjnpqrxz.com/?hjnpqrxz hjnpqrxz https://cimpuyz.net/?cimpuyz cimpuyz http://cruwz.biz/?cruwz cruwz https://bknovz.net/?bknovz bknovz http://dhilrtv.ua/?dhilrtv dhilrtv http://chjpqvyz.info/?chjpqvyz chjpqvyz https://achijmr.com/?achijmr achijmr https://chknosty.ru/?chknosty chknosty https://acgqv.com/?acgqv acgqv http://defklosy.info/?defklosy defklosy http://kmqrw.com/?kmqrw kmqrw https://npqtvz.ru/?npqtvz npqtvz http://cefkrux.ua/?cefkrux cefkrux https://aefhivx.ua/?aefhivx aefhivx https://dijlrtuy.ua/?dijlrtuy dijlrtuy https://aoqtw.com/?aoqtw aoqtw https://cehisuwz.net/?cehisuwz cehisuwz http://bfghijmr.biz/?bfghijmr bfghijmr http://beimquxy.ua/?beimquxy beimquxy http://dmsuw.info/?dmsuw dmsuw http://imnty.net/?imnty imnty https://detwz.net/?detwz detwz http://glnvx.net/?glnvx glnvx https://himnopw.ru/?himnopw himnopw https://aklnw.com/?aklnw aklnw http://bjnrwx.net/?bjnrwx bjnrwx http://defjlqvy.com/?defjlqvy defjlqvy http://dfimopvz.ua/?dfimopvz dfimopvz http://hmoquv.ru/?hmoquv hmoquv http://afkmq.net/?afkmq afkmq https://blnrwyz.info/?blnrwyz blnrwyz http://fklnrtvx.info/?fklnrtvx fklnrtvx https://fghnruz.biz/?fghnruz fghnruz https://bijqsu.ru/?bijqsu bijqsu http://ghoruxyz.ru/?ghoruxyz ghoruxyz http://acejknry.ru/?acejknry acejknry http://cehquvxy.net/?cehquvxy cehquvxy https://bfgkmoqs.com/?bfgkmoqs bfgkmoqs https://bchjqx.net/?bchjqx bchjqx http://bmoqux.biz/?bmoqux bmoqux https://aijklm.com/?aijklm aijklm https://clmptx.info/?clmptx clmptx https://dghmpqtz.net/?dghmpqtz dghmpqtz https://bfkrt.net/?bfkrt bfkrt http://egkqrtuw.biz/?egkqrtuw egkqrtuw http://hjlstvx.com/?hjlstvx hjlstvx http://jkoprv.com/?jkoprv jkoprv https://chnpstv.com/?chnpstv chnpstv https://eknptuw.ru/?eknptuw eknptuw https://jkmrst.net/?jkmrst jkmrst http://bfmnqrvz.net/?bfmnqrvz bfmnqrvz http://bcdhnrwy.ua/?bcdhnrwy bcdhnrwy https://bcdfgkty.ru/?bcdfgkty bcdfgkty http://cmpqv.info/?cmpqv cmpqv https://dhilsz.ua/?dhilsz dhilsz https://acekpsy.ru/?acekpsy acekpsy http://gmnpswx.biz/?gmnpswx gmnpswx https://cdinpr.ua/?cdinpr cdinpr http://cdhjnorz.info/?cdhjnorz cdhjnorz https://behjk.com/?behjk behjk https://lmpstv.com/?lmpstv lmpstv https://ilmory.ua/?ilmory ilmory https://abiortz.ua/?abiortz abiortz http://bceotz.biz/?bceotz bceotz https://ghjlqt.info/?ghjlqt ghjlqt https://hmosyz.ua/?hmosyz hmosyz https://behmoqru.net/?behmoqru behmoqru https://chinpx.info/?chinpx chinpx https://fkntuvyz.info/?fkntuvyz fkntuvyz http://cfghimtw.ua/?cfghimtw cfghimtw https://fqstwz.com/?fqstwz fqstwz http://gimvw.biz/?gimvw gimvw https://degixy.ua/?degixy degixy http://fnvwy.com/?fnvwy fnvwy https://fikmstyz.ru/?fikmstyz fikmstyz http://chmopuwy.info/?chmopuwy chmopuwy http://cdfovy.ua/?cdfovy cdfovy http://acdfhjkr.ua/?acdfhjkr acdfhjkr https://bcfktw.biz/?bcfktw bcfktw https://fkmoqux.biz/?fkmoqux fkmoqux http://adghjox.ru/?adghjox adghjox https://dlnortv.info/?dlnortv dlnortv http://ijlpsuw.ru/?ijlpsuw ijlpsuw https://cgijqruw.info/?cgijqruw cgijqruw https://eisxy.ru/?eisxy eisxy http://dhjkmqvw.net/?dhjkmqvw dhjkmqvw https://bfgklnu.info/?bfgklnu bfgklnu http://bcfil.info/?bcfil bcfil https://degitv.com/?degitv degitv https://ghjlmr.net/?ghjlmr ghjlmr https://ceilv.ru/?ceilv ceilv http://gjknrtw.ua/?gjknrtw gjknrtw http://fotvwyz.net/?fotvwyz fotvwyz http://dgklquv.ua/?dgklquv dgklquv http://cghijnv.info/?cghijnv cghijnv https://aelmoy.com/?aelmoy aelmoy http://begilmnx.com/?begilmnx begilmnx http://bcegrtxz.ua/?bcegrtxz bcegrtxz https://bhkqs.com/?bhkqs bhkqs 
https://cefgpstu.ua/?cefgpstu cefgpstu http://gmntv.com/?gmntv gmntv http://afjnstuy.ua/?afjnstuy afjnstuy https://aegqwy.biz/?aegqwy aegqwy http://ioswz.biz/?ioswz ioswz https://gorvw.info/?gorvw gorvw http://cgnouwy.ua/?cgnouwy cgnouwy http://bchkpvy.info/?bchkpvy bchkpvy http://egijl.ua/?egijl egijl https://gorstuxz.net/?gorstuxz gorstuxz https://cdhjnopu.biz/?cdhjnopu cdhjnopu http://bceijoy.ru/?bceijoy bceijoy https://abfgjqw.ua/?abfgjqw abfgjqw http://bdeltw.net/?bdeltw bdeltw http://jktwz.net/?jktwz jktwz https://bgikloqy.info/?bgikloqy bgikloqy https://eimqrsuw.biz/?eimqrsuw eimqrsuw http://cfgkpu.net/?cfgkpu cfgkpu http://efmqrtv.com/?efmqrtv efmqrtv http://cegikmru.info/?cegikmru cegikmru http://hiprsv.ua/?hiprsv hiprsv https://efiorw.com/?efiorw efiorw https://ehkpy.ru/?ehkpy ehkpy http://acdjps.biz/?acdjps acdjps http://lmnopvxy.net/?lmnopvxy lmnopvxy http://abeptuy.net/?abeptuy abeptuy http://fimnoq.ua/?fimnoq fimnoq http://adefky.com/?adefky adefky https://cdfijlrz.ru/?cdfijlrz cdfijlrz https://dimnv.ru/?dimnv dimnv http://bcehz.com/?bcehz bcehz http://ahkpruvw.ru/?ahkpruvw ahkpruvw https://afnrw.com/?afnrw afnrw https://bclnpqtv.biz/?bclnpqtv bclnpqtv https://hmptuy.ua/?hmptuy hmptuy https://adfkx.com/?adfkx adfkx https://cdfgikp.com/?cdfgikp cdfgikp https://hjovwxz.net/?hjovwxz hjovwxz http://enoprsxz.ru/?enoprsxz enoprsxz https://dmryz.info/?dmryz dmryz http://ijopuwx.info/?ijopuwx ijopuwx https://dghoryz.ua/?dghoryz dghoryz https://gnpqsvy.biz/?gnpqsvy gnpqsvy https://jlmtwxy.ua/?jlmtwxy jlmtwxy http://clmpuvz.com/?clmpuvz clmpuvz https://gprtvx.ua/?gprtvx gprtvx http://bklptuxz.net/?bklptuxz bklptuxz https://bfknrvz.com/?bfknrvz bfknrvz http://dgkqrv.ua/?dgkqrv dgkqrv https://adkprstv.net/?adkprstv adkprstv https://afiqx.com/?afiqx afiqx https://bchiknpt.com/?bchiknpt bchiknpt http://hjklprx.biz/?hjklprx hjklprx http://bejoruw.net/?bejoruw bejoruw http://cegmnvz.biz/?cegmnvz cegmnvz http://gmnotyz.ua/?gmnotyz gmnotyz https://dfhikntw.net/?dfhikntw dfhikntw https://dfouv.biz/?dfouv dfouv https://cjlnotz.info/?cjlnotz cjlnotz https://denqtvwz.ua/?denqtvwz denqtvwz http://behlrwy.com/?behlrwy behlrwy http://drsuvxz.biz/?drsuvxz drsuvxz https://esuwyz.biz/?esuwyz esuwyz http://fhjntu.biz/?fhjntu fhjntu https://gqrswyz.biz/?gqrswyz gqrswyz https://mpswxz.biz/?mpswxz mpswxz http://dhjpsuv.net/?dhjpsuv dhjpsuv http://dfopruy.ru/?dfopruy dfopruy https://ehkloqsz.info/?ehkloqsz ehkloqsz https://adgjqru.biz/?adgjqru adgjqru https://ejmtvwz.biz/?ejmtvwz ejmtvwz http://fgoquw.net/?fgoquw fgoquw https://fhikqry.com/?fhikqry fhikqry https://cemqy.com/?cemqy cemqy http://otuxz.com/?otuxz otuxz http://fijlmnoq.info/?fijlmnoq fijlmnoq http://cgiouw.ru/?cgiouw cgiouw https://cdgijlnw.biz/?cdgijlnw cdgijlnw https://gmuvw.biz/?gmuvw gmuvw https://abdefgiu.info/?abdefgiu abdefgiu https://gkmqwy.info/?gkmqwy gkmqwy http://bfgkqstw.com/?bfgkqstw bfgkqstw https://bfgmnovx.net/?bfgmnovx bfgmnovx https://bkmnrsy.com/?bkmnrsy bkmnrsy http://chiuwx.ua/?chiuwx chiuwx http://chlvw.ua/?chlvw chlvw https://acfgilp.ru/?acfgilp acfgilp http://bghosv.com/?bghosv bghosv https://befgkpst.net/?befgkpst befgkpst http://jklmnwz.net/?jklmnwz jklmnwz https://dktxz.com/?dktxz dktxz http://aeknuwyz.ru/?aeknuwyz aeknuwyz https://aqtuv.ua/?aqtuv aqtuv http://bfjorw.info/?bfjorw bfjorw http://cfijuz.biz/?cfijuz cfijuz https://efjlnuv.net/?efjlnuv efjlnuv https://hkruvy.net/?hkruvy hkruvy http://defnswxy.biz/?defnswxy defnswxy https://fgjmqxy.biz/?fgjmqxy fgjmqxy https://deflmtu.ru/?deflmtu deflmtu 
https://cgmnr.info/?cgmnr cgmnr https://dgilnsvy.ua/?dgilnsvy dgilnsvy https://chjmosxy.biz/?chjmosxy chjmosxy https://abcdhln.net/?abcdhln abcdhln https://elstuyz.biz/?elstuyz elstuyz https://bjoptw.com/?bjoptw bjoptw https://bcdgquv.ru/?bcdgquv bcdgquv http://afiklst.com/?afiklst afiklst https://agnvwz.info/?agnvwz agnvwz http://bcfgkrvx.biz/?bcfgkrvx bcfgkrvx https://bdjmnuz.biz/?bdjmnuz bdjmnuz https://emqrwz.com/?emqrwz emqrwz https://egklqt.biz/?egklqt egklqt http://bimqr.com/?bimqr bimqr https://dhklnsv.net/?dhklnsv dhklnsv https://dvwxz.com/?dvwxz dvwxz https://dfghimu.com/?dfghimu dfghimu https://fostx.ru/?fostx fostx https://fjklnqux.ua/?fjklnqux fjklnqux https://cfhijnqu.info/?cfhijnqu cfhijnqu http://nptvy.ua/?nptvy nptvy http://bclnu.com/?bclnu bclnu http://eflntvyz.ru/?eflntvyz eflntvyz https://abcdpsx.ru/?abcdpsx abcdpsx https://cfipqx.info/?cfipqx cfipqx http://adforst.biz/?adforst adforst https://bfgmpqtv.biz/?bfgmpqtv bfgmpqtv http://cfkmnorv.ua/?cfkmnorv cfkmnorv http://dhjkuv.info/?dhjkuv dhjkuv https://bcejlsyz.info/?bcejlsyz bcejlsyz https://efinrsy.ua/?efinrsy efinrsy https://kmosw.net/?kmosw kmosw http://jmoxy.net/?jmoxy jmoxy http://agjloqy.biz/?agjloqy agjloqy https://dglmnpqy.com/?dglmnpqy dglmnpqy https://dijnqrsw.ua/?dijnqrsw dijnqrsw http://ijmnr.biz/?ijmnr ijmnr https://gjouwxy.ru/?gjouwxy gjouwxy https://klpuz.biz/?klpuz klpuz http://hijoux.net/?hijoux hijoux http://bikmn.net/?bikmn bikmn http://abcovwy.ua/?abcovwy abcovwy https://adfgioqx.ua/?adfgioqx adfgioqx http://beijknpq.info/?beijknpq beijknpq https://equvx.com/?equvx equvx https://cgmpr.net/?cgmpr cgmpr http://dfknor.com/?dfknor dfknor https://gopsvyz.ru/?gopsvyz gopsvyz http://chijkqsu.net/?chijkqsu chijkqsu https://bfmtx.biz/?bfmtx bfmtx http://cgijqrux.ua/?cgijqrux cgijqrux https://fklnovy.biz/?fklnovy fklnovy http://chkmosux.net/?chkmosux chkmosux https://cenprvw.info/?cenprvw cenprvw https://cfuxyz.ua/?cfuxyz cfuxyz https://bdgopy.ua/?bdgopy bdgopy https://ampwx.info/?ampwx ampwx http://cefmoqtx.info/?cefmoqtx cefmoqtx https://bcdflny.ua/?bcdflny bcdflny https://abqtux.ua/?abqtux abqtux https://bpstw.com/?bpstw bpstw http://cegmoqry.info/?cegmoqry cegmoqry http://aeghjlw.net/?aeghjlw aeghjlw https://egjswz.ru/?egjswz egjswz https://agiluv.info/?agiluv agiluv https://befimsu.biz/?befimsu befimsu https://abcdeiry.info/?abcdeiry abcdeiry http://bcqrxz.info/?bcqrxz bcqrxz https://eilntvxz.net/?eilntvxz eilntvxz https://abcjpt.ua/?abcjpt abcjpt https://adglox.ru/?adglox adglox https://adjnptw.ru/?adjnptw adjnptw http://aglrxz.biz/?aglrxz aglrxz https://abeqru.info/?abeqru abeqru https://cnoprsw.biz/?cnoprsw cnoprsw http://akoqx.com/?akoqx akoqx https://ijlmnptz.net/?ijlmnptz ijlmnptz https://ghimnuyz.com/?ghimnuyz ghimnuyz https://ghprs.ua/?ghprs ghprs https://lmntwxz.biz/?lmntwxz lmntwxz https://blopqxz.ua/?blopqxz blopqxz https://cqrxz.ua/?cqrxz cqrxz http://befmvwz.net/?befmvwz befmvwz https://abjlmnpr.com/?abjlmnpr abjlmnpr https://filotz.net/?filotz filotz http://hinsw.biz/?hinsw hinsw http://abfjklsy.net/?abfjklsy abfjklsy https://bhimnrwx.biz/?bhimnrwx bhimnrwx http://dejnotz.ru/?dejnotz dejnotz https://bcflsxz.info/?bcflsxz bcflsxz http://hklmsux.net/?hklmsux hklmsux https://ahknouz.ru/?ahknouz ahknouz https://cfknxy.ru/?cfknxy cfknxy http://acfntvx.net/?acfntvx acfntvx https://cdfksuvy.info/?cdfksuvy cdfksuvy https://abjkorx.ru/?abjkorx abjkorx https://afiklqvw.com/?afiklqvw afiklqvw http://adnovx.biz/?adnovx adnovx https://fglstyz.info/?fglstyz fglstyz https://ghikruy.biz/?ghikruy ghikruy 
http://ghnopsvz.biz/?ghnopsvz ghnopsvz http://abcgjnrw.com/?abcgjnrw abcgjnrw https://bijlnyz.ua/?bijlnyz bijlnyz http://ahiqrv.ru/?ahiqrv ahiqrv https://ajotw.biz/?ajotw ajotw http://bfhipuw.info/?bfhipuw bfhipuw http://acjmqry.ua/?acjmqry acjmqry http://efjnoqrz.info/?efjnoqrz efjnoqrz https://achkqst.info/?achkqst achkqst http://bcfpruyz.ua/?bcfpruyz bcfpruyz http://bcelv.com/?bcelv bcelv https://dfkoqsu.ua/?dfkoqsu dfkoqsu http://ekopq.ru/?ekopq ekopq https://abfkrsz.ua/?abfkrsz abfkrsz https://abenxy.net/?abenxy abenxy http://agiotu.biz/?agiotu agiotu https://bkpqr.info/?bkpqr bkpqr http://ahijq.ua/?ahijq ahijq http://eimps.ru/?eimps eimps https://abdstwz.net/?abdstwz abdstwz https://cdhjlu.biz/?cdhjlu cdhjlu http://gmrstyz.info/?gmrstyz gmrstyz https://djsvwyz.ua/?djsvwyz djsvwyz http://begjmorv.net/?begjmorv begjmorv http://ahlmoptz.biz/?ahlmoptz ahlmoptz http://dekoqx.ru/?dekoqx dekoqx http://fiprs.net/?fiprs fiprs http://cjklmr.biz/?cjklmr cjklmr https://aefgpquv.biz/?aefgpquv aefgpquv https://jlmopswx.net/?jlmopswx jlmopswx https://bfhjlsw.biz/?bfhjlsw bfhjlsw https://iknortux.com/?iknortux iknortux http://abhkoprv.biz/?abhkoprv abhkoprv https://cjnrwyz.ua/?cjnrwyz cjnrwyz http://djmnrvwz.info/?djmnrvwz djmnrvwz http://adgkrtyz.info/?adgkrtyz adgkrtyz https://bdhjsuv.ua/?bdhjsuv bdhjsuv http://acfgjor.biz/?acfgjor acfgjor http://cfilrtw.info/?cfilrtw cfilrtw https://fhijz.com/?fhijz fhijz http://hkmntxy.ru/?hkmntxy hkmntxy http://fhilsuy.ua/?fhilsuy fhilsuy http://jmnpwx.ru/?jmnpwx jmnpwx http://gikltvwy.ru/?gikltvwy gikltvwy http://jopqy.biz/?jopqy jopqy http://bdlqry.info/?bdlqry bdlqry https://ejlvwz.com/?ejlvwz ejlvwz http://ceijvz.net/?ceijvz ceijvz http://cfijntyz.info/?cfijntyz cfijntyz http://efilotuv.com/?efilotuv efilotuv http://cfmpyz.com/?cfmpyz cfmpyz http://dhpqz.com/?dhpqz dhpqz http://dlpxz.com/?dlpxz dlpxz http://aefjlyz.net/?aefjlyz aefjlyz https://ceijorz.info/?ceijorz ceijorz http://dkmnuvx.info/?dkmnuvx dkmnuvx https://kprswx.net/?kprswx kprswx https://dgjpwz.ru/?dgjpwz dgjpwz https://abfjlnt.ru/?abfjlnt abfjlnt http://efgips.com/?efgips efgips http://cehnpu.ru/?cehnpu cehnpu https://fstuw.net/?fstuw fstuw http://dmorz.net/?dmorz dmorz https://celmrwz.info/?celmrwz celmrwz https://jprvx.com/?jprvx jprvx https://cefjkxz.biz/?cefjkxz cefjkxz http://behjlv.biz/?behjlv behjlv http://bdiku.net/?bdiku bdiku https://ghnpxz.ru/?ghnpxz ghnpxz https://bcpqvwz.ua/?bcpqvwz bcpqvwz https://gijlnqrs.biz/?gijlnqrs gijlnqrs http://adfjlrsz.info/?adfjlrsz adfjlrsz https://afiptvz.biz/?afiptvz afiptvz http://acikors.com/?acikors acikors https://dfmpq.ru/?dfmpq dfmpq http://abhjmsyz.ru/?abhjmsyz abhjmsyz https://dejkq.com/?dejkq dejkq http://abcltx.com/?abcltx abcltx https://bdfjopu.net/?bdfjopu bdfjopu http://bipvy.com/?bipvy bipvy http://dlqsx.biz/?dlqsx dlqsx http://inqrstv.info/?inqrstv inqrstv https://dknwz.ru/?dknwz dknwz http://cdgimqsu.ru/?cdgimqsu cdgimqsu https://akmotuv.info/?akmotuv akmotuv http://kmntvyz.ru/?kmntvyz kmntvyz https://bimxz.net/?bimxz bimxz http://afjoqyz.ru/?afjoqyz afjoqyz https://eikuwy.ua/?eikuwy eikuwy https://bmptw.biz/?bmptw bmptw http://fgltwx.com/?fgltwx fgltwx http://chlnpqrs.com/?chlnpqrs chlnpqrs https://abhux.net/?abhux abhux http://adfhmw.info/?adfhmw adfhmw http://bgilt.com/?bgilt bgilt https://cefotu.biz/?cefotu cefotu https://adfruvwy.biz/?adfruvwy adfruvwy https://fgoqstu.info/?fgoqstu fgoqstu https://cdilnqvy.info/?cdilnqvy cdilnqvy https://adghruw.ua/?adghruw adghruw https://ehlnrsuy.biz/?ehlnrsuy ehlnrsuy 
http://abfijlnr.info/?abfijlnr abfijlnr https://jkpty.biz/?jkpty jkpty https://cdlpv.biz/?cdlpv cdlpv https://ginoxy.ua/?ginoxy ginoxy https://ciknpquy.net/?ciknpquy ciknpquy https://egijstw.biz/?egijstw egijstw http://cdikopvz.info/?cdikopvz cdikopvz https://egjklnvy.info/?egjklnvy egjklnvy https://fjkmvw.info/?fjkmvw fjkmvw http://dfkmv.com/?dfkmv dfkmv https://efgloptu.ru/?efgloptu efgloptu https://fgotz.ua/?fgotz fgotz http://efhklv.info/?efhklv efhklv https://befmrwy.info/?befmrwy befmrwy https://bklnsvy.biz/?bklnsvy bklnsvy http://flmoz.ru/?flmoz flmoz http://aiopqrtu.biz/?aiopqrtu aiopqrtu https://fjrsty.info/?fjrsty fjrsty http://behimuwy.ru/?behimuwy behimuwy https://iknqvz.com/?iknqvz iknqvz http://cdemp.ru/?cdemp cdemp http://eglpqsvz.net/?eglpqsvz eglpqsvz https://dluvz.net/?dluvz dluvz http://beghsuy.net/?beghsuy beghsuy http://ekmqsvz.net/?ekmqsvz ekmqsvz http://ehinz.net/?ehinz ehinz http://gipvz.com/?gipvz gipvz http://efhlwxyz.biz/?efhlwxyz efhlwxyz http://blors.ru/?blors blors http://dnqry.info/?dnqry dnqry https://cghnrux.biz/?cghnrux cghnrux http://dfinqtwx.ru/?dfinqtwx dfinqtwx http://efotxz.info/?efotxz efotxz http://aceglpqu.net/?aceglpqu aceglpqu https://abilr.ua/?abilr abilr http://bcgqw.ru/?bcgqw bcgqw http://cdinqv.biz/?cdinqv cdinqv https://cdfkp.com/?cdfkp cdfkp https://aelouvz.ua/?aelouvz aelouvz http://dklstuz.net/?dklstuz dklstuz https://cejqw.ru/?cejqw cejqw https://bdgkuy.com/?bdgkuy bdgkuy https://hkoprvxy.ru/?hkoprvxy hkoprvxy https://filntx.net/?filntx filntx http://cfgsty.ua/?cfgsty cfgsty https://bfgptx.net/?bfgptx bfgptx http://belpsu.biz/?belpsu belpsu http://deilmoy.biz/?deilmoy deilmoy https://kqruv.info/?kqruv kqruv https://acdst.net/?acdst acdst http://glmvy.ru/?glmvy glmvy https://abcmnu.net/?abcmnu abcmnu http://adehinxy.ru/?adehinxy adehinxy http://ahmnwy.biz/?ahmnwy ahmnwy http://morst.ru/?morst morst http://bfils.ua/?bfils bfils http://abcmnvw.biz/?abcmnvw abcmnvw https://bchlqsux.net/?bchlqsux bchlqsux http://fhikpqvy.net/?fhikpqvy fhikpqvy https://bejkqu.info/?bejkqu bejkqu https://cdehnsxy.ru/?cdehnsxy cdehnsxy http://bcdhnopx.info/?bcdhnopx bcdhnopx https://elpuz.ua/?elpuz elpuz https://bdeftvw.info/?bdeftvw bdeftvw http://dgmnou.ru/?dgmnou dgmnou https://aiklnvx.com/?aiklnvx aiklnvx http://ijoqrsvw.com/?ijoqrsvw ijoqrsvw http://cdehmquy.biz/?cdehmquy cdehmquy https://gjoqx.net/?gjoqx gjoqx https://bhlnpqvw.net/?bhlnpqvw bhlnpqvw http://bdfiuwyz.net/?bdfiuwyz bdfiuwyz https://abesty.info/?abesty abesty https://gjlmtuv.net/?gjlmtuv gjlmtuv http://ivxyz.info/?ivxyz ivxyz http://aeops.info/?aeops aeops http://acdlv.ru/?acdlv acdlv http://fgimqw.net/?fgimqw fgimqw https://fgkrxz.info/?fgkrxz fgkrxz http://abfhnvxy.ua/?abfhnvxy abfhnvxy https://bcdeipsy.ru/?bcdeipsy bcdeipsy http://bdimvwy.ru/?bdimvwy bdimvwy http://fnpqry.info/?fnpqry fnpqry https://aklstwx.ru/?aklstwx aklstwx http://bjloq.com/?bjloq bjloq http://dkmsuv.info/?dkmsuv dkmsuv http://afinptuw.ru/?afinptuw afinptuw https://bijnqvx.biz/?bijnqvx bijnqvx https://behlxz.info/?behlxz behlxz http://bgjkquyz.biz/?bgjkquyz bgjkquyz http://aehnpqw.net/?aehnpqw aehnpqw http://cghjmpqu.com/?cghjmpqu cghjmpqu http://aboruy.com/?aboruy aboruy http://fjryz.ru/?fjryz fjryz https://cdnqy.info/?cdnqy cdnqy http://bfgjmost.net/?bfgjmost bfgjmost https://bhnou.ua/?bhnou bhnou https://dhilm.com/?dhilm dhilm http://abijmrtx.net/?abijmrtx abijmrtx http://acefnv.com/?acefnv acefnv http://gilmtu.com/?gilmtu gilmtu https://egjltuz.biz/?egjltuz egjltuz http://afghpsv.net/?afghpsv afghpsv 
From huq2090 at gmail.com Tue Nov 20 16:44:36 2018
From: huq2090 at gmail.com (Fazlul Huq)
Date: Tue, 20 Nov 2018 16:44:36 -0600
Subject: [petsc-users] Problem to configure
Message-ID:

Hello PETSc Developers,

I am trying to configure PETSc with ./configure --download-hypre
but I got the following message on the terminal:

========================================================================
             Configuring PETSc to compile on your system
=========================================================================
TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:158)
*******************************************************************************
         UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details):
-------------------------------------------------------------------------------
Unable to find mpi in default locations!
Perhaps you can specify with --with-mpi-dir=<directory>
If you do not want MPI, then give --with-mpi=0
You might also consider using --download-mpich instead
*******************************************************************************

The log file is attached herewith.

Thanks.
Sincerely,
Huq

--
Fazlul Huq
Graduate Research Assistant
Department of Nuclear, Plasma & Radiological Engineering (NPRE)
University of Illinois at Urbana-Champaign (UIUC)
E-mail: huq2090 at gmail.com
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
-------------- next part --------------
A non-text attachment was scrubbed...
Name: configure.log
Type: text/x-log
Size: 2298821 bytes
Desc: not available
URL:

From balay at mcs.anl.gov Tue Nov 20 16:59:09 2018
From: balay at mcs.anl.gov (Balay, Satish)
Date: Tue, 20 Nov 2018 22:59:09 +0000
Subject: [petsc-users] Problem to configure
In-Reply-To:
References:
Message-ID:

As the message suggests - you don't have MPI installed - so run:

./configure --download-hypre --download-mpich

Satish

On Tue, 20 Nov 2018, Fazlul Huq via petsc-users wrote:

> Hello PETSc Developers,
>
> I am trying to configure PETSc with ./configure --download-hypre
> but I got the following message on the terminal:
>
> ========================================================================
>              Configuring PETSc to compile on your system
> =========================================================================
> TESTING: check from
> config.libraries(config/BuildSystem/config/libraries.py:158)
> *******************************************************************************
>          UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for
> details):
> -------------------------------------------------------------------------------
> Unable to find mpi in default locations!
> Perhaps you can specify with --with-mpi-dir=<directory>
> If you do not want MPI, then give --with-mpi=0
> You might also consider using --download-mpich instead
> *******************************************************************************
>
> The log file is attached herewith.
>
> Thanks.
> Sincerely,
> Huq
>

From huq2090 at gmail.com Tue Nov 20 18:06:14 2018
From: huq2090 at gmail.com (Fazlul Huq)
Date: Tue, 20 Nov 2018 18:06:14 -0600
Subject: [petsc-users] Problem to configure
In-Reply-To:
References:
Message-ID:

Thanks.
./configure --download-hypre --download-mpich works well.
But when I run the following:
make PETSC_DIR=/home/huq2090/petsc-3.10.2 PETSC_ARCH=arch-linux2-c-debug all

I got the following errors:
$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$
CLINKER arch-linux2-c-debug/lib/libpetsc.so.3.10.2
arch-linux2-c-debug/obj/mat/impls/baij/seq/baij.o: In function `MatCreate_SeqBAIJ':
/home/huq2090/petsc-3.10.2/src/mat/impls/baij/seq/baij.c:3049: undefined reference to `MatConvert_AIJ_HYPRE'
/usr/bin/ld: arch-linux2-c-debug/obj/mat/impls/baij/seq/baij.o: relocation R_X86_64_PC32 against undefined hidden symbol `MatConvert_AIJ_HYPRE' can not be used when making a shared object
/usr/bin/ld: final link failed: Bad value
collect2: error: ld returned 1 exit status
gmakefile:86: recipe for target 'arch-linux2-c-debug/lib/libpetsc.so.3.10.2' failed
make[2]: *** [arch-linux2-c-debug/lib/libpetsc.so.3.10.2] Error 1
make[2]: Leaving directory '/home/huq2090/petsc-3.10.2'
/home/huq2090/petsc-3.10.2/lib/petsc/conf/rules:81: recipe for target 'gnumake' failed
make[1]: *** [gnumake] Error 2
make[1]: Leaving directory '/home/huq2090/petsc-3.10.2'
**************************ERROR*************************************
Error during compile, check arch-linux2-c-debug/lib/petsc/conf/make.log
Send it and arch-linux2-c-debug/lib/petsc/conf/configure.log to
petsc-maint at mcs.anl.gov
********************************************************************
makefile:30: recipe for target 'all' failed
make: *** [all] Error 1
$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$

Again, the log file is attached herewith.

Note: I have moose installed on my laptop, but I have commented it out in my .bashrc file:
#moose
#if [ -f /opt/moose/environments/moose_profile ]; then
#  . /opt/moose/environments/moose_profile
#fi

Thanks.
Sincerely,
Huq

On Tue, Nov 20, 2018 at 4:59 PM Balay, Satish wrote:

> As the message suggests - you don't have MPI installed - so run:
>
> ./configure --download-hypre --download-mpich
>
> Satish
>
> On Tue, 20 Nov 2018, Fazlul Huq via petsc-users wrote:
>
> > Hello PETSc Developers,
> >
> > I am trying to configure PETSc with ./configure --download-hypre
> > but I got the following message on the terminal:
> >
> > ========================================================================
> >              Configuring PETSc to compile on your system
> > =========================================================================
> > TESTING: check from
> > config.libraries(config/BuildSystem/config/libraries.py:158)
> > *******************************************************************************
> >          UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for
> > details):
> > -------------------------------------------------------------------------------
> > Unable to find mpi in default locations!
> > Perhaps you can specify with --with-mpi-dir=<directory>
> > If you do not want MPI, then give --with-mpi=0
> > You might also consider using --download-mpich instead
> > *******************************************************************************
> >
> > The log file is attached herewith.
> >
> > Thanks.
> > Sincerely,
> > Huq
> >

--
Fazlul Huq
Graduate Research Assistant
Department of Nuclear, Plasma & Radiological Engineering (NPRE)
University of Illinois at Urbana-Champaign (UIUC)
E-mail: huq2090 at gmail.com
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
-------------- next part --------------
A non-text attachment was scrubbed...
Name: make.log
Type: text/x-log
Size: 101932 bytes
Desc: not available
URL:

From balay at mcs.anl.gov Tue Nov 20 18:17:15 2018
From: balay at mcs.anl.gov (Balay, Satish)
Date: Wed, 21 Nov 2018 00:17:15 +0000
Subject: [petsc-users] Problem to configure
In-Reply-To:
References:
Message-ID:

Try:

rm -rf arch-linux2-c-debug

And rebuild petsc, i.e. redo configure and make. We don't know why this happened.

Satish

On Tue, 20 Nov 2018, Fazlul Huq via petsc-users wrote:

> Thanks.
> ./configure --download-hypre --download-mpich works well.
> But when I run the following:
> make PETSC_DIR=/home/huq2090/petsc-3.10.2 PETSC_ARCH=arch-linux2-c-debug all
>
> I got the following errors:
> $$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$
> CLINKER arch-linux2-c-debug/lib/libpetsc.so.3.10.2
> arch-linux2-c-debug/obj/mat/impls/baij/seq/baij.o: In function
> `MatCreate_SeqBAIJ':
> /home/huq2090/petsc-3.10.2/src/mat/impls/baij/seq/baij.c:3049: undefined
> reference to `MatConvert_AIJ_HYPRE'
> /usr/bin/ld: arch-linux2-c-debug/obj/mat/impls/baij/seq/baij.o: relocation
> R_X86_64_PC32 against undefined hidden symbol `MatConvert_AIJ_HYPRE' can
> not be used when making a shared object
> /usr/bin/ld: final link failed: Bad value
> collect2: error: ld returned 1 exit status
> gmakefile:86: recipe for target
> 'arch-linux2-c-debug/lib/libpetsc.so.3.10.2' failed
> make[2]: *** [arch-linux2-c-debug/lib/libpetsc.so.3.10.2] Error 1
> make[2]: Leaving directory '/home/huq2090/petsc-3.10.2'
> /home/huq2090/petsc-3.10.2/lib/petsc/conf/rules:81: recipe for target
> 'gnumake' failed
> make[1]: *** [gnumake] Error 2
> make[1]: Leaving directory '/home/huq2090/petsc-3.10.2'
> **************************ERROR*************************************
> Error during compile, check arch-linux2-c-debug/lib/petsc/conf/make.log
> Send it and arch-linux2-c-debug/lib/petsc/conf/configure.log to
> petsc-maint at mcs.anl.gov
> ********************************************************************
> makefile:30: recipe for target 'all' failed
> make: *** [all] Error 1
> $$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$
>
> Again, the log file is attached herewith.
>
> Note: I have moose installed on my laptop, but I have commented it out in
> my .bashrc file:
> #moose
> #if [ -f /opt/moose/environments/moose_profile ]; then
> #  . /opt/moose/environments/moose_profile
> #fi
>
> Thanks.
> Sincerely,
> Huq
>
> On Tue, Nov 20, 2018 at 4:59 PM Balay, Satish wrote:
>
> > As the message suggests - you don't have MPI installed - so run:
> >
> > ./configure --download-hypre --download-mpich
> >
> > Satish
> >
> > On Tue, 20 Nov 2018, Fazlul Huq via petsc-users wrote:
> >
> > > Hello PETSc Developers,
> > >
> > > I am trying to configure PETSc with ./configure --download-hypre
> > > but I got the following message on the terminal:
> > >
> > > ========================================================================
> > >              Configuring PETSc to compile on your system
> > > =========================================================================
> > > TESTING: check from
> > > config.libraries(config/BuildSystem/config/libraries.py:158)
> > > *******************************************************************************
> > >          UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for
> > > details):
> > > -------------------------------------------------------------------------------
> > > Unable to find mpi in default locations!
> > > Perhaps you can specify with --with-mpi-dir=<directory>
> > > If you do not want MPI, then give --with-mpi=0
> > > You might also consider using --download-mpich instead
> > > *******************************************************************************
> > >
> > > The log file is attached herewith.
> > >
> > > Thanks.
> > > Sincerely,
> > > Huq
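For anyone hitting the same pair of failures, Satish's two replies combine into the following clean-rebuild sequence (a sketch assembled from the commands and paths quoted in this thread; substitute your own PETSC_DIR and PETSC_ARCH):

cd /home/huq2090/petsc-3.10.2
rm -rf arch-linux2-c-debug    # discard the stale build tree from the first configure
./configure --download-hypre --download-mpich
make PETSC_DIR=/home/huq2090/petsc-3.10.2 PETSC_ARCH=arch-linux2-c-debug all

The rm -rf step is the important one here: the first configure ran without hypre, and stale objects left in the old arch directory are a plausible cause of the MatConvert_AIJ_HYPRE link failure above.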
From mfadams at lbl.gov Tue Nov 20 22:08:00 2018
From: mfadams at lbl.gov (Mark Adams)
Date: Tue, 20 Nov 2018 23:08:00 -0500
Subject: [petsc-users] Nullspace
In-Reply-To: <101b8661-a532-ab01-44b5-bb7b840604d6@berkeley.edu>
References: <53d5eccd-165f-4260-85c6-481ef6ec8531@berkeley.edu> <101b8661-a532-ab01-44b5-bb7b840604d6@berkeley.edu>
Message-ID:

Yes, that's right.

Also, it is best to send this to the list, like petsc-users, for a few reasons. It is useful for the core developers, as well as any lurkers really, to casually see what people are doing. And the threads are archived and are a good source of deep documentation; e.g., one could search for these methods to see any discussion of them if you want to go beyond the documentation.

Your question is a good question. It could be a FAQ (even if not that frequent).

Mark

On Tue, Nov 20, 2018 at 10:32 PM Sanjay Govindjee wrote:

> Thanks. I will give this a try and see what makes best sense. In the
> thermo-elastic case I think it should be easy.
> The vectors are the 3 translations, the 3 rotations (all on the first 3
> dofs), then a 'translation' on the 4th dof.
>
> I'll let you know if it works or not.
>
> -sanjay
>
> -------------------------------------------------------------------
> Sanjay Govindjee, PhD, PE
> Horace, Dorothy, and Katherine Johnson Professor in Engineering
>
> 779 Davis Hall
> University of California
> Berkeley, CA 94720-1710
>
> Voice: +1 510 642 6060
> FAX: +1 510 643 5264
> s_g at berkeley.edu
> http://faculty.ce.berkeley.edu/sanjay
> -------------------------------------------------------------------
>
> Books:
>
> Engineering Mechanics of Deformable Solids http://amzn.com/0199651647
> Engineering Mechanics 3 (Dynamics) 2nd Edition http://amzn.com/3642537111
> Engineering Mechanics 3, Supplementary Problems: Dynamics http://www.amzn.com/B00SOXN8JU
>
> NSF NHERI SimCenter https://simcenter.designsafe-ci.org/
> -------------------------------------------------------------------
>
> On 11/20/18 10:39 AM, Mark Adams wrote:
>
> PCSetCoordinates is essentially deprecated for this reason. Fragile.
>
> The recommended way is to use MatSetNearNullSpace. There is a helper
> function MatNullSpaceCreateRigidBody that takes coordinates and is
> basically a reroll of PCSetCoordinates, but you can not use that directly
> here.
>
> 1) You can create the null space yourself and give it to the matrix. Here
> is a sample code:
>
> if (nearnulldim) {
>   MatNullSpace nullsp;
>   Vec          *nullvecs;
>   PetscInt     i;
>   ierr = PetscMalloc1(nearnulldim,&nullvecs);CHKERRQ(ierr);
>   for (i=0; i<nearnulldim; i++) {
>     ierr = VecCreate(PETSC_COMM_WORLD,&nullvecs[i]);CHKERRQ(ierr);
>     ierr = VecLoad(nullvecs[i],viewer);CHKERRQ(ierr);
>   }
>   ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_FALSE,nearnulldim,nullvecs,&nullsp);CHKERRQ(ierr);
>   ierr = MatSetNearNullSpace(A,nullsp);CHKERRQ(ierr);
>   for (i=0; i<nearnulldim; i++) {ierr = VecDestroy(&nullvecs[i]);CHKERRQ(ierr);}
>   ierr = PetscFree(nullvecs);CHKERRQ(ierr);
>   ierr = MatNullSpaceDestroy(&nullsp);CHKERRQ(ierr);
> }
>
> You have nearnulldim = 2*D + 1. You need to replace VecLoad here with
> code that sets the null space vectors explicitly.
>
> 2) Here is another way that uses PETSc's RBM constructor. This is the
> recommended way to set the null space now with PETSc's RBMs:
>
> if (use_nearnullspace) {
>   MatNullSpace matnull;
>   Vec          vec_coords;
>   PetscScalar  *c;
>   ierr = VecCreate(MPI_COMM_WORLD,&vec_coords);CHKERRQ(ierr);
>   ierr = VecSetBlockSize(vec_coords,3);CHKERRQ(ierr);
>   ierr = VecSetSizes(vec_coords,m,PETSC_DECIDE);CHKERRQ(ierr);
>   ierr = VecSetUp(vec_coords);CHKERRQ(ierr);
>   ierr = VecGetArray(vec_coords,&c);CHKERRQ(ierr);
>   for (i=0; i<m; i++) c[i] = coords[i]; /* Copy since Scalar type might be Complex */
>   ierr = VecRestoreArray(vec_coords,&c);CHKERRQ(ierr);
>   ierr = MatNullSpaceCreateRigidBody(vec_coords,&matnull);CHKERRQ(ierr);
>   ierr = MatSetNearNullSpace(Amat,matnull);CHKERRQ(ierr);
>   ierr = MatNullSpaceDestroy(&matnull);CHKERRQ(ierr);
>   ierr = VecDestroy(&vec_coords);CHKERRQ(ierr);
> } else {
>
> Now there is a method to get the vectors out of the null space:
> https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatNullSpaceGetVecs.html
>
> Now you can create the 6 RBMs with this second code and then get the
> vectors with MatNullSpaceGetVecs -- giving MatNullSpaceGetVecs a vector
> array of size 7. This will give you the 6 RBMs. Now you just need to
> manually create and set the 7th vector with a basis vector for your
> thermal dof. I assume that this is the correct null space mathematically.
> Then use that in MatNullSpaceCreate.
>
> You could use either approach.
>
> Note, I don't know if all this has been tested well with > 6 null
> vectors, so we may need to fix bug(s).
>
> Mark
>
> On Tue, Nov 20, 2018 at 1:03 PM Sanjay Govindjee wrote:
>
>> Mark,
>> In parFEAP we have been using GAMG together with PCSetCoordinates.
>> This works fine for mechanical problems, but one of our users wants to
>> solve thermo-elastic problems (4 dofs per node). The matrix now has a
>> block size of 4 and GAMG is complaining:
>>
>> [0]PETSC ERROR: Petsc has generated inconsistent data
>> [0]PETSC ERROR: Don't know how to create null space for ndm=3, ndf=4. Use MatSetNearNullSpace.
>> [0]PETSC ERROR: #1 PCSetCoordinates_AGG() line 196 in /home/user/Software/petsc-3.9.2/src/ksp/pc/impls/gamg/agg.c
>> [0]PETSC ERROR: #2 PCSetCoordinates() line 1876 in /home/user/Software/petsc-3.9.2/src/ksp/pc/interface/precon.c
>>
>> I looked at the docs for how to try to set the null space, but it was not
>> fully clear to me what is the best way to do this. Can you give me some
>> hints here?
>>
>> -sanjay
>>
>> --
>> -------------------------------------------------------------------
>> Sanjay Govindjee, PhD, PE
>> Horace, Dorothy, and Katherine Johnson Professor in Engineering
>>
>> 779 Davis Hall
>> University of California
>> Berkeley, CA 94720-1710
>>
>> Voice: +1 510 642 6060
>> FAX: +1 510 643 5264
>> s_g at berkeley.edu
>> http://faculty.ce.berkeley.edu/sanjay
>> -------------------------------------------------------------------
>>
>> Books:
>>
>> Engineering Mechanics of Deformable Solids http://amzn.com/0199651647
>> Engineering Mechanics 3 (Dynamics) 2nd Edition http://amzn.com/3642537111
>> Engineering Mechanics 3, Supplementary Problems: Dynamics http://www.amzn.com/B00SOXN8JU
>>
>> NSF NHERI SimCenter https://simcenter.designsafe-ci.org/
>> -------------------------------------------------------------------
>>
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
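To make Mark's recipe concrete for Sanjay's 4-dof (u,v,w,T) problem, here is a minimal untested sketch that combines the two steps: build the 6 rigid-body modes from the nodal coordinates, expand them to the blocksize-4 layout, and append a constant-temperature vector as the 7th near-null vector. The names nloc (number of locally owned nodes), coords (array of 3*nloc nodal coordinates), and A (the assembled blocksize-4 matrix) are placeholders, not from the thread:

/* Sketch (editor's illustration, untested): 7-vector near-null space
 * for 3D thermo-elasticity with 4 dofs/node (u,v,w,T).
 * Placeholders: nloc, coords[3*nloc], A. */
MatNullSpace      rbmsp, nullsp;
Vec               vcoords, nullvecs[7];
const Vec         *rbm;
PetscScalar       *c;
const PetscScalar *r;
PetscInt          i, j, nrbm;
PetscBool         hasconst;
PetscErrorCode    ierr;

/* 6 rigid-body modes on the displacement dofs, as in Mark's snippet 2) */
ierr = VecCreate(PETSC_COMM_WORLD,&vcoords);CHKERRQ(ierr);
ierr = VecSetBlockSize(vcoords,3);CHKERRQ(ierr);
ierr = VecSetSizes(vcoords,3*nloc,PETSC_DECIDE);CHKERRQ(ierr);
ierr = VecSetUp(vcoords);CHKERRQ(ierr);
ierr = VecGetArray(vcoords,&c);CHKERRQ(ierr);
for (i=0; i<3*nloc; i++) c[i] = coords[i];
ierr = VecRestoreArray(vcoords,&c);CHKERRQ(ierr);
ierr = MatNullSpaceCreateRigidBody(vcoords,&rbmsp);CHKERRQ(ierr);
ierr = MatNullSpaceGetVecs(rbmsp,&hasconst,&nrbm,&rbm);CHKERRQ(ierr); /* nrbm == 6 */

/* Expand each RBM to the 4-dof layout: zero on the thermal dof. Padding
 * with zeros preserves the orthonormality of the 6 RBM vectors. */
for (j=0; j<nrbm; j++) {
  ierr = VecCreate(PETSC_COMM_WORLD,&nullvecs[j]);CHKERRQ(ierr);
  ierr = VecSetBlockSize(nullvecs[j],4);CHKERRQ(ierr);
  ierr = VecSetSizes(nullvecs[j],4*nloc,PETSC_DECIDE);CHKERRQ(ierr);
  ierr = VecSetUp(nullvecs[j]);CHKERRQ(ierr);
  ierr = VecGetArray(nullvecs[j],&c);CHKERRQ(ierr);
  ierr = VecGetArrayRead(rbm[j],&r);CHKERRQ(ierr);
  for (i=0; i<nloc; i++) {
    c[4*i+0] = r[3*i+0];
    c[4*i+1] = r[3*i+1];
    c[4*i+2] = r[3*i+2];
    c[4*i+3] = 0.0;
  }
  ierr = VecRestoreArrayRead(rbm[j],&r);CHKERRQ(ierr);
  ierr = VecRestoreArray(nullvecs[j],&c);CHKERRQ(ierr);
}

/* 7th vector: Sanjay's 'translation' on the 4th dof (constant temperature),
 * zero on the displacement dofs, hence orthogonal to the expanded RBMs */
ierr = VecDuplicate(nullvecs[0],&nullvecs[6]);CHKERRQ(ierr);
ierr = VecGetArray(nullvecs[6],&c);CHKERRQ(ierr);
for (i=0; i<nloc; i++) {
  c[4*i+0] = c[4*i+1] = c[4*i+2] = 0.0;
  c[4*i+3] = 1.0;
}
ierr = VecRestoreArray(nullvecs[6],&c);CHKERRQ(ierr);
for (j=0; j<7; j++) {ierr = VecNormalize(nullvecs[j],NULL);CHKERRQ(ierr);} /* MatNullSpaceCreate expects unit vectors */

ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_FALSE,7,nullvecs,&nullsp);CHKERRQ(ierr);
ierr = MatSetNearNullSpace(A,nullsp);CHKERRQ(ierr);
ierr = MatNullSpaceDestroy(&nullsp);CHKERRQ(ierr);
ierr = MatNullSpaceDestroy(&rbmsp);CHKERRQ(ierr); /* also releases the rbm[] vectors */
for (j=0; j<7; j++) {ierr = VecDestroy(&nullvecs[j]);CHKERRQ(ierr);}
ierr = VecDestroy(&vcoords);CHKERRQ(ierr);

This replaces the PCSetCoordinates call entirely. As Mark cautions above, more than 6 near-null vectors is less well tested in GAMG, so convergence should be checked against the plain elasticity case.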
From dalcinl at gmail.com Wed Nov 21 09:46:20 2018
From: dalcinl at gmail.com (Lisandro Dalcin)
Date: Wed, 21 Nov 2018 18:46:20 +0300
Subject: [petsc-users] [WARNING: UNSCANNABLE EXTRACTION FAILED]Re: I want to experiment in bed with you.
In-Reply-To: <1DE7C132C1153B98444108B591301B70@brasserieardennaise.be>
References: <1DE7C132C1153B98444108B591301B70@brasserieardennaise.be>
Message-ID:

Dear Nina,

Although we are supposed to be empathetic with you, I'm afraid you are violating our Code of Conduct.

Warm regards,
https://ahknouz.ru/?ahknouz > ahknouz https://cfknxy.ru/?cfknxy cfknxy http://acfntvx.net/?acfntvx > acfntvx https://cdfksuvy.info/?cdfksuvy cdfksuvy > https://abjkorx.ru/?abjkorx abjkorx https://afiklqvw.com/?afiklqvw > afiklqvw http://adnovx.biz/?adnovx adnovx https://fglstyz.info/?fglstyz > fglstyz https://ghikruy.biz/?ghikruy ghikruy http://ghnopsvz.biz/?ghnopsvz > ghnopsvz http://abcgjnrw.com/?abcgjnrw abcgjnrw > https://bijlnyz.ua/?bijlnyz bijlnyz http://ahiqrv.ru/?ahiqrv ahiqrv > https://ajotw.biz/?ajotw ajotw http://bfhipuw.info/?bfhipuw bfhipuw > http://acjmqry.ua/?acjmqry acjmqry http://efjnoqrz.info/?efjnoqrz > efjnoqrz https://achkqst.info/?achkqst achkqst > http://bcfpruyz.ua/?bcfpruyz bcfpruyz http://bcelv.com/?bcelv bcelv > https://dfkoqsu.ua/?dfkoqsu dfkoqsu http://ekopq.ru/?ekopq ekopq > https://abfkrsz.ua/?abfkrsz abfkrsz https://abenxy.net/?abenxy abenxy > http://agiotu.biz/?agiotu agiotu https://bkpqr.info/?bkpqr bkpqr > http://ahijq.ua/?ahijq ahijq http://eimps.ru/?eimps eimps > https://abdstwz.net/?abdstwz abdstwz https://cdhjlu.biz/?cdhjlu cdhjlu > http://gmrstyz.info/?gmrstyz gmrstyz https://djsvwyz.ua/?djsvwyz djsvwyz > http://begjmorv.net/?begjmorv begjmorv http://ahlmoptz.biz/?ahlmoptz > ahlmoptz http://dekoqx.ru/?dekoqx dekoqx http://fiprs.net/?fiprs fiprs > http://cjklmr.biz/?cjklmr cjklmr https://aefgpquv.biz/?aefgpquv aefgpquv > https://jlmopswx.net/?jlmopswx jlmopswx https://bfhjlsw.biz/?bfhjlsw > bfhjlsw https://iknortux.com/?iknortux iknortux > http://abhkoprv.biz/?abhkoprv abhkoprv https://cjnrwyz.ua/?cjnrwyz > cjnrwyz http://djmnrvwz.info/?djmnrvwz djmnrvwz > http://adgkrtyz.info/?adgkrtyz adgkrtyz https://bdhjsuv.ua/?bdhjsuv > bdhjsuv http://acfgjor.biz/?acfgjor acfgjor http://cfilrtw.info/?cfilrtw > cfilrtw https://fhijz.com/?fhijz fhijz http://hkmntxy.ru/?hkmntxy hkmntxy > http://fhilsuy.ua/?fhilsuy fhilsuy http://jmnpwx.ru/?jmnpwx jmnpwx > http://gikltvwy.ru/?gikltvwy gikltvwy http://jopqy.biz/?jopqy jopqy > http://bdlqry.info/?bdlqry bdlqry https://ejlvwz.com/?ejlvwz ejlvwz > http://ceijvz.net/?ceijvz ceijvz http://cfijntyz.info/?cfijntyz cfijntyz > http://efilotuv.com/?efilotuv efilotuv http://cfmpyz.com/?cfmpyz cfmpyz > http://dhpqz.com/?dhpqz dhpqz http://dlpxz.com/?dlpxz dlpxz > http://aefjlyz.net/?aefjlyz aefjlyz https://ceijorz.info/?ceijorz ceijorz > http://dkmnuvx.info/?dkmnuvx dkmnuvx https://kprswx.net/?kprswx kprswx > https://dgjpwz.ru/?dgjpwz dgjpwz https://abfjlnt.ru/?abfjlnt abfjlnt > http://efgips.com/?efgips efgips http://cehnpu.ru/?cehnpu cehnpu > https://fstuw.net/?fstuw fstuw http://dmorz.net/?dmorz dmorz > https://celmrwz.info/?celmrwz celmrwz https://jprvx.com/?jprvx jprvx > https://cefjkxz.biz/?cefjkxz cefjkxz http://behjlv.biz/?behjlv behjlv > http://bdiku.net/?bdiku bdiku https://ghnpxz.ru/?ghnpxz ghnpxz > https://bcpqvwz.ua/?bcpqvwz bcpqvwz https://gijlnqrs.biz/?gijlnqrs > gijlnqrs http://adfjlrsz.info/?adfjlrsz adfjlrsz > https://afiptvz.biz/?afiptvz afiptvz http://acikors.com/?acikors acikors > https://dfmpq.ru/?dfmpq dfmpq http://abhjmsyz.ru/?abhjmsyz abhjmsyz > https://dejkq.com/?dejkq dejkq http://abcltx.com/?abcltx abcltx > https://bdfjopu.net/?bdfjopu bdfjopu http://bipvy.com/?bipvy bipvy > http://dlqsx.biz/?dlqsx dlqsx http://inqrstv.info/?inqrstv inqrstv > https://dknwz.ru/?dknwz dknwz http://cdgimqsu.ru/?cdgimqsu cdgimqsu > https://akmotuv.info/?akmotuv akmotuv http://kmntvyz.ru/?kmntvyz kmntvyz > https://bimxz.net/?bimxz bimxz http://afjoqyz.ru/?afjoqyz afjoqyz > https://eikuwy.ua/?eikuwy eikuwy 
https://bmptw.biz/?bmptw bmptw > http://fgltwx.com/?fgltwx fgltwx http://chlnpqrs.com/?chlnpqrs chlnpqrs > https://abhux.net/?abhux abhux http://adfhmw.info/?adfhmw adfhmw > http://bgilt.com/?bgilt bgilt https://cefotu.biz/?cefotu cefotu > https://adfruvwy.biz/?adfruvwy adfruvwy https://fgoqstu.info/?fgoqstu > fgoqstu https://cdilnqvy.info/?cdilnqvy cdilnqvy > https://adghruw.ua/?adghruw adghruw https://ehlnrsuy.biz/?ehlnrsuy > ehlnrsuy http://abfijlnr.info/?abfijlnr abfijlnr https://jkpty.biz/?jkpty > jkpty https://cdlpv.biz/?cdlpv cdlpv https://ginoxy.ua/?ginoxy ginoxy > https://ciknpquy.net/?ciknpquy ciknpquy https://egijstw.biz/?egijstw > egijstw http://cdikopvz.info/?cdikopvz cdikopvz > https://egjklnvy.info/?egjklnvy egjklnvy https://fjkmvw.info/?fjkmvw > fjkmvw http://dfkmv.com/?dfkmv dfkmv https://efgloptu.ru/?efgloptu > efgloptu https://fgotz.ua/?fgotz fgotz http://efhklv.info/?efhklv efhklv > https://befmrwy.info/?befmrwy befmrwy https://bklnsvy.biz/?bklnsvy > bklnsvy http://flmoz.ru/?flmoz flmoz http://aiopqrtu.biz/?aiopqrtu > aiopqrtu https://fjrsty.info/?fjrsty fjrsty http://behimuwy.ru/?behimuwy > behimuwy https://iknqvz.com/?iknqvz iknqvz http://cdemp.ru/?cdemp cdemp > http://eglpqsvz.net/?eglpqsvz eglpqsvz https://dluvz.net/?dluvz dluvz > http://beghsuy.net/?beghsuy beghsuy http://ekmqsvz.net/?ekmqsvz ekmqsvz > http://ehinz.net/?ehinz ehinz http://gipvz.com/?gipvz gipvz > http://efhlwxyz.biz/?efhlwxyz efhlwxyz http://blors.ru/?blors blors > http://dnqry.info/?dnqry dnqry https://cghnrux.biz/?cghnrux cghnrux > http://dfinqtwx.ru/?dfinqtwx dfinqtwx http://efotxz.info/?efotxz efotxz > http://aceglpqu.net/?aceglpqu aceglpqu https://abilr.ua/?abilr abilr > http://bcgqw.ru/?bcgqw bcgqw http://cdinqv.biz/?cdinqv cdinqv > https://cdfkp.com/?cdfkp cdfkp https://aelouvz.ua/?aelouvz aelouvz > http://dklstuz.net/?dklstuz dklstuz https://cejqw.ru/?cejqw cejqw > https://bdgkuy.com/?bdgkuy bdgkuy https://hkoprvxy.ru/?hkoprvxy hkoprvxy > https://filntx.net/?filntx filntx http://cfgsty.ua/?cfgsty cfgsty > https://bfgptx.net/?bfgptx bfgptx http://belpsu.biz/?belpsu belpsu > http://deilmoy.biz/?deilmoy deilmoy https://kqruv.info/?kqruv kqruv > https://acdst.net/?acdst acdst http://glmvy.ru/?glmvy glmvy > https://abcmnu.net/?abcmnu abcmnu http://adehinxy.ru/?adehinxy adehinxy > http://ahmnwy.biz/?ahmnwy ahmnwy http://morst.ru/?morst morst > http://bfils.ua/?bfils bfils http://abcmnvw.biz/?abcmnvw abcmnvw > https://bchlqsux.net/?bchlqsux bchlqsux http://fhikpqvy.net/?fhikpqvy > fhikpqvy https://bejkqu.info/?bejkqu bejkqu https://cdehnsxy.ru/?cdehnsxy > cdehnsxy http://bcdhnopx.info/?bcdhnopx bcdhnopx https://elpuz.ua/?elpuz > elpuz https://bdeftvw.info/?bdeftvw bdeftvw http://dgmnou.ru/?dgmnou > dgmnou https://aiklnvx.com/?aiklnvx aiklnvx http://ijoqrsvw.com/?ijoqrsvw > ijoqrsvw http://cdehmquy.biz/?cdehmquy cdehmquy https://gjoqx.net/?gjoqx > gjoqx https://bhlnpqvw.net/?bhlnpqvw bhlnpqvw > http://bdfiuwyz.net/?bdfiuwyz bdfiuwyz https://abesty.info/?abesty abesty > https://gjlmtuv.net/?gjlmtuv gjlmtuv http://ivxyz.info/?ivxyz ivxyz > http://aeops.info/?aeops aeops http://acdlv.ru/?acdlv acdlv > http://fgimqw.net/?fgimqw fgimqw https://fgkrxz.info/?fgkrxz fgkrxz > http://abfhnvxy.ua/?abfhnvxy abfhnvxy https://bcdeipsy.ru/?bcdeipsy > bcdeipsy http://bdimvwy.ru/?bdimvwy bdimvwy http://fnpqry.info/?fnpqry > fnpqry https://aklstwx.ru/?aklstwx aklstwx http://bjloq.com/?bjloq bjloq > http://dkmsuv.info/?dkmsuv dkmsuv http://afinptuw.ru/?afinptuw afinptuw > https://bijnqvx.biz/?bijnqvx bijnqvx 
https://behlxz.info/?behlxz behlxz > http://bgjkquyz.biz/?bgjkquyz bgjkquyz http://aehnpqw.net/?aehnpqw > aehnpqw http://cghjmpqu.com/?cghjmpqu cghjmpqu http://aboruy.com/?aboruy > aboruy http://fjryz.ru/?fjryz fjryz https://cdnqy.info/?cdnqy cdnqy > http://bfgjmost.net/?bfgjmost bfgjmost https://bhnou.ua/?bhnou bhnou > https://dhilm.com/?dhilm dhilm http://abijmrtx.net/?abijmrtx abijmrtx > http://acefnv.com/?acefnv acefnv http://gilmtu.com/?gilmtu gilmtu > https://egjltuz.biz/?egjltuz egjltuz http://afghpsv.net/?afghpsv afghpsv > http://chkltvx.ua/?chkltvx chkltvx http://befkyz.com/?befkyz befkyz > https://bcfhmprw.com/?bcfhmprw bcfhmprw https://bjklpqs.info/?bjklpqs > bjklpqs http://cegpt.ua/?cegpt cegpt https://klntvw.biz/?klntvw klntvw > https://ejkmq.ua/?ejkmq ejkmq https://bgqvx.info/?bgqvx bgqvx > https://dqrsuvw.com/?dqrsuvw dqrsuvw http://bcekrsu.net/?bcekrsu bcekrsu > https://dlmqwy.com/?dlmqwy dlmqwy http://abhnsvz.ru/?abhnsvz abhnsvz > https://bdfhjoqr.info/?bdfhjoqr bdfhjoqr http://ceijlprw.net/?ceijlprw > ceijlprw https://ijlqsw.net/?ijlqsw ijlqsw https://einrvxz.net/?einrvxz > einrvxz https://ikmtxz.net/?ikmtxz ikmtxz http://adkmqsz.biz/?adkmqsz > adkmqsz http://bchnox.com/?bchnox bchnox https://ckptvw.com/?ckptvw > ckptvw http://dgops.com/?dgops dgops https://bfnrtxy.info/?bfnrtxy > bfnrtxy http://bdnpstu.info/?bdnpstu bdnpstu http://dlnprsx.ua/?dlnprsx > dlnprsx https://gjnrx.com/?gjnrx gjnrx http://abcsvz.com/?abcsvz abcsvz > http://gknrsv.ru/?gknrsv gknrsv https://fkmouxz.com/?fkmouxz fkmouxz > http://bcpsx.com/?bcpsx bcpsx http://bckoqx.info/?bckoqx bckoqx > https://bcdepq.ru/?bcdepq bcdepq http://kqswy.net/?kqswy kqswy > https://bdgjnuy.ru/?bdgjnuy bdgjnuy https://cgiklsvz.info/?cgiklsvz > cgiklsvz https://bfgnprwy.net/?bfgnprwy bfgnprwy > https://bejmnqrw.com/?bejmnqrw bejmnqrw http://gilopqwy.ua/?gilopqwy > gilopqwy https://fjopry.ru/?fjopry fjopry http://kmrvwy.biz/?kmrvwy > kmrvwy https://eglotuv.net/?eglotuv eglotuv http://cdhnsvwx.com/?cdhnsvwx > cdhnsvwx http://ahmqv.com/?ahmqv ahmqv https://fkqxy.ua/?fkqxy fkqxy > https://gjoquvwx.net/?gjoquvwx gjoquvwx https://abcfouw.biz/?abcfouw > abcfouw https://abcekn.ru/?abcekn abcekn http://ijnruvz.com/?ijnruvz > ijnruvz http://mqruv.ua/?mqruv mqruv https://afklmu.biz/?afklmu afklmu > https://abcgptyz.biz/?abcgptyz abcgptyz https://bfjsuz.ua/?bfjsuz bfjsuz > http://ahmpsx.ua/?ahmpsx ahmpsx https://iklns.com/?iklns iklns > https://hlnvx.ua/?hlnvx hlnvx http://adiklmp.com/?adiklmp adiklmp > https://fghprs.biz/?fghprs fghprs http://acmopvw.ru/?acmopvw acmopvw > http://cdmuw.ua/?cdmuw cdmuw http://bfkpwxz.biz/?bfkpwxz bfkpwxz > http://lqrtwy.ru/?lqrtwy lqrtwy https://aclmruvw.biz/?aclmruvw aclmruvw > http://bcfhmsz.net/?bcfhmsz bcfhmsz http://bcgknpry.biz/?bcgknpry > bcgknpry https://bflmnoq.info/?bflmnoq bflmnoq https://bepsy.info/?bepsy > bepsy http://gknosuw.net/?gknosuw gknosuw http://ajoruvz.com/?ajoruvz > ajoruvz http://cotxz.ru/?cotxz cotxz https://deintwx.net/?deintwx deintwx > https://eghijkuy.com/?eghijkuy eghijkuy http://dginwx.ua/?dginwx dginwx > http://aenoxy.net/?aenoxy aenoxy https://dghlpuvx.ru/?dghlpuvx dghlpuvx > http://cefor.info/?cefor cefor https://afioz.biz/?afioz afioz > https://deklmuvw.com/?deklmuvw deklmuvw http://cfgimox.biz/?cfgimox > cfgimox http://cinquvz.ua/?cinquvz cinquvz https://acilsy.ua/?acilsy > acilsy http://bfhlnop.net/?bfhlnop bfhlnop https://fijluz.net/?fijluz > fijluz https://dfjnostv.ua/?dfjnostv dfjnostv https://aginsty.net/?aginsty > aginsty http://iknxyz.biz/?iknxyz iknxyz 
http://agopu.net/?agopu agopu > http://bdkopst.ua/?bdkopst bdkopst https://aghlox.net/?aghlox aghlox > https://cdgpu.com/?cdgpu cdgpu https://jkmpuwyz.net/?jkmpuwyz jkmpuwyz > http://ijlmuvw.com/?ijlmuvw ijlmuvw https://gkmrtvwy.biz/?gkmrtvwy > gkmrtvwy http://aefhty.info/?aefhty aefhty http://afjklsux.net/?afjklsux > afjklsux http://mprvyz.biz/?mprvyz mprvyz https://bdjkpx.ua/?bdjkpx > bdjkpx http://aciknovw.net/?aciknovw aciknovw https://aftuxy.ua/?aftuxy > aftuxy https://ajnpvxyz.ua/?ajnpvxyz ajnpvxyz http://fijqrv.com/?fijqrv > fijqrv https://ahmquvw.info/?ahmquvw ahmquvw > https://ghklmqvy.com/?ghklmqvy ghklmqvy http://aehkty.com/?aehkty aehkty > https://aclpsuz.ru/?aclpsuz aclpsuz https://ghoqstvy.com/?ghoqstvy > ghoqstvy https://dghnqrty.ru/?dghnqrty dghnqrty https://agorx.com/?agorx > agorx https://adejrv.ru/?adejrv adejrv http://hkopuvwz.net/?hkopuvwz > hkopuvwz http://bcdeghvx.ru/?bcdeghvx bcdeghvx > http://bdfgruvy.com/?bdfgruvy bdfgruvy https://cmuvy.biz/?cmuvy cmuvy > https://abjkmp.ua/?abjkmp abjkmp http://abcfirvz.ru/?abcfirvz abcfirvz > http://clqruwxy.com/?clqruwxy clqruwxy https://fhikoq.net/?fhikoq fhikoq > https://bfgnsvwz.biz/?bfgnsvwz bfgnsvwz http://fhjkor.biz/?fhjkor fhjkor > https://dfgkouyz.com/?dfgkouyz dfgkouyz https://acfhmqtz.info/?acfhmqtz > acfhmqtz https://ejmnpt.biz/?ejmnpt ejmnpt http://dfgnrtu.ru/?dfgnrtu > dfgnrtu https://diknswx.biz/?diknswx diknswx https://hjnsz.info/?hjnsz > hjnsz https://cdjlq.ua/?cdjlq cdjlq https://iowxy.ru/?iowxy iowxy > https://fhjltux.net/?fhjltux fhjltux http://ablmv.com/?ablmv ablmv > http://bdiorswz.ru/?bdiorswz bdiorswz http://cnopr.ru/?cnopr cnopr > http://adgopz.ru/?adgopz adgopz http://bdefimsu.ru/?bdefimsu bdefimsu > http://ghmqr.net/?ghmqr ghmqr https://cdetv.ru/?cdetv cdetv > https://bdkortu.biz/?bdkortu bdkortu http://djopv.biz/?djopv djopv > http://aiknorw.biz/?aiknorw aiknorw https://bfmqr.net/?bfmqr bfmqr > http://klmox.com/?klmox klmox https://fimqsv.biz/?fimqsv fimqsv > https://fkopq.ru/?fkopq fkopq http://aglpx.com/?aglpx aglpx > http://dfhquv.biz/?dfhquv dfhquv https://bfgjoq.com/?bfgjoq bfgjoq > https://cklnz.ru/?cklnz cklnz http://cfijqrv.net/?cfijqrv cfijqrv > https://aekuvxz.net/?aekuvxz aekuvxz https://ghijkstx.info/?ghijkstx > ghijkstx http://abilmsz.info/?abilmsz abilmsz https://bfhsvyz.ru/?bfhsvyz > bfhsvyz http://eqsuv.ua/?eqsuv eqsuv http://ghinqsx.biz/?ghinqsx ghinqsx > https://acgkmnpu.com/?acgkmnpu acgkmnpu http://elrty.ru/?elrty elrty > https://cdgkoz.net/?cdgkoz cdgkoz http://dimnosw.info/?dimnosw dimnosw > http://abegnwx.biz/?abegnwx abegnwx http://abejnsx.net/?abejnsx abejnsx > https://emnorxz.ru/?emnorxz emnorxz http://abfsz.ua/?abfsz abfsz > https://imqrvy.net/?imqrvy imqrvy https://cdgjmqrv.com/?cdgjmqrv cdgjmqrv > http://bdfostyz.net/?bdfostyz bdfostyz https://lnorxz.biz/?lnorxz lnorxz > http://afgruvw.ru/?afgruvw afgruvw https://cefivyz.com/?cefivyz cefivyz > https://gmqrt.com/?gmqrt gmqrt https://acdimqrw.info/?acdimqrw acdimqrw > http://abchjm.ru/?abchjm abchjm http://alnqs.biz/?alnqs alnqs > https://ijqwz.ua/?ijqwz ijqwz https://adfjstx.biz/?adfjstx adfjstx > https://cghlqt.net/?cghlqt cghlqt http://ijrvyz.ua/?ijrvyz ijrvyz > https://ahluv.net/?ahluv ahluv http://acdfgn.net/?acdfgn acdfgn > https://hijnru.ua/?hijnru hijnru https://ghnuvy.ru/?ghnuvy ghnuvy > http://aeiky.com/?aeiky aeiky https://bdilnov.info/?bdilnov bdilnov > https://cjlnpqru.biz/?cjlnpqru cjlnpqru https://cgjqtv.biz/?cgjqtv cgjqtv > https://abcnsvz.biz/?abcnsvz abcnsvz http://abegopsu.biz/?abegopsu > abegopsu 
https://bcmstuvy.net/?bcmstuvy bcmstuvy > https://bhjknsuz.net/?bhjknsuz bhjknsuz http://aceklno.info/?aceklno > aceklno http://diotvyz.ru/?diotvyz diotvyz https://aefkmruw.info/?aefkmruw > aefkmruw https://cdghlmnz.info/?cdghlmnz cdghlmnz > https://bcgijmow.net/?bcgijmow bcgijmow http://moruwx.ua/?moruwx moruwx > https://bhjnrt.ru/?bhjnrt bhjnrt https://eimoptu.ua/?eimoptu eimoptu > https://achptv.info/?achptv achptv https://boptux.biz/?boptux boptux > http://acmovz.ua/?acmovz acmovz https://abfmnpw.info/?abfmnpw abfmnpw > https://afiknvz.biz/?afiknvz afiknvz http://ehnps.ua/?ehnps ehnps > https://dnpuw.biz/?dnpuw dnpuw https://abflmns.biz/?abflmns abflmns > https://fmuyz.biz/?fmuyz fmuyz http://bejkvz.info/?bejkvz bejkvz > https://dfkmnyz.ua/?dfkmnyz dfkmnyz http://cfgnsvx.biz/?cfgnsvx cfgnsvx > http://abdjsw.ru/?abdjsw abdjsw https://abelrvx.biz/?abelrvx abelrvx > https://abjmoy.ru/?abjmoy abjmoy http://hmnqz.net/?hmnqz hmnqz > https://dhkntv.ru/?dhkntv dhkntv http://bgqsyz.biz/?bgqsyz bgqsyz > http://eghkmsux.net/?eghkmsux eghkmsux https://acehlrsv.info/?acehlrsv > acehlrsv https://lnpruvy.com/?lnpruvy lnpruvy http://cefmtu.ua/?cefmtu > cefmtu https://hjnqrt.net/?hjnqrt hjnqrt http://ceiktvw.ru/?ceiktvw > ceiktvw http://cfjmu.info/?cfjmu cfjmu https://cfhmn.com/?cfhmn cfhmn > http://cdfqr.net/?cdfqr cdfqr http://adelmvz.ru/?adelmvz adelmvz > http://abflnsvx.com/?abflnsvx abflnsvx http://bjpqv.biz/?bjpqv bjpqv > https://fjpqux.biz/?fjpqux fjpqux http://fpqtvyz.net/?fpqtvyz fpqtvyz > http://cdflstwz.net/?cdflstwz cdflstwz https://egjostux.ua/?egjostux > egjostux https://cjlnpy.ua/?cjlnpy cjlnpy https://bcfgmz.com/?bcfgmz > bcfgmz http://dfnouz.ua/?dfnouz dfnouz http://cfikqv.net/?cfikqv cfikqv > https://dehkqw.net/?dehkqw dehkqw http://fhkqs.com/?fhkqs fhkqs > http://dijkl.com/?dijkl dijkl http://aeglu.com/?aeglu aeglu > http://dgikmpy.info/?dgikmpy dgikmpy https://brsuz.info/?brsuz brsuz > http://acfglopu.ru/?acfglopu acfglopu https://cdjqx.net/?cdjqx cdjqx > https://aefimst.biz/?aefimst aefimst http://bfikmuw.net/?bfikmuw bfikmuw > http://abhioqvw.ua/?abhioqvw abhioqvw http://abdilmo.net/?abdilmo abdilmo > https://bhlovx.info/?bhlovx bhlovx https://fgvwxz.info/?fgvwxz fgvwxz > http://afhjpqsv.biz/?afhjpqsv afhjpqsv http://deilrsv.com/?deilrsv > deilrsv http://cijlmnt.ru/?cijlmnt cijlmnt https://bfkmtw.ua/?bfkmtw > bfkmtw https://adkxy.ru/?adkxy adkxy https://blpqxy.com/?blpqxy blpqxy > https://efgilor.net/?efgilor efgilor https://abcdky.info/?abcdky abcdky > https://befgikqu.ru/?befgikqu befgikqu https://ipsuvwy.com/?ipsuvwy > ipsuvwy https://fjlsu.biz/?fjlsu fjlsu https://kpquw.com/?kpquw kpquw > https://bfjkoqvy.net/?bfjkoqvy bfjkoqvy http://dijopsz.biz/?dijopsz > dijopsz http://acfhrsz.ua/?acfhrsz acfhrsz https://bfhrstvy.ua/?bfhrstvy > bfhrstvy http://afglw.info/?afglw afglw https://bcjqrvw.net/?bcjqrvw > bcjqrvw https://hiotz.ru/?hiotz hiotz https://abefmswx.biz/?abefmswx > abefmswx http://acgkopyz.ua/?acgkopyz acgkopyz https://fijln.com/?fijln > fijln http://befmquwz.info/?befmquwz befmquwz https://hijory.ua/?hijory > hijory http://cdfhr.info/?cdfhr cdfhr https://ahiksx.ua/?ahiksx ahiksx > https://aghmxy.ru/?aghmxy aghmxy https://hnopqt.net/?hnopqt hnopqt > https://fiklnrsw.ru/?fiklnrsw fiklnrsw http://hknpv.ru/?hknpv hknpv > https://abdfmotw.net/?abdfmotw abdfmotw http://bcfhpvx.biz/?bcfhpvx > bcfhpvx http://beimrty.net/?beimrty beimrty http://dgnsvwy.ua/?dgnsvwy > dgnsvwy https://aghjstvw.com/?aghjstvw aghjstvw http://gpqsuxy.ru/?gpqsuxy > gpqsuxy https://finop.info/?finop 
finop http://bcdhsy.biz/?bcdhsy bcdhsy > http://cfhmnvy.net/?cfhmnvy cfhmnvy https://hjopuw.net/?hjopuw hjopuw > https://akmou.biz/?akmou akmou https://fgkntuw.biz/?fgkntuw fgkntuw > https://adfjsw.com/?adfjsw adfjsw https://bdjmrsty.biz/?bdjmrsty bdjmrsty > http://grtuz.ru/?grtuz grtuz https://cgiqw.info/?cgiqw cgiqw > http://acenpvw.ru/?acenpvw acenpvw https://dmnrz.ru/?dmnrz dmnrz > http://deglqtu.com/?deglqtu deglqtu http://ilpqrtuw.com/?ilpqrtuw > ilpqrtuw https://gnoqvxz.com/?gnoqvxz gnoqvxz https://hijkmsu.ru/?hijkmsu > hijkmsu http://cistxyz.biz/?cistxyz cistxyz https://adklmns.ru/?adklmns > adklmns https://hmqrwy.ua/?hmqrwy hmqrwy https://crsuv.ru/?crsuv crsuv > http://abejkmoz.ua/?abejkmoz abejkmoz http://abjpqu.ua/?abjpqu abjpqu > https://ehjkopuz.info/?ehjkopuz ehjkopuz http://iovwz.ua/?iovwz iovwz > https://emntvwx.net/?emntvwx emntvwx http://eghksx.info/?eghksx eghksx > https://bmnps.com/?bmnps bmnps https://cinopuv.net/?cinopuv cinopuv > http://efijnptz.biz/?efijnptz efijnptz http://cehilnp.net/?cehilnp > cehilnp http://denprw.ru/?denprw denprw http://adglpwz.net/?adglpwz > adglpwz http://dehimosu.biz/?dehimosu dehimosu > https://abchprwy.com/?abchprwy abchprwy http://dlnory.info/?dlnory dlnory > http://cehnwyz.com/?cehnwyz cehnwyz http://aghlopy.ua/?aghlopy aghlopy > http://fhjksuvz.com/?fhjksuvz fhjksuvz http://bijmy.com/?bijmy bijmy > https://adhjw.com/?adhjw adhjw https://emnprwyz.info/?emnprwyz emnprwyz > http://dgmnw.ru/?dgmnw dgmnw https://cdgimruv.com/?cdgimruv cdgimruv > http://bcdjqst.biz/?bcdjqst bcdjqst https://lmquv.ru/?lmquv lmquv > http://degiptv.info/?degiptv degiptv http://hlnvy.ru/?hlnvy hlnvy > http://gklmvx.info/?gklmvx gklmvx http://behlpstz.com/?behlpstz behlpstz > https://efginz.com/?efginz efginz https://deilmnru.biz/?deilmnru deilmnru > http://bdehoqux.ru/?bdehoqux bdehoqux http://doqvy.info/?doqvy doqvy > http://jlorw.info/?jlorw jlorw https://cfmtvy.net/?cfmtvy cfmtvy > http://agkmqvxy.biz/?agkmqvxy agkmqvxy https://ehnrux.ru/?ehnrux ehnrux > https://ahlmpyz.biz/?ahlmpyz ahlmpyz https://dhilqxyz.com/?dhilqxyz > dhilqxyz http://beosz.biz/?beosz beosz http://jkmtvx.ru/?jkmtvx jkmtvx > http://aefiuw.ua/?aefiuw aefiuw http://bgimy.info/?bgimy bgimy > https://acdfhimy.info/?acdfhimy acdfhimy http://bdfpv.info/?bdfpv bdfpv > https://lnopux.info/?lnopux lnopux https://iknqstux.ua/?iknqstux iknqstux > https://binxy.ru/?binxy binxy http://abflsv.net/?abflsv abflsv > http://cghkmtvz.info/?cghkmtvz cghkmtvz http://flpqrxz.biz/?flpqrxz > flpqrxz https://cfgnvz.ua/?cfgnvz cfgnvz https://jtuvx.com/?jtuvx jtuvx > https://fgikl.ru/?fgikl fgikl http://ehikrs.ru/?ehikrs ehikrs > http://beghqx.ru/?beghqx beghqx http://dhijlrsw.com/?dhijlrsw dhijlrsw > http://elqst.ru/?elqst elqst https://abfklmns.biz/?abfklmns abfklmns > https://cilnorw.ua/?cilnorw cilnorw https://anoqu.ua/?anoqu anoqu > http://adikquz.com/?adikquz adikquz http://abcpwyz.ua/?abcpwyz abcpwyz > https://fiqwxz.ru/?fiqwxz fiqwxz http://bcivw.ru/?bcivw bcivw > https://acdfnrz.ua/?acdfnrz acdfnrz https://jmopqtw.info/?jmopqtw jmopqtw > https://abdeghtv.biz/?abdeghtv abdeghtv http://aceoprw.net/?aceoprw > aceoprw http://forvy.ua/?forvy forvy https://abgjknpv.com/?abgjknpv > abgjknpv http://bcdgimr.info/?bcdgimr bcdgimr https://flntvw.biz/?flntvw > flntvw http://bcdnqvx.net/?bcdnqvx bcdnqvx http://defjnqvw.com/?defjnqvw > defjnqvw http://ekmouvwy.biz/?ekmouvwy ekmouvwy > http://cijpqsxz.biz/?cijpqsxz cijpqsxz http://dopstyz.ua/?dopstyz dopstyz > http://cinow.com/?cinow cinow http://cdejmrs.info/?cdejmrs 
cdejmrs > http://fgknprwz.info/?fgknprwz fgknprwz https://binpquxy.com/?binpquxy > binpquxy https://afhqrtu.biz/?afhqrtu afhqrtu http://cemqtv.ru/?cemqtv > cemqtv http://bgnqsz.info/?bgnqsz bgnqsz http://cdfjprsv.ua/?cdfjprsv > cdfjprsv https://jmqstz.biz/?jmqstz jmqstz http://adhuw.info/?adhuw adhuw > http://deswz.ua/?deswz deswz https://ehkmnw.com/?ehkmnw ehkmnw > http://kmsxy.ua/?kmsxy kmsxy http://benotvw.info/?benotvw benotvw > https://dhinops.biz/?dhinops dhinops https://hklnpqtv.ru/?hklnpqtv > hklnpqtv https://gjnqrtwx.info/?gjnqrtwx gjnqrtwx > https://dortvw.biz/?dortvw dortvw http://cdghjp.com/?cdghjp cdghjp > http://abchnoqx.com/?abchnoqx abchnoqx http://abnsz.biz/?abnsz abnsz > https://acdnqwy.info/?acdnqwy acdnqwy http://befilpv.com/?befilpv befilpv > http://fijrux.biz/?fijrux fijrux https://aknqrtuz.ru/?aknqrtuz aknqrtuz > http://abfoqwz.ru/?abfoqwz abfoqwz https://defikv.info/?defikv defikv > http://abcoqtz.com/?abcoqtz abcoqtz https://dfirxz.net/?dfirxz dfirxz > http://bdfipqux.net/?bdfipqux bdfipqux http://bcgqx.net/?bcgqx bcgqx > https://bdfhl.ua/?bdfhl bdfhl http://ahikmpqt.info/?ahikmpqt ahikmpqt > https://aghqrt.com/?aghqrt aghqrt https://afhosxz.ru/?afhosxz afhosxz > https://ehsvx.ru/?ehsvx ehsvx http://gknquvw.info/?gknquvw gknquvw > https://gilmpqw.ru/?gilmpqw gilmpqw http://fhopy.biz/?fhopy fhopy > https://bcmpux.net/?bcmpux bcmpux https://adfmrsuy.info/?adfmrsuy > adfmrsuy https://fmtuvx.net/?fmtuvx fmtuvx https://defjmswy.net/?defjmswy > defjmswy https://ahijmrsz.info/?ahijmrsz ahijmrsz https://cerwx.biz/?cerwx > cerwx https://behqz.biz/?behqz behqz https://behioqyz.ru/?behioqyz > behioqyz https://admrsvxz.com/?admrsvxz admrsvxz > https://cefhipqw.com/?cefhipqw cefhipqw http://deopsx.ru/?deopsx deopsx > http://acfgksz.info/?acfgksz acfgksz http://hjlmqst.info/?hjlmqst hjlmqst > http://efgnrsux.net/?efgnrsux efgnrsux http://adflovz.info/?adflovz > adflovz http://acopu.ua/?acopu acopu https://chilnqrs.ru/?chilnqrs > chilnqrs https://blosy.net/?blosy blosy https://gijnpuxy.com/?gijnpuxy > gijnpuxy https://bcemuvwz.ua/?bcemuvwz bcemuvwz http://bdfqwx.net/?bdfqwx > bdfqwx http://dfghilr.net/?dfghilr dfghilr http://bdlqstv.com/?bdlqstv > bdlqstv http://adhjnrsz.com/?adhjnrsz adhjnrsz http://adiopqx.com/?adiopqx > adiopqx http://ehnoqw.info/?ehnoqw ehnoqw https://dgpsx.net/?dgpsx dgpsx > http://afgrt.ru/?afgrt afgrt http://bghipuvz.ua/?bghipuvz bghipuvz > http://egjmvw.com/?egjmvw egjmvw https://eginoswz.com/?eginoswz eginoswz > http://bekuv.info/?bekuv bekuv http://bglno.info/?bglno bglno > http://hkoptxz.com/?hkoptxz hkoptxz https://aikqz.biz/?aikqz aikqz > https://oqvxy.net/?oqvxy oqvxy http://kruwy.net/?kruwy kruwy > http://fhlqty.net/?fhlqty fhlqty http://chpuvyz.ru/?chpuvyz chpuvyz > https://aciloqt.ua/?aciloqt aciloqt http://cklsw.info/?cklsw cklsw > https://fghkmvy.info/?fghkmvy fghkmvy http://belmpvz.biz/?belmpvz belmpvz > http://eimpxz.ru/?eimpxz eimpxz http://cdjstuz.info/?cdjstuz cdjstuz > https://adivy.com/?adivy adivy https://alnrty.ru/?alnrty alnrty > https://hkmntz.info/?hkmntz hkmntz https://dkpqr.com/?dkpqr dkpqr > https://bfntv.ru/?bfntv bfntv https://inuvwyz.ru/?inuvwyz inuvwyz > https://jkmrvyz.ua/?jkmrvyz jkmrvyz http://bfipruw.biz/?bfipruw bfipruw > http://cjklos.ua/?cjklos cjklos http://afglru.ru/?afglru afglru > http://defrt.ua/?defrt defrt https://afghsvy.ua/?afghsvy afghsvy > http://cehiou.info/?cehiou cehiou https://mnrvz.com/?mnrvz mnrvz > http://fgtuxz.biz/?fgtuxz fgtuxz http://egiqtuy.ua/?egiqtuy egiqtuy > https://dilnv.ru/?dilnv dilnv 
http://abeny.biz/?abeny abeny > http://aijvxz.biz/?aijvxz aijvxz https://abdhrsx.net/?abdhrsx abdhrsx > http://bfgmrxy.ru/?bfgmrxy bfgmrxy https://fjkow.info/?fjkow fjkow > https://fkruvx.biz/?fkruvx fkruvx http://efhjsw.biz/?efhjsw efhjsw > http://djkmops.info/?djkmops djkmops https://cequyz.ru/?cequyz cequyz > https://hmswz.com/?hmswz hmswz http://lnotu.info/?lnotu lnotu > http://bdjqu.info/?bdjqu bdjqu http://adejrxy.ua/?adejrxy adejrxy > http://cdgikmps.ru/?cdgikmps cdgikmps http://dhmostwy.net/?dhmostwy > dhmostwy https://cdhlquz.ua/?cdhlquz cdhlquz https://bilnor.ua/?bilnor > bilnor https://fnqxz.com/?fnqxz fnqxz https://bijnuwz.com/?bijnuwz > bijnuwz http://abcgikwy.ua/?abcgikwy abcgikwy https://begimrv.ua/?begimrv > begimrv https://jlprv.net/?jlprv jlprv http://efhsz.ua/?efhsz efhsz > https://dgiqs.ua/?dgiqs dgiqs http://bgiprx.net/?bgiprx bgiprx > https://cdklmqv.biz/?cdklmqv cdklmqv https://adgnowz.ua/?adgnowz adgnowz > http://cfjnorty.ru/?cfjnorty cfjnorty https://bcfhpqvy.ru/?bcfhpqvy > bcfhpqvy http://beiqv.biz/?beiqv beiqv https://bjmnu.com/?bjmnu bjmnu > https://fgjoquvz.ru/?fgjoquvz fgjoquvz http://fkpwx.info/?fkpwx fkpwx > http://dfghty.com/?dfghty dfghty https://dhlmtvxy.biz/?dhlmtvxy dhlmtvxy > https://bejuy.com/?bejuy bejuy http://cdikry.com/?cdikry cdikry > https://bfhkrty.info/?bfhkrty bfhkrty https://cekpst.com/?cekpst cekpst > https://ceknpx.ru/?ceknpx ceknpx https://gjpwy.com/?gjpwy gjpwy > https://ceouwxyz.ru/?ceouwxyz ceouwxyz https://ijkopstu.ua/?ijkopstu > ijkopstu https://acegjlmo.net/?acegjlmo acegjlmo > http://bflnpv.info/?bflnpv bflnpv http://fijkotz.ua/?fijkotz fijkotz > http://ajkptwyz.com/?ajkptwyz ajkptwyz http://adjpvw.net/?adjpvw adjpvw > https://dpqrsv.biz/?dpqrsv dpqrsv https://acehklps.ua/?acehklps acehklps > http://aghsuvyz.biz/?aghsuvyz aghsuvyz https://akotu.ru/?akotu akotu > https://dkqrt.biz/?dkqrt dkqrt http://dlrtuxz.info/?dlrtuxz dlrtuxz > http://fstvx.ua/?fstvx fstvx http://ejmsv.ua/?ejmsv ejmsv > http://abfmyz.com/?abfmyz abfmyz http://cdiqrsvx.info/?cdiqrsvx cdiqrsvx > https://fqsuz.info/?fqsuz fqsuz https://cilsu.info/?cilsu cilsu > https://bjklnsxz.net/?bjklnsxz bjklnsxz https://demnpst.biz/?demnpst > demnpst https://bghikms.biz/?bghikms bghikms http://achjtu.biz/?achjtu > achjtu http://npqrwz.net/?npqrwz npqrwz http://adehilpq.net/?adehilpq > adehilpq http://bnortwy.com/?bnortwy bnortwy http://cegpsuz.ru/?cegpsuz > cegpsuz http://aksyz.ua/?aksyz aksyz http://egkmosux.biz/?egkmosux > egkmosux https://dimoruy.com/?dimoruy dimoruy https://ehopyz.net/?ehopyz > ehopyz https://ehijptuy.com/?ehijptuy ehijptuy > http://jklmprtv.ua/?jklmprtv jklmprtv http://abmnrw.ru/?abmnrw abmnrw > http://abcfqu.ru/?abcfqu abcfqu https://degijyz.ru/?degijyz degijyz > https://cikortuy.net/?cikortuy cikortuy http://abgnrw.ua/?abgnrw abgnrw > http://bfgknpqt.com/?bfgknpqt bfgknpqt https://afmqtv.ru/?afmqtv afmqtv > https://bertuvwz.info/?bertuvwz bertuvwz https://eipuvwxy.com/?eipuvwxy > eipuvwxy http://ahjory.info/?ahjory ahjory http://acefvx.com/?acefvx > acefvx http://acdlo.com/?acdlo acdlo http://cdjqs.biz/?cdjqs cdjqs > http://bdexy.net/?bdexy bdexy http://bktxz.info/?bktxz bktxz > http://adegmovy.biz/?adegmovy adegmovy http://fkxyz.ua/?fkxyz fkxyz > http://bdikny.biz/?bdikny bdikny https://cistv.biz/?cistv cistv > https://behknqtz.biz/?behknqtz behknqtz http://efhjquwx.biz/?efhjquwx > efhjquwx https://djmnosz.com/?djmnosz djmnosz http://adino.com/?adino > adino http://djlmouv.ua/?djlmouv djlmouv http://dehrvy.biz/?dehrvy dehrvy > 
http://cefijlu.com/?cefijlu cefijlu http://bfgitz.ua/?bfgitz bfgitz > http://ilmqsz.ua/?ilmqsz ilmqsz http://aejnrwy.ru/?aejnrwy aejnrwy > https://lnuwz.ru/?lnuwz lnuwz http://jlnrv.com/?jlnrv jlnrv > https://bjstxz.ru/?bjstxz bjstxz https://fhjnw.com/?fhjnw fhjnw > https://ajmnuz.com/?ajmnuz ajmnuz https://ghptxy.net/?ghptxy ghptxy > https://aceirsxy.ua/?aceirsxy aceirsxy http://bcemosuw.info/?bcemosuw > bcemosuw https://cgklpxy.ua/?cgklpxy cgklpxy > https://deghijrs.net/?deghijrs deghijrs http://cgjknvz.ru/?cgjknvz > cgjknvz https://bcikmswz.net/?bcikmswz bcikmswz > https://cginosuy.com/?cginosuy cginosuy http://bestz.ua/?bestz bestz > https://dkmpry.net/?dkmpry dkmpry https://bfmpyz.ru/?bfmpyz bfmpyz > https://dfglmp.ru/?dfglmp dfglmp https://cfgmnpru.ua/?cfgmnpru cfgmnpru > https://cefhlns.ru/?cefhlns cefhlns http://bcelnosz.ua/?bcelnosz bcelnosz > https://cehnsy.net/?cehnsy cehnsy https://ipuxyz.info/?ipuxyz ipuxyz > https://dfghintv.net/?dfghintv dfghintv https://abdhkuxy.com/?abdhkuxy > abdhkuxy https://abrsu.ua/?abrsu abrsu http://fimrstw.ru/?fimrstw fimrstw > https://bhkmpvz.ua/?bhkmpvz bhkmpvz http://cefkqrx.info/?cefkqrx cefkqrx > http://bdjkstvz.biz/?bdjkstvz bdjkstvz https://abegks.ua/?abegks abegks > https://cfhostz.ru/?cfhostz cfhostz http://jnqsv.ua/?jnqsv jnqsv > http://abcfjmps.net/?abcfjmps abcfjmps https://jlnptu.com/?jlnptu jlnptu > https://bdghklns.ua/?bdghklns bdghklns https://bdikpu.ua/?bdikpu bdikpu > http://hklpqs.ru/?hklpqs hklpqs https://bgjmpsy.ua/?bgjmpsy bgjmpsy > http://cemnsvz.net/?cemnsvz cemnsvz http://cfhknu.info/?cfhknu cfhknu > http://bcjmqsuz.ru/?bcjmqsuz bcjmqsuz http://fghjloqu.com/?fghjloqu > fghjloqu https://lmryz.net/?lmryz lmryz http://imvwyz.net/?imvwyz imvwyz > http://dhklntw.com/?dhklntw dhklntw http://adijnouy.info/?adijnouy > adijnouy https://cdervx.info/?cdervx cdervx https://ajkqrtuw.ru/?ajkqrtuw > ajkqrtuw http://diknqtu.biz/?diknqtu diknqtu http://hjnrux.ru/?hjnrux > hjnrux https://ilmouxy.info/?ilmouxy ilmouxy http://adfijm.ua/?adfijm > adfijm http://efgptx.ru/?efgptx efgptx https://dmprtvwy.com/?dmprtvwy > dmprtvwy http://bgksuv.ua/?bgksuv bgksuv https://cfhsv.com/?cfhsv cfhsv > http://adgoqr.biz/?adgoqr adgoqr http://efilmx.net/?efilmx efilmx > http://eglpqrx.ua/?eglpqrx eglpqrx http://acequxz.com/?acequxz acequxz > http://lmsvz.ua/?lmsvz lmsvz http://aefhprtw.com/?aefhprtw aefhprtw > https://bglnq.info/?bglnq bglnq http://befhirv.ua/?befhirv befhirv > https://achkryz.com/?achkryz achkryz https://bmpsvy.info/?bmpsvy bmpsvy > http://ejloqy.net/?ejloqy ejloqy https://biopqz.net/?biopqz biopqz > https://bfhklotx.biz/?bfhklotx bfhklotx https://dgjkqwy.ua/?dgjkqwy > dgjkqwy https://afnpqrxz.net/?afnpqrxz afnpqrxz > https://hjnpqrxz.com/?hjnpqrxz hjnpqrxz https://cimpuyz.net/?cimpuyz > cimpuyz http://cruwz.biz/?cruwz cruwz https://bknovz.net/?bknovz bknovz > http://dhilrtv.ua/?dhilrtv dhilrtv http://chjpqvyz.info/?chjpqvyz > chjpqvyz https://achijmr.com/?achijmr achijmr > https://chknosty.ru/?chknosty chknosty https://acgqv.com/?acgqv acgqv > http://defklosy.info/?defklosy defklosy http://kmqrw.com/?kmqrw kmqrw > https://npqtvz.ru/?npqtvz npqtvz http://cefkrux.ua/?cefkrux cefkrux > https://aefhivx.ua/?aefhivx aefhivx https://dijlrtuy.ua/?dijlrtuy > dijlrtuy https://aoqtw.com/?aoqtw aoqtw https://cehisuwz.net/?cehisuwz > cehisuwz http://bfghijmr.biz/?bfghijmr bfghijmr > http://beimquxy.ua/?beimquxy beimquxy http://dmsuw.info/?dmsuw dmsuw > http://imnty.net/?imnty imnty https://detwz.net/?detwz detwz > http://glnvx.net/?glnvx glnvx 
https://himnopw.ru/?himnopw himnopw > https://aklnw.com/?aklnw aklnw http://bjnrwx.net/?bjnrwx bjnrwx > http://defjlqvy.com/?defjlqvy defjlqvy http://dfimopvz.ua/?dfimopvz > dfimopvz http://hmoquv.ru/?hmoquv hmoquv http://afkmq.net/?afkmq afkmq > https://blnrwyz.info/?blnrwyz blnrwyz http://fklnrtvx.info/?fklnrtvx > fklnrtvx https://fghnruz.biz/?fghnruz fghnruz https://bijqsu.ru/?bijqsu > bijqsu http://ghoruxyz.ru/?ghoruxyz ghoruxyz http://acejknry.ru/?acejknry > acejknry http://cehquvxy.net/?cehquvxy cehquvxy > https://bfgkmoqs.com/?bfgkmoqs bfgkmoqs https://bchjqx.net/?bchjqx bchjqx > http://bmoqux.biz/?bmoqux bmoqux https://aijklm.com/?aijklm aijklm > https://clmptx.info/?clmptx clmptx https://dghmpqtz.net/?dghmpqtz > dghmpqtz https://bfkrt.net/?bfkrt bfkrt http://egkqrtuw.biz/?egkqrtuw > egkqrtuw http://hjlstvx.com/?hjlstvx hjlstvx http://jkoprv.com/?jkoprv > jkoprv https://chnpstv.com/?chnpstv chnpstv https://eknptuw.ru/?eknptuw > eknptuw https://jkmrst.net/?jkmrst jkmrst http://bfmnqrvz.net/?bfmnqrvz > bfmnqrvz http://bcdhnrwy.ua/?bcdhnrwy bcdhnrwy > https://bcdfgkty.ru/?bcdfgkty bcdfgkty http://cmpqv.info/?cmpqv cmpqv > https://dhilsz.ua/?dhilsz dhilsz https://acekpsy.ru/?acekpsy acekpsy > http://gmnpswx.biz/?gmnpswx gmnpswx https://cdinpr.ua/?cdinpr cdinpr > http://cdhjnorz.info/?cdhjnorz cdhjnorz https://behjk.com/?behjk behjk > https://lmpstv.com/?lmpstv lmpstv https://ilmory.ua/?ilmory ilmory > https://abiortz.ua/?abiortz abiortz http://bceotz.biz/?bceotz bceotz > https://ghjlqt.info/?ghjlqt ghjlqt https://hmosyz.ua/?hmosyz hmosyz > https://behmoqru.net/?behmoqru behmoqru https://chinpx.info/?chinpx > chinpx https://fkntuvyz.info/?fkntuvyz fkntuvyz > http://cfghimtw.ua/?cfghimtw cfghimtw https://fqstwz.com/?fqstwz fqstwz > http://gimvw.biz/?gimvw gimvw https://degixy.ua/?degixy degixy > http://fnvwy.com/?fnvwy fnvwy https://fikmstyz.ru/?fikmstyz fikmstyz > http://chmopuwy.info/?chmopuwy chmopuwy http://cdfovy.ua/?cdfovy cdfovy > http://acdfhjkr.ua/?acdfhjkr acdfhjkr https://bcfktw.biz/?bcfktw bcfktw > https://fkmoqux.biz/?fkmoqux fkmoqux http://adghjox.ru/?adghjox adghjox > https://dlnortv.info/?dlnortv dlnortv http://ijlpsuw.ru/?ijlpsuw ijlpsuw > https://cgijqruw.info/?cgijqruw cgijqruw https://eisxy.ru/?eisxy eisxy > http://dhjkmqvw.net/?dhjkmqvw dhjkmqvw https://bfgklnu.info/?bfgklnu > bfgklnu http://bcfil.info/?bcfil bcfil https://degitv.com/?degitv degitv > https://ghjlmr.net/?ghjlmr ghjlmr https://ceilv.ru/?ceilv ceilv > http://gjknrtw.ua/?gjknrtw gjknrtw http://fotvwyz.net/?fotvwyz fotvwyz > http://dgklquv.ua/?dgklquv dgklquv http://cghijnv.info/?cghijnv cghijnv > https://aelmoy.com/?aelmoy aelmoy http://begilmnx.com/?begilmnx begilmnx > http://bcegrtxz.ua/?bcegrtxz bcegrtxz https://bhkqs.com/?bhkqs bhkqs > https://cefgpstu.ua/?cefgpstu cefgpstu http://gmntv.com/?gmntv gmntv > http://afjnstuy.ua/?afjnstuy afjnstuy https://aegqwy.biz/?aegqwy aegqwy > http://ioswz.biz/?ioswz ioswz https://gorvw.info/?gorvw gorvw > http://cgnouwy.ua/?cgnouwy cgnouwy http://bchkpvy.info/?bchkpvy bchkpvy > http://egijl.ua/?egijl egijl https://gorstuxz.net/?gorstuxz gorstuxz > https://cdhjnopu.biz/?cdhjnopu cdhjnopu http://bceijoy.ru/?bceijoy > bceijoy https://abfgjqw.ua/?abfgjqw abfgjqw http://bdeltw.net/?bdeltw > bdeltw http://jktwz.net/?jktwz jktwz https://bgikloqy.info/?bgikloqy > bgikloqy https://eimqrsuw.biz/?eimqrsuw eimqrsuw http://cfgkpu.net/?cfgkpu > cfgkpu http://efmqrtv.com/?efmqrtv efmqrtv http://cegikmru.info/?cegikmru > cegikmru http://hiprsv.ua/?hiprsv hiprsv 
https://efiorw.com/?efiorw > efiorw https://ehkpy.ru/?ehkpy ehkpy http://acdjps.biz/?acdjps acdjps > http://lmnopvxy.net/?lmnopvxy lmnopvxy http://abeptuy.net/?abeptuy > abeptuy http://fimnoq.ua/?fimnoq fimnoq http://adefky.com/?adefky adefky > https://cdfijlrz.ru/?cdfijlrz cdfijlrz https://dimnv.ru/?dimnv dimnv > http://bcehz.com/?bcehz bcehz http://ahkpruvw.ru/?ahkpruvw ahkpruvw > https://afnrw.com/?afnrw afnrw https://bclnpqtv.biz/?bclnpqtv bclnpqtv > https://hmptuy.ua/?hmptuy hmptuy https://adfkx.com/?adfkx adfkx > https://cdfgikp.com/?cdfgikp cdfgikp https://hjovwxz.net/?hjovwxz hjovwxz > http://enoprsxz.ru/?enoprsxz enoprsxz https://dmryz.info/?dmryz dmryz > http://ijopuwx.info/?ijopuwx ijopuwx https://dghoryz.ua/?dghoryz dghoryz > https://gnpqsvy.biz/?gnpqsvy gnpqsvy https://jlmtwxy.ua/?jlmtwxy jlmtwxy > http://clmpuvz.com/?clmpuvz clmpuvz https://gprtvx.ua/?gprtvx gprtvx > http://bklptuxz.net/?bklptuxz bklptuxz https://bfknrvz.com/?bfknrvz > bfknrvz http://dgkqrv.ua/?dgkqrv dgkqrv https://adkprstv.net/?adkprstv > adkprstv https://afiqx.com/?afiqx afiqx https://bchiknpt.com/?bchiknpt > bchiknpt http://hjklprx.biz/?hjklprx hjklprx http://bejoruw.net/?bejoruw > bejoruw http://cegmnvz.biz/?cegmnvz cegmnvz http://gmnotyz.ua/?gmnotyz > gmnotyz https://dfhikntw.net/?dfhikntw dfhikntw https://dfouv.biz/?dfouv > dfouv https://cjlnotz.info/?cjlnotz cjlnotz https://denqtvwz.ua/?denqtvwz > denqtvwz http://behlrwy.com/?behlrwy behlrwy http://drsuvxz.biz/?drsuvxz > drsuvxz https://esuwyz.biz/?esuwyz esuwyz http://fhjntu.biz/?fhjntu > fhjntu https://gqrswyz.biz/?gqrswyz gqrswyz https://mpswxz.biz/?mpswxz > mpswxz http://dhjpsuv.net/?dhjpsuv dhjpsuv http://dfopruy.ru/?dfopruy > dfopruy https://ehkloqsz.info/?ehkloqsz ehkloqsz > https://adgjqru.biz/?adgjqru adgjqru https://ejmtvwz.biz/?ejmtvwz ejmtvwz > http://fgoquw.net/?fgoquw fgoquw https://fhikqry.com/?fhikqry fhikqry > https://cemqy.com/?cemqy cemqy http://otuxz.com/?otuxz otuxz > http://fijlmnoq.info/?fijlmnoq fijlmnoq http://cgiouw.ru/?cgiouw cgiouw > https://cdgijlnw.biz/?cdgijlnw cdgijlnw https://gmuvw.biz/?gmuvw gmuvw > https://abdefgiu.info/?abdefgiu abdefgiu https://gkmqwy.info/?gkmqwy > gkmqwy http://bfgkqstw.com/?bfgkqstw bfgkqstw > https://bfgmnovx.net/?bfgmnovx bfgmnovx https://bkmnrsy.com/?bkmnrsy > bkmnrsy http://chiuwx.ua/?chiuwx chiuwx http://chlvw.ua/?chlvw chlvw > https://acfgilp.ru/?acfgilp acfgilp http://bghosv.com/?bghosv bghosv > https://befgkpst.net/?befgkpst befgkpst http://jklmnwz.net/?jklmnwz > jklmnwz https://dktxz.com/?dktxz dktxz http://aeknuwyz.ru/?aeknuwyz > aeknuwyz https://aqtuv.ua/?aqtuv aqtuv http://bfjorw.info/?bfjorw bfjorw > http://cfijuz.biz/?cfijuz cfijuz https://efjlnuv.net/?efjlnuv efjlnuv > https://hkruvy.net/?hkruvy hkruvy http://defnswxy.biz/?defnswxy defnswxy > https://fgjmqxy.biz/?fgjmqxy fgjmqxy https://deflmtu.ru/?deflmtu deflmtu > https://cgmnr.info/?cgmnr cgmnr https://dgilnsvy.ua/?dgilnsvy dgilnsvy > https://chjmosxy.biz/?chjmosxy chjmosxy https://abcdhln.net/?abcdhln > abcdhln https://elstuyz.biz/?elstuyz elstuyz https://bjoptw.com/?bjoptw > bjoptw https://bcdgquv.ru/?bcdgquv bcdgquv http://afiklst.com/?afiklst > afiklst https://agnvwz.info/?agnvwz agnvwz http://bcfgkrvx.biz/?bcfgkrvx > bcfgkrvx https://bdjmnuz.biz/?bdjmnuz bdjmnuz https://emqrwz.com/?emqrwz > emqrwz https://egklqt.biz/?egklqt egklqt http://bimqr.com/?bimqr bimqr > https://dhklnsv.net/?dhklnsv dhklnsv https://dvwxz.com/?dvwxz dvwxz > https://dfghimu.com/?dfghimu dfghimu https://fostx.ru/?fostx fostx > 
https://fjklnqux.ua/?fjklnqux fjklnqux https://cfhijnqu.info/?cfhijnqu > cfhijnqu http://nptvy.ua/?nptvy nptvy http://bclnu.com/?bclnu bclnu > http://eflntvyz.ru/?eflntvyz eflntvyz https://abcdpsx.ru/?abcdpsx abcdpsx > https://cfipqx.info/?cfipqx cfipqx http://adforst.biz/?adforst adforst > https://bfgmpqtv.biz/?bfgmpqtv bfgmpqtv http://cfkmnorv.ua/?cfkmnorv > cfkmnorv http://dhjkuv.info/?dhjkuv dhjkuv https://bcejlsyz.info/?bcejlsyz > bcejlsyz https://efinrsy.ua/?efinrsy efinrsy https://kmosw.net/?kmosw > kmosw http://jmoxy.net/?jmoxy jmoxy http://agjloqy.biz/?agjloqy agjloqy > https://dglmnpqy.com/?dglmnpqy dglmnpqy https://dijnqrsw.ua/?dijnqrsw > dijnqrsw http://ijmnr.biz/?ijmnr ijmnr https://gjouwxy.ru/?gjouwxy > gjouwxy https://klpuz.biz/?klpuz klpuz http://hijoux.net/?hijoux hijoux > http://bikmn.net/?bikmn bikmn http://abcovwy.ua/?abcovwy abcovwy > https://adfgioqx.ua/?adfgioqx adfgioqx http://beijknpq.info/?beijknpq > beijknpq https://equvx.com/?equvx equvx https://cgmpr.net/?cgmpr cgmpr > http://dfknor.com/?dfknor dfknor https://gopsvyz.ru/?gopsvyz gopsvyz > http://chijkqsu.net/?chijkqsu chijkqsu https://bfmtx.biz/?bfmtx bfmtx > http://cgijqrux.ua/?cgijqrux cgijqrux https://fklnovy.biz/?fklnovy > fklnovy http://chkmosux.net/?chkmosux chkmosux > https://cenprvw.info/?cenprvw cenprvw https://cfuxyz.ua/?cfuxyz cfuxyz > https://bdgopy.ua/?bdgopy bdgopy https://ampwx.info/?ampwx ampwx > http://cefmoqtx.info/?cefmoqtx cefmoqtx https://bcdflny.ua/?bcdflny > bcdflny https://abqtux.ua/?abqtux abqtux https://bpstw.com/?bpstw bpstw > http://cegmoqry.info/?cegmoqry cegmoqry http://aeghjlw.net/?aeghjlw > aeghjlw https://egjswz.ru/?egjswz egjswz https://agiluv.info/?agiluv > agiluv https://befimsu.biz/?befimsu befimsu > https://abcdeiry.info/?abcdeiry abcdeiry http://bcqrxz.info/?bcqrxz > bcqrxz https://eilntvxz.net/?eilntvxz eilntvxz https://abcjpt.ua/?abcjpt > abcjpt https://adglox.ru/?adglox adglox https://adjnptw.ru/?adjnptw > adjnptw http://aglrxz.biz/?aglrxz aglrxz https://abeqru.info/?abeqru > abeqru https://cnoprsw.biz/?cnoprsw cnoprsw http://akoqx.com/?akoqx akoqx > https://ijlmnptz.net/?ijlmnptz ijlmnptz https://ghimnuyz.com/?ghimnuyz > ghimnuyz https://ghprs.ua/?ghprs ghprs https://lmntwxz.biz/?lmntwxz > lmntwxz https://blopqxz.ua/?blopqxz blopqxz https://cqrxz.ua/?cqrxz cqrxz > http://befmvwz.net/?befmvwz befmvwz https://abjlmnpr.com/?abjlmnpr > abjlmnpr https://filotz.net/?filotz filotz http://hinsw.biz/?hinsw hinsw > http://abfjklsy.net/?abfjklsy abfjklsy https://bhimnrwx.biz/?bhimnrwx > bhimnrwx http://dejnotz.ru/?dejnotz dejnotz https://bcflsxz.info/?bcflsxz > bcflsxz http://hklmsux.net/?hklmsux hklmsux https://ahknouz.ru/?ahknouz > ahknouz https://cfknxy.ru/?cfknxy cfknxy http://acfntvx.net/?acfntvx > acfntvx https://cdfksuvy.info/?cdfksuvy cdfksuvy > https://abjkorx.ru/?abjkorx abjkorx https://afiklqvw.com/?afiklqvw > afiklqvw http://adnovx.biz/?adnovx adnovx https://fglstyz.info/?fglstyz > fglstyz https://ghikruy.biz/?ghikruy ghikruy http://ghnopsvz.biz/?ghnopsvz > ghnopsvz http://abcgjnrw.com/?abcgjnrw abcgjnrw > https://bijlnyz.ua/?bijlnyz bijlnyz http://ahiqrv.ru/?ahiqrv ahiqrv > https://ajotw.biz/?ajotw ajotw http://bfhipuw.info/?bfhipuw bfhipuw > http://acjmqry.ua/?acjmqry acjmqry http://efjnoqrz.info/?efjnoqrz > efjnoqrz https://achkqst.info/?achkqst achkqst > http://bcfpruyz.ua/?bcfpruyz bcfpruyz http://bcelv.com/?bcelv bcelv > https://dfkoqsu.ua/?dfkoqsu dfkoqsu http://ekopq.ru/?ekopq ekopq > https://abfkrsz.ua/?abfkrsz abfkrsz https://abenxy.net/?abenxy abenxy > 
http://agiotu.biz/?agiotu agiotu https://bkpqr.info/?bkpqr bkpqr > http://ahijq.ua/?ahijq ahijq http://eimps.ru/?eimps eimps > https://abdstwz.net/?abdstwz abdstwz https://cdhjlu.biz/?cdhjlu cdhjlu > http://gmrstyz.info/?gmrstyz gmrstyz https://djsvwyz.ua/?djsvwyz djsvwyz > http://begjmorv.net/?begjmorv begjmorv http://ahlmoptz.biz/?ahlmoptz > ahlmoptz http://dekoqx.ru/?dekoqx dekoqx http://fiprs.net/?fiprs fiprs > http://cjklmr.biz/?cjklmr cjklmr https://aefgpquv.biz/?aefgpquv aefgpquv > https://jlmopswx.net/?jlmopswx jlmopswx https://bfhjlsw.biz/?bfhjlsw > bfhjlsw https://iknortux.com/?iknortux iknortux > http://abhkoprv.biz/?abhkoprv abhkoprv https://cjnrwyz.ua/?cjnrwyz > cjnrwyz http://djmnrvwz.info/?djmnrvwz djmnrvwz > http://adgkrtyz.info/?adgkrtyz adgkrtyz https://bdhjsuv.ua/?bdhjsuv > bdhjsuv http://acfgjor.biz/?acfgjor acfgjor http://cfilrtw.info/?cfilrtw > cfilrtw https://fhijz.com/?fhijz fhijz http://hkmntxy.ru/?hkmntxy hkmntxy > http://fhilsuy.ua/?fhilsuy fhilsuy http://jmnpwx.ru/?jmnpwx jmnpwx > http://gikltvwy.ru/?gikltvwy gikltvwy http://jopqy.biz/?jopqy jopqy > http://bdlqry.info/?bdlqry bdlqry https://ejlvwz.com/?ejlvwz ejlvwz > http://ceijvz.net/?ceijvz ceijvz http://cfijntyz.info/?cfijntyz cfijntyz > http://efilotuv.com/?efilotuv efilotuv http://cfmpyz.com/?cfmpyz cfmpyz > http://dhpqz.com/?dhpqz dhpqz http://dlpxz.com/?dlpxz dlpxz > http://aefjlyz.net/?aefjlyz aefjlyz https://ceijorz.info/?ceijorz ceijorz > http://dkmnuvx.info/?dkmnuvx dkmnuvx https://kprswx.net/?kprswx kprswx > https://dgjpwz.ru/?dgjpwz dgjpwz https://abfjlnt.ru/?abfjlnt abfjlnt > http://efgips.com/?efgips efgips http://cehnpu.ru/?cehnpu cehnpu > https://fstuw.net/?fstuw fstuw http://dmorz.net/?dmorz dmorz > https://celmrwz.info/?celmrwz celmrwz https://jprvx.com/?jprvx jprvx > https://cefjkxz.biz/?cefjkxz cefjkxz http://behjlv.biz/?behjlv behjlv > http://bdiku.net/?bdiku bdiku https://ghnpxz.ru/?ghnpxz ghnpxz > https://bcpqvwz.ua/?bcpqvwz bcpqvwz https://gijlnqrs.biz/?gijlnqrs > gijlnqrs http://adfjlrsz.info/?adfjlrsz adfjlrsz > https://afiptvz.biz/?afiptvz afiptvz http://acikors.com/?acikors acikors > https://dfmpq.ru/?dfmpq dfmpq http://abhjmsyz.ru/?abhjmsyz abhjmsyz > https://dejkq.com/?dejkq dejkq http://abcltx.com/?abcltx abcltx > https://bdfjopu.net/?bdfjopu bdfjopu http://bipvy.com/?bipvy bipvy > http://dlqsx.biz/?dlqsx dlqsx http://inqrstv.info/?inqrstv inqrstv > https://dknwz.ru/?dknwz dknwz http://cdgimqsu.ru/?cdgimqsu cdgimqsu > https://akmotuv.info/?akmotuv akmotuv http://kmntvyz.ru/?kmntvyz kmntvyz > https://bimxz.net/?bimxz bimxz http://afjoqyz.ru/?afjoqyz afjoqyz > https://eikuwy.ua/?eikuwy eikuwy https://bmptw.biz/?bmptw bmptw > http://fgltwx.com/?fgltwx fgltwx http://chlnpqrs.com/?chlnpqrs chlnpqrs > https://abhux.net/?abhux abhux http://adfhmw.info/?adfhmw adfhmw > http://bgilt.com/?bgilt bgilt https://cefotu.biz/?cefotu cefotu > https://adfruvwy.biz/?adfruvwy adfruvwy https://fgoqstu.info/?fgoqstu > fgoqstu https://cdilnqvy.info/?cdilnqvy cdilnqvy > https://adghruw.ua/?adghruw adghruw https://ehlnrsuy.biz/?ehlnrsuy > ehlnrsuy http://abfijlnr.info/?abfijlnr abfijlnr https://jkpty.biz/?jkpty > jkpty https://cdlpv.biz/?cdlpv cdlpv https://ginoxy.ua/?ginoxy ginoxy > https://ciknpquy.net/?ciknpquy ciknpquy https://egijstw.biz/?egijstw > egijstw http://cdikopvz.info/?cdikopvz cdikopvz > https://egjklnvy.info/?egjklnvy egjklnvy https://fjkmvw.info/?fjkmvw > fjkmvw http://dfkmv.com/?dfkmv dfkmv https://efgloptu.ru/?efgloptu > efgloptu https://fgotz.ua/?fgotz fgotz http://efhklv.info/?efhklv 
-- 
Lisandro Dalcin
============
Research Scientist
Computer, Electrical and Mathematical Sciences & Engineering (CEMSE)
Extreme Computing Research Center (ECRC)
King Abdullah University of Science and Technology (KAUST)
http://ecrc.kaust.edu.sa/

4700 King Abdullah University of Science and Technology
al-Khawarizmi Bldg (Bldg 1), Office # 0109
Thuwal 23955-6900, Kingdom of Saudi Arabia
http://www.kaust.edu.sa

Office Phone: +966 12 808-0459
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From balay at mcs.anl.gov  Wed Nov 21 09:59:58 2018
From: balay at mcs.anl.gov (Balay, Satish)
Date: Wed, 21 Nov 2018 15:59:58 +0000
Subject: [petsc-users] SPAM
In-Reply-To: 
References: <1DE7C132C1153B98444108B591301B70@brasserieardennaise.be>
Message-ID: 

On Wed, 21 Nov 2018, Lisandro Dalcin via petsc-users wrote:

> Dear Nina,
>
> Although we are supposed to be empathetic with you, I'm afraid you are
> violating our Code of Conduct.
>
> Warm regards,

It appears such spam has increased in the recent past. I've reported this to
our admins - but some spam somehow sneaks in through their filters.

We have open mailing lists - to avoid the 'subscribe-before-post' and then
'remain on the list for unneeded traffic' burden on our valid users. The cost
of this open list is such occasional spam.

I try updating mailing-list-side filters after the fact to avoid repeats -
but that prevents repeats only from the same address.

Satish

From fdkong.jd at gmail.com  Wed Nov 21 11:14:53 2018
From: fdkong.jd at gmail.com (Fande Kong)
Date: Wed, 21 Nov 2018 10:14:53 -0700
Subject: [petsc-users] ParMETIS produces different partitions with 32-bit and 64-bit integers respectively
Message-ID: 

Hi Developers,

I was wondering if it is normal to get a different partition when switching
from 32-bit integers to 64-bit integers (--with-64-bit-indices)?

And this happens only when we apply weights (either edge or vertex weights).

An example is attached.

Thanks,

Fande

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
-------------- next part --------------
A non-text attachment was scrubbed...
Name: 32-bit-partition.jpg
Type: image/jpeg
Size: 166489 bytes
Desc: not available
URL: 
-------------- next part --------------
A non-text attachment was scrubbed...
Name: 64-bit-partition.jpg
Type: image/jpeg
Size: 166775 bytes
Desc: not available
URL: 

From knepley at gmail.com  Wed Nov 21 14:57:08 2018
From: knepley at gmail.com (Matthew Knepley)
Date: Wed, 21 Nov 2018 15:57:08 -0500
Subject: [petsc-users] ParMETIS produces different partitions with 32-bit and 64-bit integers respectively
In-Reply-To: 
References: 
Message-ID: 

On Wed, Nov 21, 2018 at 12:12 PM Fande Kong via petsc-users wrote:

> Hi Developers,
>
> I was wondering if it is normal to get a different partition when
> switching from 32-bit integers to 64-bit integers (--with-64-bit-indices)?
>
> And this happens only when we apply weights (either edge or vertex
> weights).
>
> An example is attached.

ParMetis has randomization. It gives different partitions even on different
architectures. This does not surprise me.

  Matt

-- 
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
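For readers who need reproducible partitions, ParMETIS's randomization can be
pinned through its documented options array (options[0]=1 enables user
options, options[2] is the random seed). Below is a minimal sketch that calls
ParMETIS directly on a toy one-rank graph; the fixed seed makes repeated runs
of the same build reproducible, but it is only an assumption that it also
removes the 32-bit/64-bit discrepancy, since the integer width can still
change internal tie-breaking.

/* Hedged sketch: pin ParMETIS's random seed via the options array.
   Toy 4-vertex cycle graph owned entirely by rank 0; run with 1 MPI rank. */
#include <mpi.h>
#include <parmetis.h>
#include <stdio.h>

int main(int argc, char **argv)
{
  MPI_Comm comm;
  idx_t  vtxdist[2] = {0, 4};                 /* all 4 vertices on rank 0 */
  idx_t  xadj[5]    = {0, 2, 4, 6, 8};        /* CSR adjacency of a cycle */
  idx_t  adjncy[8]  = {1, 3, 0, 2, 1, 3, 0, 2};
  idx_t  wgtflag = 0, numflag = 0, ncon = 1, nparts = 2, edgecut;
  idx_t  part[4], options[3];
  real_t tpwgts[2]  = {0.5, 0.5}, ubvec[1] = {1.05};

  MPI_Init(&argc, &argv);
  comm = MPI_COMM_WORLD;
  options[0] = 1;   /* use options[1..2] below instead of the defaults */
  options[1] = 0;   /* debug level: silent */
  options[2] = 42;  /* fixed random seed -> reproducible partitions */
  ParMETIS_V3_PartKway(vtxdist, xadj, adjncy, NULL, NULL, &wgtflag, &numflag,
                       &ncon, &nparts, tpwgts, ubvec, options, &edgecut,
                       part, &comm);
  for (int v = 0; v < 4; v++) printf("vertex %d -> part %d\n", v, (int)part[v]);
  MPI_Finalize();
  return 0;
}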
From Jiaoyan.Li at inl.gov  Wed Nov 21 17:33:16 2018
From: Jiaoyan.Li at inl.gov (Jiaoyan Li)
Date: Wed, 21 Nov 2018 23:33:16 +0000
Subject: [petsc-users] PetscPartitioner in Fortran
Message-ID: <55DCC74D-D6BD-4AB2-BD43-B7045F38802E@inl.gov>

Dear Petsc users:

I am trying to use the Petsc APIs from Fortran. One problem that I am facing
right now concerns PetscPartitioner, i.e.,

#include "petsc/finclude/petsc.h"
use petscdmplex

PetscPartitioner :: part
PetscErrorCode   :: ierr

call PetscPartitionerCreate(PETSC_COMM_WORLD, part, ierr)

But I get the error message below; ranks 1-3 print the same message as rank 0:

[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Null argument, when expecting valid pointer
[0]PETSC ERROR: Null Pointer: Parameter # 2
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.10.2, unknown
[0]PETSC ERROR: ./htm3d on a arch-linux2-c-opt named fn607018 by LIJ Wed Nov 21 16:30:35 2018
[0]PETSC ERROR: Configure options --download-fblaslapack --with-mpi-dir=/opt/mpitch-3.2.1 -download-exodusii --download-hdf5 --download-netcdf --download-zlib --download-pnetcdf
[0]PETSC ERROR: #1 PetscPartitionerCreate() line 601 in /home/lij/packages/petsc/src/dm/impls/plex/plexpartition.c

Has anyone encountered a similar problem before? Any suggestions or comments
are highly appreciated. Thank you very much.

Best,

Jiaoyan

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From bsmith at mcs.anl.gov  Wed Nov 21 20:25:23 2018
From: bsmith at mcs.anl.gov (Smith, Barry F.)
Date: Thu, 22 Nov 2018 02:25:23 +0000
Subject: [petsc-users] PetscPartitioner in Fortran
In-Reply-To: <55DCC74D-D6BD-4AB2-BD43-B7045F38802E@inl.gov>
References: <55DCC74D-D6BD-4AB2-BD43-B7045F38802E@inl.gov>
Message-ID: <0D53DB88-30BC-40BC-A1D4-71E6D95ED7FA@anl.gov>

   Matt,

   PetscPartitioner is missing from lib/petsc/conf/bfort-petsc.txt, so no
Fortran stub gets generated for PetscPartitionerCreate().

   Barry

> On Nov 21, 2018, at 3:33 PM, Jiaoyan Li via petsc-users wrote:
>
> Dear Petsc users:
>
> I am trying to use the Petsc APIs from Fortran. [...]
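A hedged workaround until the stub exists: do not create a PetscPartitioner
from Fortran at all, and instead set the type of the partitioner the DMPlex
already owns through the options database. The option name
-petscpartitioner_type is the standard one; that it covers this use case, and
that the code calls DMSetFromOptions() on the Plex so the option is picked
up, are assumptions.

! Hedged sketch: steer the DMPlex-owned partitioner from the options
! database instead of calling the missing PetscPartitionerCreate() stub.
call PetscOptionsSetValue(PETSC_NULL_OPTIONS, '-petscpartitioner_type', 'parmetis', ierr)
! ... later: call DMSetFromOptions(dm, ierr) before DMPlexDistribute()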
From Jiaoyan.Li at inl.gov  Wed Nov 21 20:27:26 2018
From: Jiaoyan.Li at inl.gov (Jiaoyan Li)
Date: Thu, 22 Nov 2018 02:27:26 +0000
Subject: [petsc-users] PetscPartitioner in Fortran
In-Reply-To: <0D53DB88-30BC-40BC-A1D4-71E6D95ED7FA@anl.gov>
References: <55DCC74D-D6BD-4AB2-BD43-B7045F38802E@inl.gov> <0D53DB88-30BC-40BC-A1D4-71E6D95ED7FA@anl.gov>
Message-ID: <6AB58BA1-A12C-41C7-9FEB-C27891AB8A3B@inl.gov>

Barry,

Thanks a lot for your reply. I look forward to the further information.

Jiaoyan

On 11/21/18, 19:25, "Smith, Barry F." wrote:

    Matt,

    PetscPartitioner is missing from lib/petsc/conf/bfort-petsc.txt

    Barry

    > [...]
From alia at wias-berlin.de  Thu Nov 22 02:41:37 2018
From: alia at wias-berlin.de (Najib Alia)
Date: Thu, 22 Nov 2018 09:41:37 +0100
Subject: [petsc-users] Configure Mumps with 64-bit integers
Message-ID: 

Hello,

I am getting an "INFOG(1) = -51 integer overflow" error from Mumps because
the linear system I am solving is very big. I have tried to configure Mumps
in a full 64-bit integer version using

===============================================

./configure --with-64-bit-indices --with-cc=gcc --with-cxx=g++
--with-fc=gfortran --with-debugging=0 COPTFLAGS='-O3 -march=native
-mtune=native' CXXOPTFLAGS='-O3 -march=native -mtune=native'
FOPTFLAGS='-O3 -march=native -mtune=native' --download-fblaslapack
--download-openmpi --download-scalapack --download-mumps
--download-metis --download-suitesparse --download-parmetis

===============================================

but I obtain the following error:

*******************************************************************************
         UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details):
-------------------------------------------------------------------------------
Cannot use MUMPS with 64 bit integers, it is not coded for this capability
*******************************************************************************

Is there a way to use Mumps with 64-bit integers through Petsc? Or to use
the "selective 64-bit integer feature"?

Thank you in advance.

Alia

From knepley at gmail.com  Thu Nov 22 07:19:50 2018
From: knepley at gmail.com (Matthew Knepley)
Date: Thu, 22 Nov 2018 08:19:50 -0500
Subject: [petsc-users] Configure Mumps with 64-bit integers
In-Reply-To: 
References: 
Message-ID: 

This should be sent to the MUMPS list. So far we did not see a way to tell
them about the integer type. They might say they rely on Fortran to do it,
but we can't, since that is unreliable from configure.

  Thanks,

     Matt

On Thu, Nov 22, 2018 at 3:41 AM Najib Alia via petsc-users wrote:

> Hello,
>
> I am getting an "INFOG(1) = -51 integer overflow" error from Mumps because
> the linear system I am solving is very big. [...]

-- 
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
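While the 64-bit build question is sorted out, a hedged mitigation is
sometimes to change the fill-reducing ordering MUMPS uses, since a different
ordering can lower the fill and the internal counters that overflow. The
ICNTL hook below is the real PETSc interface (equivalent to
-mat_mumps_icntl_7 <n> on the command line); that it rescues this particular
matrix is only an assumption.

/* Hedged sketch: select MUMPS's sequential ordering via ICNTL(7) from PETSc.
   2 = AMF, 4 = PORD, 5 = METIS. ksp is assumed already created with the
   system matrix attached via KSPSetOperators(). */
KSP ksp;
PC  pc;
Mat F;

KSPGetPC(ksp, &pc);
PCSetType(pc, PCLU);
PCFactorSetMatSolverType(pc, MATSOLVERMUMPS);
PCFactorSetUpMatSolverType(pc);   /* create the MUMPS factor wrapper */
PCFactorGetMatrix(pc, &F);
MatMumpsSetIcntl(F, 7, 4);        /* try PORD instead of the default */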
From barrydog505 at gmail.com  Thu Nov 22 10:22:03 2018
From: barrydog505 at gmail.com (barry)
Date: Fri, 23 Nov 2018 00:22:03 +0800
Subject: [petsc-users] error messages while using snes
Message-ID: 

Hi,

I want to solve the nonlinear equation tan(x) = x, and the code I wrote
produces an error. The error says that I need to use MatAssemblyBegin/End(),
but there isn't any Mat in my program. How can I fix it?

This is what I wrote (major part):

int main(){

  .....

  VecCreate(PETSC_COMM_WORLD,&f);
  VecSetSizes(f,PETSC_DECIDE,400);
  VecSetFromOptions(f);
  VecDuplicate(f,&b);

  SNESCreate(PETSC_COMM_WORLD,&snes);
  SNESSetFunction(snes,f,FormFunction,&user);
  SNESSetComputeInitialGuess(snes,FormInitialGuess,NULL);
  SNESSetFromOptions(snes);
  SNESSolve(snes,NULL,b);

  SNESDestroy(&snes);

  .....

}

PetscErrorCode FormFunction(SNES snes,Vec b,Vec f,void *ctx)
{
  const PetscScalar *bb;
  PetscScalar       *ff;
  PetscInt          i;
  PetscErrorCode    ierr;

  PetscInt A1 = 1;
  PetscInt A2 = 1;

  VecGetArrayRead(b,&bb);
  VecGetArray(f,&ff);

  /* Compute function */
  for (i=0; i<400; i++) ff[i] = tan(bb[i])-(A1/A2)*bb[i];

  VecRestoreArrayRead(b,&bb);
  VecRestoreArray(f,&ff);
  return 0;
}

PetscErrorCode FormInitialGuess(SNES snes,Vec b,void *ctx)
{
  PetscScalar    *bb;
  PetscInt       i;
  PetscErrorCode ierr;

  VecGetArray(b,&bb);

  /* Compute initial guess */
  for (i=2; i<400; i++)
  {
    if      (i==0) bb[i] = 0;
    else if (i==1) bb[i] = 4.5;
    else           bb[i] = bb[i-1] + 3.2;
  }

  VecRestoreArray(b,&bb);
  return(0);
}

Here is the error that comes out:

[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: Matrix must be assembled by calls to MatAssemblyBegin/End();
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.10.2, unknown
[0]PETSC ERROR: ./test2 on a arch-linux2-c-debug named G1ngy by barry Thu Nov 22 21:07:18 2018
[0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-mpich --download-fblaslapack
[0]PETSC ERROR: #1 MatFDColoringCreate() line 469 in /home/barry/petsc/src/mat/matfd/fdmatrix.c
[0]PETSC ERROR: #2 SNESComputeJacobianDefaultColor() line 84 in /home/barry/petsc/src/snes/interface/snesj2.c
[0]PETSC ERROR: #3 SNESComputeJacobian() line 2555 in /home/barry/petsc/src/snes/interface/snes.c
[0]PETSC ERROR: #4 SNESSolve_NEWTONLS() line 222 in /home/barry/petsc/src/snes/impls/ls/ls.c
[0]PETSC ERROR: #5 SNESSolve() line 4396 in /home/barry/petsc/src/snes/interface/snes.c
[0]PETSC ERROR: #6 main() line 89 in /home/barry/Desktop/petsc/test2.c
[0]PETSC ERROR: No PETSc Option Table entries
[0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
application called MPI_Abort(MPI_COMM_WORLD, 73) - process 0
[unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=73
:
system msg for write_line failure : Bad file descriptor

Thank you,

Barry

From andreaceresoli91 at gmail.com  Thu Nov 22 12:26:58 2018
From: andreaceresoli91 at gmail.com (Andrea Ceresoli)
Date: Thu, 22 Nov 2018 19:26:58 +0100
Subject: [petsc-users] issue with DMPlexCreateFromCellListParallel
Message-ID: 

Good evening,

I'm trying to do a parallel assembly of two networks in PETSc, using
power2.c as a template. Instead of reading data only from the rank 0
process, I want two processes (rank 0 and rank 1) to read their own data
(each process reads case9.m). As far as I understand, to make this work I
modified the function DMNetworkLayoutSetUp where DMPlexCreateFromCellList
is called and replaced that call with DMPlexCreateFromCellListParallel when
the size is greater than 1.

Running power2.c with this new setup gives:

[1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[1]PETSC ERROR: Invalid argument
[1]PETSC ERROR: Global vertex 9 on rank 1 was unclaimed
[1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[1]PETSC ERROR: Petsc Release Version 3.10.2, unknown
[1]PETSC ERROR: ./prova2 on a arch-linux2-c-debug named SGI-W-02 by sgilab Thu Nov 22 19:11:40 2018
[1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=0 --with-fc=0 --download-f2cblaslapack --download-mpich
[1]PETSC ERROR: #1 DMPlexBuildFromCellList_Parallel_Internal() line 2632 in /home/sgilab/petsc/src/dm/impls/plex/plexcreate.c
[1]PETSC ERROR: #2 DMPlexCreateFromCellListParallel() line 2776 in /home/sgilab/petsc/src/dm/impls/plex/plexcreate.c
[1]PETSC ERROR: #3 DMNetworkLayoutSetUp_Parallel() line 328 in /home/sgilab/petsc/src/snes/examples/tutorials/network/power2/prova2.c
[1]PETSC ERROR: #4 main() line 476 in /home/sgilab/petsc/src/snes/examples/tutorials/network/power2/prova2.c
[1]PETSC ERROR: PETSc Option Table entries:
[1]PETSC ERROR: -ksp_type gmres
[1]PETSC ERROR: -pc_type bjacobi
[1]PETSC ERROR: -snes_atol 1e-8
[1]PETSC ERROR: -snes_converged_reason
[1]PETSC ERROR: -snes_linesearch_type basic
[1]PETSC ERROR: -snes_rtol 1e-20
[1]PETSC ERROR: -snes_type newtonls
[1]PETSC ERROR: -sub_pc_factor_mat_ordering_type qmd
[1]PETSC ERROR: -sub_pc_type lu
[1]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------

I was wondering whether I am using DMPlexCreateFromCellListParallel in the
right context and for the right purpose.

Do you have any suggestions? What do you think?

Thank you in advance,

Andrea

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From knepley at gmail.com  Thu Nov 22 13:09:52 2018
From: knepley at gmail.com (Matthew Knepley)
Date: Thu, 22 Nov 2018 14:09:52 -0500
Subject: [petsc-users] error messages while using snes
In-Reply-To: 
References: 
Message-ID: 

On Thu, Nov 22, 2018 at 11:23 AM barry via petsc-users wrote:

> Hi,
>
> I want to solve the nonlinear equation tan(x) = x, and the code I wrote
> produces an error. The error says that I need to use
> MatAssemblyBegin/End(), but there isn't any Mat in my program.
>
> How can I fix it?

You could

a) use a method without a Jacobian, like nonlinear Richardson,

b) use Newton with a finite-difference Jacobian (-snes_fd), but this is not
scalable, or

c) provide a Jacobian for Newton.

  Thanks,

     Matt

> [...]

-- 
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
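To make option (c) concrete: the Jacobian of f_i = tan(x_i) - (A1/A2) x_i is
diagonal, with J_ii = 1/cos(x_i)^2 - A1/A2 (and A1 = A2 = 1 in the posted
code). A minimal sketch follows, reusing the size 400 from the question;
everything else is illustrative rather than the poster's actual program.

#include <petscsnes.h>

/* Diagonal Jacobian for f_i = tan(x_i) - x_i; the assembly calls below are
   exactly what the error message asks for. */
PetscErrorCode FormJacobian(SNES snes, Vec x, Mat J, Mat P, void *ctx)
{
  const PetscScalar *xx;
  PetscInt          i, rstart, rend;

  VecGetOwnershipRange(x, &rstart, &rend);
  VecGetArrayRead(x, &xx);
  for (i = rstart; i < rend; i++) {
    PetscScalar c = PetscCosScalar(xx[i - rstart]);
    PetscScalar v = 1.0 / (c * c) - 1.0;   /* d/dx [tan(x) - x] */
    MatSetValues(P, 1, &i, 1, &i, &v, INSERT_VALUES);
  }
  VecRestoreArrayRead(x, &xx);
  MatAssemblyBegin(P, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(P, MAT_FINAL_ASSEMBLY);
  if (J != P) {
    MatAssemblyBegin(J, MAT_FINAL_ASSEMBLY);
    MatAssemblyEnd(J, MAT_FINAL_ASSEMBLY);
  }
  return 0;
}

/* In main(), before SNESSolve(): */
Mat J;
MatCreate(PETSC_COMM_WORLD, &J);
MatSetSizes(J, PETSC_DECIDE, PETSC_DECIDE, 400, 400);
MatSetFromOptions(J);
MatSetUp(J);
SNESSetJacobian(snes, J, J, FormJacobian, NULL);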
> > } > > PetscErrorCode FormFunction(SNES snes,Vec b,Vec f,void *ctx) > { > const PetscScalar *bb; > PetscScalar *ff; > PetscInt i; > PetscErrorCode ierr; > > PetscInt A1 = 1; > PetscInt A2 = 1; > > VecGetArrayRead(b,&bb); > VecGetArray(f,&ff); > > /* Compute function */ > for (i=0; i<400; i++) ff[i] = tan(bb[i])-(A1/A2)*bb[i]; > > VecRestoreArrayRead(b,&bb); > VecRestoreArray(f,&ff); > return 0; > } > > PetscErrorCode FormInitialGuess(SNES snes,Vec b,void *ctx) > { > PetscScalar *bb; > PetscInt i; > PetscErrorCode ierr; > > VecGetArray(b,&bb); > > /* Compute initial guess */ > for (i=2; i<400; i++) > { > if (i==0) bb[i] = 0; > else if (i==1) bb[i] = 4.5; > else bb[i] = bb[i-1] + 3.2; > } > > VecRestoreArray(b,&bb); > return(0); > } > > > Here is the error come out: > > [0]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > [0]PETSC ERROR: Object is in wrong state > [0]PETSC ERROR: Matrix must be assembled by calls to > MatAssemblyBegin/End(); > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html > for trouble shooting. > [0]PETSC ERROR: Petsc Release Version 3.10.2, unknown > [0]PETSC ERROR: ./test2 on a arch-linux2-c-debug named G1ngy by barry > Thu Nov 22 21:07:18 2018 > [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ > --with-fc=gfortran --download-mpich --download-fblaslapack > [0]PETSC ERROR: #1 MatFDColoringCreate() line 469 in > /home/barry/petsc/src/mat/matfd/fdmatrix.c > [0]PETSC ERROR: #2 SNESComputeJacobianDefaultColor() line 84 in > /home/barry/petsc/src/snes/interface/snesj2.c > [0]PETSC ERROR: #3 SNESComputeJacobian() line 2555 in > /home/barry/petsc/src/snes/interface/snes.c > [0]PETSC ERROR: #4 SNESSolve_NEWTONLS() line 222 in > /home/barry/petsc/src/snes/impls/ls/ls.c > [0]PETSC ERROR: #5 SNESSolve() line 4396 in > /home/barry/petsc/src/snes/interface/snes.c > [0]PETSC ERROR: #6 main() line 89 in /home/barry/Desktop/petsc/test2.c > [0]PETSC ERROR: No PETSc Option Table entries > [0]PETSC ERROR: ----------------End of Error Message -------send entire > error message to petsc-maint at mcs.anl.gov---------- > application called MPI_Abort(MPI_COMM_WORLD, 73) - process 0 > [unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=73 > : > system msg for write_line failure : Bad file descriptor > > > Thank you, > > Barry > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From mfadams at lbl.gov Thu Nov 22 13:19:37 2018 From: mfadams at lbl.gov (Mark Adams) Date: Thu, 22 Nov 2018 14:19:37 -0500 Subject: [petsc-users] error messages while using snes In-Reply-To: References: Message-ID: Note, your FormInitialGuess does not initialize bb[0] and bb[1]. On Thu, Nov 22, 2018 at 11:23 AM barry via petsc-users < petsc-users at mcs.anl.gov> wrote: > Hi, > > I want to solve a nonlinear equation tan(x) = x, and the code i wrote > occur some error. > > In the error, it say that I need to use MatAssemblyBegin/End(), but > there aren't any Mat in my program. > > How can I fix it? > > This is what i write (major part): > > int main(){ > > ..... 
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From knepley at gmail.com  Thu Nov 22 13:19:16 2018
From: knepley at gmail.com (Matthew Knepley)
Date: Thu, 22 Nov 2018 14:19:16 -0500
Subject: [petsc-users] issue with DMPlexCreateFromCellListParallel
In-Reply-To: 
References: 
Message-ID: 

On Thu, Nov 22, 2018 at 1:28 PM Andrea Ceresoli via petsc-users wrote:

> Good evening,
>
> I'm trying to do a parallel assembly of two networks in PETSc, using
> power2.c as a template. Instead of reading data only from the rank 0
> process, I want two processes (rank 0 and rank 1) to read their own data
> (each process reads case9.m). [...]
>
> Do you have any suggestions? What do you think?

It looks like your input is wrong. Please have a look at

  src/dm/impls/plex/examples/tests/ex18.c

It has all our tests for parallel loading.

  Thanks,

     Matt

-- 
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
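The rules DMPlexCreateFromCellListParallel expects, sketched below assuming
the PETSc 3.10 signature: each rank passes a disjoint set of cells, cell
connectivity uses global vertex numbers, and each rank provides coordinates
for its own contiguous slice of the global vertex numbering. Two ranks that
both read case9.m hand in the same cells and vertices twice, so the global
numbering no longer adds up and vertices end up "unclaimed". A unit square
split into two triangles across exactly 2 ranks:

/* Hedged sketch; run with exactly 2 MPI ranks. Global vertices:
   0:(0,0) 1:(1,0) 2:(1,1) 3:(0,1); rank 0 owns v0,v1 and rank 1 owns v2,v3. */
#include <petscdmplex.h>

int main(int argc, char **argv)
{
  DM           dm;
  PetscSF      sfVert;
  PetscMPIInt  rank;
  const int    cells0[3]  = {0, 1, 2}, cells1[3] = {0, 2, 3};
  const double coords0[4] = {0., 0., 1., 0.};  /* coordinates of v0, v1 */
  const double coords1[4] = {1., 1., 0., 1.};  /* coordinates of v2, v3 */

  PetscInitialize(&argc, &argv, NULL, NULL);
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
  DMPlexCreateFromCellListParallel(PETSC_COMM_WORLD, 2 /* dim */,
                                   1 /* local cells */, 2 /* local vertices */,
                                   3 /* corners */, PETSC_TRUE,
                                   rank ? cells1 : cells0, 2 /* spaceDim */,
                                   rank ? coords1 : coords0, &sfVert, &dm);
  PetscSFDestroy(&sfVert);
  DMDestroy(&dm);
  PetscFinalize();
  return 0;
}

Every global vertex appears in at least one cell, so none is left unclaimed;
duplicating the input on both ranks breaks exactly this invariant.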
From k_burkart at yahoo.com  Fri Nov 23 08:44:19 2018
From: k_burkart at yahoo.com (Klaus Burkart)
Date: Fri, 23 Nov 2018 14:44:19 +0000 (UTC)
Subject: [petsc-users] Incorrect local row ranges allocated to the processes i.e. rstart and rend are not what I expected
Message-ID: <1721296158.1665556.1542984259829@mail.yahoo.com>

Hello,

I am trying to compute the local row ranges allocated to the processes, i.e.
rstart and rend of each process, needed as a prerequisite for
MatMPIAIJSetPreallocation using d_nnz and o_nnz.

I tried the following:

...

    PetscInitialize(0,0,PETSC_NULL,PETSC_NULL);

    MPI_Comm_size(PETSC_COMM_WORLD,&size);
    MPI_Comm_rank(PETSC_COMM_WORLD,&rank);

    MatCreate(PETSC_COMM_WORLD,&A);
    MatSetType(A,MATMPIAIJ);
    PetscInt local_size = PETSC_DECIDE;
    PetscSplitOwnership(PETSC_COMM_WORLD, &local_size, &N);
    MPI_Scan(&local_size, &rend, 1, MPIU_INT, MPI_SUM, PETSC_COMM_WORLD);
    rstart = rend - local_size;
    PetscInt d_nnz[local_size], o_nnz[local_size];
/*
compute d_nnz and o_nnz here

    MatMPIAIJSetPreallocation(A,0,d_nnz,0,o_nnz);
*/

    for (rank = 0; rank < size; rank++) {
    PetscPrintf(PETSC_COMM_WORLD,"local_size = %d, on process %d\n", local_size, rank);
    PetscPrintf(PETSC_COMM_WORLD,"rstart = %d, on process %d\n", rstart, rank);
    PetscPrintf(PETSC_COMM_WORLD,"rend = %d, on process %d\n", rend, rank);
    }

    PetscFinalize();

The local size is 25 rows on each process, but rstart and rend are 0 and 25
on all processes; I expected 0 and 25, 25 and 50, 50 and 75, and 75 and 100.
N = 100.

I can't spot the error. Any ideas what the problem is?

Klaus

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
From knepley at gmail.com  Fri Nov 23 10:55:12 2018
From: knepley at gmail.com (Matthew Knepley)
Date: Fri, 23 Nov 2018 11:55:12 -0500
Subject: [petsc-users] Incorrect local row ranges allocated to the processes i.e. rstart and rend are not what I expected
In-Reply-To: <1721296158.1665556.1542984259829@mail.yahoo.com>
References: <1721296158.1665556.1542984259829.ref@mail.yahoo.com> <1721296158.1665556.1542984259829@mail.yahoo.com>
Message-ID: 

On Fri, Nov 23, 2018 at 9:54 AM Klaus Burkart via petsc-users wrote:

> Hello,
>
> I am trying to compute the local row ranges allocated to the processes,
> i.e. rstart and rend of each process, needed as a prerequisite for
> MatMPIAIJSetPreallocation using d_nnz and o_nnz.
>
> [...]
>
>     PetscSplitOwnership(PETSC_COMM_WORLD, &local_size, &N);
>     MPI_Scan(&local_size, &rend, 1, MPIU_INT, MPI_SUM, PETSC_COMM_WORLD);

This looks right to me. Not sure what your problem is. However, you can
always use PetscLayout to do this automatically.

  Thanks,

     Matt

> [...]
>
> I can't spot the error. Any ideas what the problem is?
>
> Klaus

-- 
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
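A minimal sketch of the PetscLayout route Matt mentions; PETSc computes
rstart/rend itself, so no MPI_Scan is needed. N = 100 mirrors the thread,
and the print format is an illustration, not part of the suggestion:

#include <petscvec.h>

int main(int argc, char **argv)
{
  PetscLayout map;
  PetscInt    N = 100, rstart, rend, local_size;
  PetscMPIInt rank;

  PetscInitialize(&argc, &argv, NULL, NULL);
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
  PetscLayoutCreate(PETSC_COMM_WORLD, &map);
  PetscLayoutSetSize(map, N);                 /* global size */
  PetscLayoutSetLocalSize(map, PETSC_DECIDE); /* let PETSc split it up */
  PetscLayoutSetUp(map);
  PetscLayoutGetRange(map, &rstart, &rend);
  PetscLayoutGetLocalSize(map, &local_size);
  PetscSynchronizedPrintf(PETSC_COMM_WORLD, "rank %d: local_size = %D, rstart = %D, rend = %D\n", rank, local_size, rstart, rend);
  PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);
  PetscLayoutDestroy(&map);
  PetscFinalize();
  return 0;
}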
From zakaryah at gmail.com  Fri Nov 23 11:02:31 2018
From: zakaryah at gmail.com (zakaryah)
Date: Fri, 23 Nov 2018 12:02:31 -0500
Subject: [petsc-users] Incorrect local row ranges allocated to the processes i.e. rstart and rend are not what I expected
In-Reply-To: 
References: <1721296158.1665556.1542984259829.ref@mail.yahoo.com> <1721296158.1665556.1542984259829@mail.yahoo.com>
Message-ID: 

What does your output look like? With PETSC_COMM_WORLD, I think you will
only get the output from the rank 0 process. Try with PETSC_COMM_SELF.

On Fri, Nov 23, 2018, 11:56 AM Matthew Knepley via petsc-users wrote:

> On Fri, Nov 23, 2018 at 9:54 AM Klaus Burkart via petsc-users wrote:
>> Hello,
>>
>> I am trying to compute the local row ranges allocated to the processes,
>> i.e. rstart and rend of each process, needed as a prerequisite for
>> MatMPIAIJSetPreallocation using d_nnz and o_nnz.
>>
>> [...]

From k_burkart at yahoo.com  Fri Nov 23 12:04:47 2018
From: k_burkart at yahoo.com (Klaus Burkart)
Date: Fri, 23 Nov 2018 18:04:47 +0000 (UTC)
Subject: [petsc-users] Incorrect local row ranges allocated to the processes i.e. rstart and rend are not what I expected
In-Reply-To: 
References: <1721296158.1665556.1542984259829.ref@mail.yahoo.com> <1721296158.1665556.1542984259829@mail.yahoo.com>
Message-ID: <1384144592.1738166.1542996287805@mail.yahoo.com>

The output is:

local_size = 25, on process 0
rstart = 0, on process 0
rend = 25, on process 0
local_size = 25, on process 1
rstart = 0, on process 1
rend = 25, on process 1
local_size = 25, on process 2
rstart = 0, on process 2
rend = 25, on process 2
local_size = 25, on process 3
rstart = 0, on process 3
rend = 25, on process 3

Using PETSC_COMM_SELF has no effect.

On Friday, 23 November 2018, 18:02:45 CET, zakaryah wrote:

  What does your output look like? With PETSC_COMM_WORLD, I think you will
  only get the output from the rank 0 process. Try with PETSC_COMM_SELF.

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
From knepley at gmail.com  Fri Nov 23 12:25:13 2018
From: knepley at gmail.com (Matthew Knepley)
Date: Fri, 23 Nov 2018 13:25:13 -0500
Subject: [petsc-users] Incorrect local row ranges allocated to the processes i.e. rstart and rend are not what I expected
In-Reply-To: <1384144592.1738166.1542996287805@mail.yahoo.com>
References: <1721296158.1665556.1542984259829.ref@mail.yahoo.com> <1721296158.1665556.1542984259829@mail.yahoo.com> <1384144592.1738166.1542996287805@mail.yahoo.com>
Message-ID: 

The other possibility is that you are using the "wrong" mpiexec.

  Matt

On Fri, Nov 23, 2018 at 1:04 PM Klaus Burkart via petsc-users wrote:

> The output is:
>
> local_size = 25, on process 0
> rstart = 0, on process 0
> rend = 25, on process 0
>
> [... the same values repeat for processes 1-3 ...]
>
> Using PETSC_COMM_SELF has no effect.

-- 
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
From bsmith at mcs.anl.gov  Fri Nov 23 12:51:23 2018
From: bsmith at mcs.anl.gov (Smith, Barry F.)
Date: Fri, 23 Nov 2018 18:51:23 +0000
Subject: [petsc-users] Incorrect local row ranges allocated to the processes i.e. rstart and rend are not what I expected
In-Reply-To: <1721296158.1665556.1542984259829@mail.yahoo.com>
References: <1721296158.1665556.1542984259829.ref@mail.yahoo.com> <1721296158.1665556.1542984259829@mail.yahoo.com>
Message-ID: <961C0F6D-CE92-4A29-AC6E-878810E9A0FB@anl.gov>

   The correct answer is computed, but you are printing out the answer all
wrong.

   For PetscPrintf(PETSC_COMM_WORLD) only the FIRST process ever prints
anything, so you are having the first process print out the same values
repeatedly.

   Don't have the loop over size in the code. You can use
PetscSynchronizedPrintf() to have each process print its own values.

   Barry

> On Nov 23, 2018, at 6:44 AM, Klaus Burkart via petsc-users wrote:
>
> Hello,
>
> I am trying to compute the local row ranges allocated to the processes,
> i.e. rstart and rend of each process, needed as a prerequisite for
> MatMPIAIJSetPreallocation using d_nnz and o_nnz. [...]
From k_burkart at yahoo.com  Fri Nov 23 13:39:46 2018
From: k_burkart at yahoo.com (Klaus Burkart)
Date: Fri, 23 Nov 2018 19:39:46 +0000 (UTC)
Subject: [petsc-users] Incorrect local row ranges allocated to the processes i.e. rstart and rend are not what I expected
In-Reply-To: <961C0F6D-CE92-4A29-AC6E-878810E9A0FB@anl.gov>
References: <1721296158.1665556.1542984259829.ref@mail.yahoo.com> <1721296158.1665556.1542984259829@mail.yahoo.com> <961C0F6D-CE92-4A29-AC6E-878810E9A0FB@anl.gov>
Message-ID: <1662654833.1777263.1543001986480@mail.yahoo.com>

    PetscInitialize(0,0,PETSC_NULL,PETSC_NULL);

    MPI_Comm_size(PETSC_COMM_WORLD,&size);
    MPI_Comm_rank(PETSC_COMM_WORLD,&rank);

    MatCreate(PETSC_COMM_WORLD,&A);
    //MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,N,N);
    MatSetType(A,MATMPIAIJ);
    PetscInt local_size = PETSC_DECIDE;
    PetscSplitOwnership(PETSC_COMM_WORLD, &local_size, &N);
    MPI_Scan(&local_size, &rend, 1, MPIU_INT, MPI_SUM, PETSC_COMM_WORLD);
    rstart = rend - local_size;
    PetscInt d_nnz[local_size], o_nnz[local_size];
/*
compute d_nnz and o_nnz here

    MatMPIAIJSetPreallocation(A,0,d_nnz,0,o_nnz);
*/

//***

    PetscSynchronizedPrintf(PETSC_COMM_WORLD,"local_size = %d, on process %d\n", local_size, rank);
    PetscSynchronizedPrintf(PETSC_COMM_WORLD,"rstart = %d, on process %d\n", rstart, rank);
    PetscSynchronizedPrintf(PETSC_COMM_WORLD,"rend = %d, on process %d\n", rend, rank);

    PetscFinalize();

Gives me:

local_size = 25, on process 0
rstart = 0, on process 0
rend = 25, on process 0

but there are 4 processes.

On Friday, 23 November 2018, 19:51:26 CET, Smith, Barry F. wrote:

  The correct answer is computed, but you are printing out the answer all
  wrong.

  [...]
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From dave.mayhem23 at gmail.com  Fri Nov 23 14:24:12 2018
From: dave.mayhem23 at gmail.com (Dave May)
Date: Fri, 23 Nov 2018 20:24:12 +0000
Subject: [petsc-users] Incorrect local row ranges allocated to the processes i.e. rstart and rend are not what I expected
In-Reply-To: <1662654833.1777263.1543001986480@mail.yahoo.com>
References: <1721296158.1665556.1542984259829.ref@mail.yahoo.com> <1721296158.1665556.1542984259829@mail.yahoo.com> <961C0F6D-CE92-4A29-AC6E-878810E9A0FB@anl.gov> <1662654833.1777263.1543001986480@mail.yahoo.com>
Message-ID: 

On Fri, 23 Nov 2018 at 19:39, Klaus Burkart via petsc-users wrote:

>     PetscSynchronizedPrintf(PETSC_COMM_WORLD,"local_size = %d, on process %d\n", local_size, rank);
>     PetscSynchronizedPrintf(PETSC_COMM_WORLD,"rstart = %d, on process %d\n", rstart, rank);
>     PetscSynchronizedPrintf(PETSC_COMM_WORLD,"rend = %d, on process %d\n", rend, rank);
>
> [...]
>
> Gives me:
>
> local_size = 25, on process 0
> rstart = 0, on process 0
> rend = 25, on process 0
>
> but there are 4 processes.

Please read the manual page:

https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Sys/PetscSynchronizedPrintf.html

It explicitly states "REQUIRES a call to PetscSynchronizedFlush() by all the
processes after the completion of the calls to PetscSynchronizedPrintf() for
the information from all the processors to be printed."

Thanks,
Dave

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
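Putting Barry's and Dave's advice together, the corrected printing is a
two-step pattern: every rank calls PetscSynchronizedPrintf once (no loop over
size), and then all ranks call the collective flush so the output of ranks
1..size-1 actually appears. A minimal sketch, reusing the variable names from
Klaus's snippet:

/* Each rank prints its own values ... */
PetscSynchronizedPrintf(PETSC_COMM_WORLD, "local_size = %d, rstart = %d, rend = %d, on process %d\n", local_size, rstart, rend, rank);
/* ... and then everyone flushes; this call is collective over the comm. */
PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);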
You can use > PetscSynchronizedPrintf() to have each process print its own values. > > Barry > > > > On Nov 23, 2018, at 6:44 AM, Klaus Burkart via petsc-users < > petsc-users at mcs.anl.gov> wrote: > > > > Hello, > > > > I am trying to compute the local row ranges allocated to the processes > i.e. rstart and rend of each process, needed as a prerequisite for > MatMPIAIJSetPreallocation using d_nnz and o_nnz. > > > > I tried the following: > > > > ... > > > > PetscInitialize(0,0,PETSC_NULL,PETSC_NULL); > > > > MPI_Comm_size(PETSC_COMM_WORLD,&size); > > MPI_Comm_rank(PETSC_COMM_WORLD,&rank); > > > > MatCreate(PETSC_COMM_WORLD,&A); > > MatSetType(A,MATMPIAIJ); > > PetscInt local_size = PETSC_DECIDE; > > PetscSplitOwnership(PETSC_COMM_WORLD, &local_size, &N); > > MPI_Scan(&local_size, &rend, 1, MPIU_INT, MPI_SUM, PETSC_COMM_WORLD); > > rstart = rend - local_size; > > PetscInt d_nnz[local_size], o_nnz[local_size]; > > /* > > > > compute d_nnz and o_nnz here > > > > MatMPIAIJSetPreallocation(A,0,d_nnz,0,o_nnz); > > */ > > > > for (rank = 0; rank < size; rank++) { > > PetscPrintf(PETSC_COMM_WORLD,"local_size = %d, on process %d\n", > local_size, rank); > > PetscPrintf(PETSC_COMM_WORLD,"rstart = %d, on process %d\n", rstart, > rank); > > PetscPrintf(PETSC_COMM_WORLD,"rend = %d, on process %d\n", rend, > rank); > > } > > > > PetscFinalize(); > > > > The local size is 25 rows on each process but rstart and rend are 0 and > 25 on all processes, I expected 0 and 25, 25 and 50, 50 and 75 and 75 and > 101. N = 100 > > > > I can't spot the error. Any ideas, what's the problem? > > > > Klaus > -------------- next part -------------- An HTML attachment was scrubbed... URL: From Jiaoyan.Li at inl.gov Mon Nov 26 10:41:29 2018 From: Jiaoyan.Li at inl.gov (Jiaoyan Li) Date: Mon, 26 Nov 2018 16:41:29 +0000 Subject: [petsc-users] PetscPartitioner is missing for fortran Message-ID: Dear Petsc Users: I am developing a Fortran code which uses Petsc APIs. But, seems the interface between Fortran and Petsc is not completed, as replied by Barry. Is there anyone may have some experience on building the Fortran interface for Petsc? Any suggestions or comments are highly appreciated. Thank you. Have a nice day, Jiaoyan ------------------------------------------------ On 11/21/18, 19:25, "Smith, Barry F." > wrote: Matt, PetscPartitioner is missing from lib/petsc/conf/bfort-petsc.txt Barry > On Nov 21, 2018, at 3:33 PM, Jiaoyan Li via petsc-users > wrote: > > Dear Petsc users: > > I am trying to use Petsc APIs for Fortran. One problem that I am facing right now is about the PetscPartitioner, i.e., > > #include ?petsc/finclude/petsc.h? > use petscdmplex > > PetscPartitioner :: part > PetscErrorCode :: ierr > > Call PetscPartitionerCreate(PETSC_COMM_WORLD, part, ierr) > > But, I got the error message as follows: > > [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [0]PETSC ERROR: Null argument, when expecting valid pointer > [0]PETSC ERROR: Null Pointer: Parameter # 2 > [0]PETSC ERROR: See https://urldefense.proofpoint.com/v2/url?u=http-3A__www.mcs.anl.gov_petsc_documentation_faq.html&d=DwIGaQ&c=54IZrppPQZKX9mLzcGdPfFD1hxrcB__aEkJFOKJFd00&r=5MMpjBrVOPpVGfIH9op1r4nz1k4YC8LDRnpo_HwMgZU&m=1lomdbavAvxQQpe-IZtEv3xEovYeZ9lxbOzN-sE8CUQ&s=BYCTTDqEflIdLwRAkF4txknqLg0jeyOcdodQkfHj-TA&e= for trouble shooting. 
> [0]PETSC ERROR: Petsc Release Version 3.10.2, unknown > [0]PETSC ERROR: ./htm3d on a arch-linux2-c-opt named fn607018 by LIJ Wed Nov 21 16:30:35 2018 > [0]PETSC ERROR: Configure options --download-fblaslapack --with-mpi-dir=/opt/mpitch-3.2.1 -download-exodusii --download-hdf5 --download-netcdf --download-zlib --download-pnetcdf > [0]PETSC ERROR: #1 PetscPartitionerCreate() line 601 in /home/lij/packages/petsc/src/dm/impls/plex/plexpartition.c > [1]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [1]PETSC ERROR: Null argument, when expecting valid pointer > [1]PETSC ERROR: Null Pointer: Parameter # 2 > [1]PETSC ERROR: See https://urldefense.proofpoint.com/v2/url?u=http-3A__www.mcs.anl.gov_petsc_documentation_faq.html&d=DwIGaQ&c=54IZrppPQZKX9mLzcGdPfFD1hxrcB__aEkJFOKJFd00&r=5MMpjBrVOPpVGfIH9op1r4nz1k4YC8LDRnpo_HwMgZU&m=1lomdbavAvxQQpe-IZtEv3xEovYeZ9lxbOzN-sE8CUQ&s=BYCTTDqEflIdLwRAkF4txknqLg0jeyOcdodQkfHj-TA&e= for trouble shooting. > [1]PETSC ERROR: Petsc Release Version 3.10.2, unknown > [1]PETSC ERROR: ./htm3d on a arch-linux2-c-opt named fn607018 by LIJ Wed Nov 21 16:30:35 2018 > [1]PETSC ERROR: Configure options --download-fblaslapack --with-mpi-dir=/opt/mpitch-3.2.1 -download-exodusii --download-hdf5 --download-netcdf --download-zlib --download-pnetcdf > [1]PETSC ERROR: #1 PetscPartitionerCreate() line 601 in /home/lij/packages/petsc/src/dm/impls/plex/plexpartition.c > [2]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [2]PETSC ERROR: Null argument, when expecting valid pointer > [2]PETSC ERROR: Null Pointer: Parameter # 2 > [2]PETSC ERROR: See https://urldefense.proofpoint.com/v2/url?u=http-3A__www.mcs.anl.gov_petsc_documentation_faq.html&d=DwIGaQ&c=54IZrppPQZKX9mLzcGdPfFD1hxrcB__aEkJFOKJFd00&r=5MMpjBrVOPpVGfIH9op1r4nz1k4YC8LDRnpo_HwMgZU&m=1lomdbavAvxQQpe-IZtEv3xEovYeZ9lxbOzN-sE8CUQ&s=BYCTTDqEflIdLwRAkF4txknqLg0jeyOcdodQkfHj-TA&e= for trouble shooting. > [2]PETSC ERROR: Petsc Release Version 3.10.2, unknown > [2]PETSC ERROR: ./htm3d on a arch-linux2-c-opt named fn607018 by LIJ Wed Nov 21 16:30:35 2018 > [2]PETSC ERROR: Configure options --download-fblaslapack --with-mpi-dir=/opt/mpitch-3.2.1 -download-exodusii --download-hdf5 --download-netcdf --download-zlib --download-pnetcdf > [2]PETSC ERROR: #1 PetscPartitionerCreate() line 601 in /home/lij/packages/petsc/src/dm/impls/plex/plexpartition.c > [3]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [3]PETSC ERROR: Null argument, when expecting valid pointer > [3]PETSC ERROR: Null Pointer: Parameter # 2 > [3]PETSC ERROR: See https://urldefense.proofpoint.com/v2/url?u=http-3A__www.mcs.anl.gov_petsc_documentation_faq.html&d=DwIGaQ&c=54IZrppPQZKX9mLzcGdPfFD1hxrcB__aEkJFOKJFd00&r=5MMpjBrVOPpVGfIH9op1r4nz1k4YC8LDRnpo_HwMgZU&m=1lomdbavAvxQQpe-IZtEv3xEovYeZ9lxbOzN-sE8CUQ&s=BYCTTDqEflIdLwRAkF4txknqLg0jeyOcdodQkfHj-TA&e= for trouble shooting. > [3]PETSC ERROR: Petsc Release Version 3.10.2, unknown > [3]PETSC ERROR: ./htm3d on a arch-linux2-c-opt named fn607018 by LIJ Wed Nov 21 16:30:35 2018 > [3]PETSC ERROR: Configure options --download-fblaslapack --with-mpi-dir=/opt/mpitch-3.2.1 -download-exodusii --download-hdf5 --download-netcdf --download-zlib --download-pnetcdf > [3]PETSC ERROR: #1 PetscPartitionerCreate() line 601 in /home/lij/packages/petsc/src/dm/impls/plex/plexpartition.c > > Is there anyone who may encounter similar problem before? 
Any suggestions or commends are highly appreciated. Thank you very much. > > Best, > > Jiaoyan -------------- next part -------------- An HTML attachment was scrubbed... URL: From huq2090 at gmail.com Mon Nov 26 21:30:56 2018 From: huq2090 at gmail.com (Fazlul Huq) Date: Mon, 26 Nov 2018 21:30:56 -0600 Subject: [petsc-users] Example 23 of ksp problems Message-ID: Hello PETSc developers, I went through the ex23.c of ksp section attached herewith but I don't understand the following part: *********************************************** tol=1000.*PETSC_MACHINE_EPSILON ************************************************ and, ************************************************ ierr = VecAXPY(x,-1.0,u);CHKERRQ(ierr); ierr = VecNorm(x,NORM_2,&norm);CHKERRQ(ierr); ierr = KSPGetIterationNumber(ksp,&its);CHKERRQ(ierr); if (norm > tol) { ierr = PetscPrintf(PETSC_COMM_WORLD,"Norm of error %g, Iterations %D\n", (double)norm,its);CHKERRQ(ierr); } ************************************************ I don't understand what is "tol" here and "*PETSC_MACHINE_EPSILON"? The if condition is also not clear to me. Thanks. Sincerely, Huq -- Fazlul Huq Graduate Research Assistant Department of Nuclear, Plasma & Radiological Engineering (NPRE) University of Illinois at Urbana-Champaign (UIUC) E-mail: huq2090 at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: ex23.c Type: text/x-csrc Size: 7674 bytes Desc: not available URL: From huq2090 at gmail.com Mon Nov 26 21:31:58 2018 From: huq2090 at gmail.com (Fazlul Huq) Date: Mon, 26 Nov 2018 21:31:58 -0600 Subject: [petsc-users] Problem to configure In-Reply-To: References: Message-ID: It works! Thanks. Sincerely, Huq On Tue, Nov 20, 2018 at 6:17 PM Balay, Satish wrote: > Try: > > rm -rf arch-linux2-c-debug > > And rebuild petsc. ie. redo configure and make > > Satish > > We don't know why this On Tue, 20 Nov 2018, Fazlul Huq via petsc-users > wrote: > > > Thanks. > > ./configure --download-hypre --download-mpich works well. 
> > But when I run the follwoing: > > make PETSC_DIR=/home/huq2090/petsc-3.10.2 PETSC_ARCH=arch-linux2-c-debug > all > > > > I got following errors: > > > $$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$ > > CLINKER arch-linux2-c-debug/lib/libpetsc.so.3.10.2 > > arch-linux2-c-debug/obj/mat/impls/baij/seq/baij.o: In function > > `MatCreate_SeqBAIJ': > > /home/huq2090/petsc-3.10.2/src/mat/impls/baij/seq/baij.c:3049: undefined > > reference to `MatConvert_AIJ_HYPRE' > > /usr/bin/ld: arch-linux2-c-debug/obj/mat/impls/baij/seq/baij.o: > relocation > > R_X86_64_PC32 against undefined hidden symbol `MatConvert_AIJ_HYPRE' can > > not be used when making a shared object > > /usr/bin/ld: final link failed: Bad value > > collect2: error: ld returned 1 exit status > > gmakefile:86: recipe for target > > 'arch-linux2-c-debug/lib/libpetsc.so.3.10.2' failed > > make[2]: *** [arch-linux2-c-debug/lib/libpetsc.so.3.10.2] Error 1 > > make[2]: Leaving directory '/home/huq2090/petsc-3.10.2' > > /home/huq2090/petsc-3.10.2/lib/petsc/conf/rules:81: recipe for target > > 'gnumake' failed > > make[1]: *** [gnumake] Error 2 > > make[1]: Leaving directory '/home/huq2090/petsc-3.10.2' > > **************************ERROR************************************* > > Error during compile, check arch-linux2-c-debug/lib/petsc/conf/make.log > > Send it and arch-linux2-c-debug/lib/petsc/conf/configure.log to > > petsc-maint at mcs.anl.gov > > ******************************************************************** > > makefile:30: recipe for target 'all' failed > > make: *** [all] Error 1 > > > $$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$ > > > > Again, the log file is attached herewith. > > > > Note: I have moose installed in my laptop but I have commented out from > > bashrc file > > #moose > > #if [ -f /opt/moose/environments/moose_profile ]; then > > # . /opt/moose/environments/moose_profile > > #fi > > > > Thanks. > > Sincerely, > > Huq > > > > On Tue, Nov 20, 2018 at 4:59 PM Balay, Satish wrote: > > > > > As the message suggests - you don't have MPI installed - so run: > > > > > > ./configure --download-hypre --download-mpich > > > > > > Satish > > > > > > On Tue, 20 Nov 2018, Fazlul Huq via petsc-users wrote: > > > > > > > Hello PETSc Developers, > > > > > > > > I am trying to configure PETSc with ./configure --download-hypre > > > > but I got the following message on terminal: > > > > > > > > > ======================================================================== > > > > Configuring PETSc to compile on your system > > > > > > > > > ========================================================================= > > > > TESTING: check from > > > > config.libraries(config/BuildSystem/config/libraries.py:158) > > > > > > > > > > > > ******************************************************************************* > > > > UNABLE to CONFIGURE with GIVEN OPTIONS (see > configure.log for > > > > details): > > > > > > > > ------------------------------------------------------------------------------- > > > > Unable to find mpi in default locations! 
> > > > Perhaps you can specify with --with-mpi-dir= > > > > If you do not want MPI, then give --with-mpi=0 > > > > You might also consider using --download-mpich instead > > > > > > > > ******************************************************************************* > > > > > > > > The log file is attached herewith. > > > > > > > > Thanks. > > > > Sincerely, > > > > Huq > > > > > > > > > > > > > > > > -- Fazlul Huq Graduate Research Assistant Department of Nuclear, Plasma & Radiological Engineering (NPRE) University of Illinois at Urbana-Champaign (UIUC) E-mail: huq2090 at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From hzhang at mcs.anl.gov Mon Nov 26 22:57:57 2018 From: hzhang at mcs.anl.gov (Zhang, Hong) Date: Tue, 27 Nov 2018 04:57:57 +0000 Subject: [petsc-users] Example 23 of ksp problems In-Reply-To: References: Message-ID: Huq: # define PETSC_MACHINE_EPSILON 2.2204460492503131e-16 See /petsc/include/petscmath.h Hong Hello PETSc developers, I went through the ex23.c of ksp section attached herewith but I don't understand the following part: *********************************************** tol=1000.*PETSC_MACHINE_EPSILON ************************************************ and, ************************************************ ierr = VecAXPY(x,-1.0,u);CHKERRQ(ierr); ierr = VecNorm(x,NORM_2,&norm);CHKERRQ(ierr); ierr = KSPGetIterationNumber(ksp,&its);CHKERRQ(ierr); if (norm > tol) { ierr = PetscPrintf(PETSC_COMM_WORLD,"Norm of error %g, Iterations %D\n", (double)norm,its);CHKERRQ(ierr); } ************************************************ I don't understand what is "tol" here and "*PETSC_MACHINE_EPSILON"? The if condition is also not clear to me. Thanks. Sincerely, Huq -- Fazlul Huq Graduate Research Assistant Department of Nuclear, Plasma & Radiological Engineering (NPRE) University of Illinois at Urbana-Champaign (UIUC) E-mail: huq2090 at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From huq2090 at gmail.com Mon Nov 26 23:08:02 2018 From: huq2090 at gmail.com (Fazlul Huq) Date: Mon, 26 Nov 2018 23:08:02 -0600 Subject: [petsc-users] Example 23 of ksp problems In-Reply-To: References: Message-ID: Thanks. When I checked the file I got following values (Link: file:///home/huq2090/petsc-3.10.2/include/petscmath.h.html). I don't know why I should consider the value you have mentioned? # define PETSC_MACHINE_EPSILON 1.19209290e-07F # define PETSC_MACHINE_EPSILON 2.2204460492503131e-16 # define PETSC_MACHINE_EPSILON FLT128_EPSILON # define PETSC_MACHINE_EPSILON .00097656 Again, I don't understand the following part: *********************************************** tol=1000.*PETSC_MACHINE_EPSILON ************************************************ and, ************************************************ ierr = VecAXPY(x,-1.0,u);CHKERRQ(ierr); ierr = VecNorm(x,NORM_2,&norm);CHKERRQ(ierr); ierr = KSPGetIterationNumber(ksp,&its);CHKERRQ(ierr); if (norm > tol) { ierr = PetscPrintf(PETSC_COMM_WORLD,"Norm of error %g, Iterations %D\n", (double)norm,its);CHKERRQ(ierr); } ************************************************ I don't understand what is "tol" here? The if condition is also not clear to me. Thanks. 
Sincerely, Huq On Mon, Nov 26, 2018 at 10:58 PM Zhang, Hong wrote: > Huq: > # define PETSC_MACHINE_EPSILON 2.2204460492503131e-16 > See /petsc/include/petscmath.h > Hong > > Hello PETSc developers, >> >> I went through the ex23.c of ksp section attached herewith but I don't >> understand the following part: >> *********************************************** >> tol=1000.*PETSC_MACHINE_EPSILON >> ************************************************ >> and, >> ************************************************ >> ierr = VecAXPY(x,-1.0,u);CHKERRQ(ierr); >> ierr = VecNorm(x,NORM_2,&norm);CHKERRQ(ierr); >> ierr = KSPGetIterationNumber(ksp,&its);CHKERRQ(ierr); >> if (norm > tol) { >> ierr = PetscPrintf(PETSC_COMM_WORLD,"Norm of error %g, Iterations >> %D\n", (double)norm,its);CHKERRQ(ierr); >> } >> ************************************************ >> I don't understand what is "tol" here and "*PETSC_MACHINE_EPSILON"? >> The if condition is also not clear to me. >> >> Thanks. >> Sincerely, >> Huq >> -- >> >> Fazlul Huq >> Graduate Research Assistant >> Department of Nuclear, Plasma & Radiological Engineering (NPRE) >> University of Illinois at Urbana-Champaign (UIUC) >> E-mail: huq2090 at gmail.com >> > -- Fazlul Huq Graduate Research Assistant Department of Nuclear, Plasma & Radiological Engineering (NPRE) University of Illinois at Urbana-Champaign (UIUC) E-mail: huq2090 at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From C.Klaij at marin.nl Tue Nov 27 01:44:19 2018 From: C.Klaij at marin.nl (Klaij, Christiaan) Date: Tue, 27 Nov 2018 07:44:19 +0000 Subject: [petsc-users] PetscPartitioner is missing for fortran In-Reply-To: References: Message-ID: <1543304659142.46737@marin.nl> Hi Jiaoyan, I've been using the fortran interface since 2004. During these years I only found a handful of things missing. After an email to this list the developers are usually quite fast in fixing the problem. Chris Date: Mon, 26 Nov 2018 16:41:29 +0000 From: Jiaoyan Li To: "petsc-users at mcs.anl.gov" Subject: [petsc-users] PetscPartitioner is missing for fortran Message-ID: Content-Type: text/plain; charset="utf-8" Dear Petsc Users: I am developing a Fortran code which uses Petsc APIs. But, seems the interface between Fortran and Petsc is not completed, as replied by Barry. Is there anyone may have some experience on building the Fortran interface for Petsc? Any suggestions or comments are highly appreciated. Thank you. Have a nice day, Jiaoyan ------------------------------------------------ On 11/21/18, 19:25, "Smith, Barry F." > wrote: Matt, PetscPartitioner is missing from lib/petsc/conf/bfort-petsc.txt Barry > On Nov 21, 2018, at 3:33 PM, Jiaoyan Li via petsc-users > wrote: > > Dear Petsc users: > > I am trying to use Petsc APIs for Fortran. One problem that I am facing right now is about the PetscPartitioner, i.e., > > #include ?petsc/finclude/petsc.h? 
> use petscdmplex > > PetscPartitioner :: part > PetscErrorCode :: ierr > > Call PetscPartitionerCreate(PETSC_COMM_WORLD, part, ierr) > > But, I got the error message as follows: > > [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [0]PETSC ERROR: Null argument, when expecting valid pointer > [0]PETSC ERROR: Null Pointer: Parameter # 2 > [0]PETSC ERROR: See https://urldefense.proofpoint.com/v2/url?u=http-3A__www.mcs.anl.gov_petsc_documentation_faq.html&d=DwIGaQ&c=54IZrppPQZKX9mLzcGdPfFD1hxrcB__aEkJFOKJFd00&r=5MMpjBrVOPpVGfIH9op1r4nz1k4YC8LDRnpo_HwMgZU&m=1lomdbavAvxQQpe-IZtEv3xEovYeZ9lxbOzN-sE8CUQ&s=BYCTTDqEflIdLwRAkF4txknqLg0jeyOcdodQkfHj-TA&e= for trouble shooting. > [0]PETSC ERROR: Petsc Release Version 3.10.2, unknown > [0]PETSC ERROR: ./htm3d on a arch-linux2-c-opt named fn607018 by LIJ Wed Nov 21 16:30:35 2018 > [0]PETSC ERROR: Configure options --download-fblaslapack --with-mpi-dir=/opt/mpitch-3.2.1 -download-exodusii --download-hdf5 --download-netcdf --download-zlib --download-pnetcdf > [0]PETSC ERROR: #1 PetscPartitionerCreate() line 601 in /home/lij/packages/petsc/src/dm/impls/plex/plexpartition.c > [1]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [1]PETSC ERROR: Null argument, when expecting valid pointer > [1]PETSC ERROR: Null Pointer: Parameter # 2 > [1]PETSC ERROR: See https://urldefense.proofpoint.com/v2/url?u=http-3A__www.mcs.anl.gov_petsc_documentation_faq.html&d=DwIGaQ&c=54IZrppPQZKX9mLzcGdPfFD1hxrcB__aEkJFOKJFd00&r=5MMpjBrVOPpVGfIH9op1r4nz1k4YC8LDRnpo_HwMgZU&m=1lomdbavAvxQQpe-IZtEv3xEovYeZ9lxbOzN-sE8CUQ&s=BYCTTDqEflIdLwRAkF4txknqLg0jeyOcdodQkfHj-TA&e= for trouble shooting. > [1]PETSC ERROR: Petsc Release Version 3.10.2, unknown > [1]PETSC ERROR: ./htm3d on a arch-linux2-c-opt named fn607018 by LIJ Wed Nov 21 16:30:35 2018 > [1]PETSC ERROR: Configure options --download-fblaslapack --with-mpi-dir=/opt/mpitch-3.2.1 -download-exodusii --download-hdf5 --download-netcdf --download-zlib --download-pnetcdf > [1]PETSC ERROR: #1 PetscPartitionerCreate() line 601 in /home/lij/packages/petsc/src/dm/impls/plex/plexpartition.c > [2]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [2]PETSC ERROR: Null argument, when expecting valid pointer > [2]PETSC ERROR: Null Pointer: Parameter # 2 > [2]PETSC ERROR: See https://urldefense.proofpoint.com/v2/url?u=http-3A__www.mcs.anl.gov_petsc_documentation_faq.html&d=DwIGaQ&c=54IZrppPQZKX9mLzcGdPfFD1hxrcB__aEkJFOKJFd00&r=5MMpjBrVOPpVGfIH9op1r4nz1k4YC8LDRnpo_HwMgZU&m=1lomdbavAvxQQpe-IZtEv3xEovYeZ9lxbOzN-sE8CUQ&s=BYCTTDqEflIdLwRAkF4txknqLg0jeyOcdodQkfHj-TA&e= for trouble shooting. 
> [2]PETSC ERROR: Petsc Release Version 3.10.2, unknown > [2]PETSC ERROR: ./htm3d on a arch-linux2-c-opt named fn607018 by LIJ Wed Nov 21 16:30:35 2018 > [2]PETSC ERROR: Configure options --download-fblaslapack --with-mpi-dir=/opt/mpitch-3.2.1 -download-exodusii --download-hdf5 --download-netcdf --download-zlib --download-pnetcdf > [2]PETSC ERROR: #1 PetscPartitionerCreate() line 601 in /home/lij/packages/petsc/src/dm/impls/plex/plexpartition.c > [3]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [3]PETSC ERROR: Null argument, when expecting valid pointer > [3]PETSC ERROR: Null Pointer: Parameter # 2 > [3]PETSC ERROR: See https://urldefense.proofpoint.com/v2/url?u=http-3A__www.mcs.anl.gov_petsc_documentation_faq.html&d=DwIGaQ&c=54IZrppPQZKX9mLzcGdPfFD1hxrcB__aEkJFOKJFd00&r=5MMpjBrVOPpVGfIH9op1r4nz1k4YC8LDRnpo_HwMgZU&m=1lomdbavAvxQQpe-IZtEv3xEovYeZ9lxbOzN-sE8CUQ&s=BYCTTDqEflIdLwRAkF4txknqLg0jeyOcdodQkfHj-TA&e= for trouble shooting. > [3]PETSC ERROR: Petsc Release Version 3.10.2, unknown > [3]PETSC ERROR: ./htm3d on a arch-linux2-c-opt named fn607018 by LIJ Wed Nov 21 16:30:35 2018 > [3]PETSC ERROR: Configure options --download-fblaslapack --with-mpi-dir=/opt/mpitch-3.2.1 -download-exodusii --download-hdf5 --download-netcdf --download-zlib --download-pnetcdf > [3]PETSC ERROR: #1 PetscPartitionerCreate() line 601 in /home/lij/packages/petsc/src/dm/impls/plex/plexpartition.c > > Is there anyone who may encounter similar problem before? Any suggestions or commends are highly appreciated. Thank you very much. > > Best, > > Jiaoyan -------------- next part -------------- An HTML attachment was scrubbed... URL: dr. ir. Christiaan Klaij | Senior Researcher | Research & Development MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl MARIN news: http://www.marin.nl/web/News/News-items/Maritime-with-a-Mission-TKI-Event-op-het-SS-Rotterdam-30-november.htm From tempohoper at gmail.com Tue Nov 27 03:29:44 2018 From: tempohoper at gmail.com (Sal Am) Date: Tue, 27 Nov 2018 09:29:44 +0000 Subject: [petsc-users] Solving complex linear sparse matrix in parallel + external library In-Reply-To: References: Message-ID: > > This can happen if you use an 'mpiexec' which is from a different MPI than > the one you compiled PETSc with. > that is odd, I tried removing the --download-mpich from the config and tried --with-mpi=1 (which should be default anyways) and retried it with --with-mpich=1. Current reconfig file: '--download-mpich', '--download-mumps', '--download-scalapack', '--download-superlu_dist', '--with-cc=gcc', '--with-clanguage=cxx', '--with-cxx=g++', '--with-debugging=no', '--with-fc=gfortran', '--with-mpi=1', '--with-mpich=1', '--with-scalar-type=complex', 'PETSC_ARCH=linux-opt' Still does not work though, I tried executing ex11 in ksp/ksp/example/tutorial/ex11 which should solve a linear system in parallel by running mpiexec -n 2 but it prints out Mat Object: 1 MPI processes ... .. twice. What am I missing? Why would you want a minimum iterations? Why not set a tolerance and a max? > What would you achieve with a minimum? > I thought PETSc might not be iterating far enough, but after having had a look at KSPSetTolerances it makes more sense. 
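So rather than a minimum, I can bound the solve with something like this before the KSPSolve call (a sketch; the rtol and maxits values are purely illustrative, not a recommendation):

    /* args: ksp, rtol, abstol, dtol, max iterations */
    ierr = KSPSetTolerances(ksp,1.e-7,PETSC_DEFAULT,PETSC_DEFAULT,500);CHKERRQ(ierr);

or equivalently -ksp_rtol 1.0e-7 -ksp_max_it 500 on the command line.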
Instead of 'preonly', where you do not want a residual, use 'richardson' or > 'gmres' with a max of 1 iterate (-ksp_max_it 1) > So I tried that by executing: mpiexec -n 4 ./test -ksp_type richardson -pc_type lu -pc_factor_mat_solver_type mumps -ksp_max_it 1 on the command line. However, nothing prints out on the terminal (aside from PetscPrintf if I have them enabled). Thank you. On Fri, Nov 16, 2018 at 12:00 PM Matthew Knepley wrote: > On Fri, Nov 16, 2018 at 4:23 AM Sal Am via petsc-users < > petsc-users at mcs.anl.gov> wrote: > >> Hi, >> >> I have a few questions: >> >> 1. The following issue/misunderstanding: >> My code reads in two files one PETSc vector and one PETSc matrix (b and A >> from Ax=b, size ~65000x65000). >> and then calls KSP solver to solve it by running the following in the >> terminal: >> >> mpiexec - n 2 ./SolveSys -ksp_type preonly -pc_type lu >> -pc_factor_mat_solver mumps >> >> Now mumps is supposed to work in parallel and complex, but the code is >> not solved in parallel it seems. It just prints the result twice. Adding >> -log_view gives me >> >> "./SolveSys on a linux-opt named F8434 with 1 processor..." printed twice. >> > > This can happen if you use an 'mpiexec' which is from a different MPI than > the one you compiled PETSc with. > > >> 2. Using iterative solvers, I am having difficulty getting convergence. I >> found that there is a way to set the maximum number of iterations, but is >> there a minimum I can increase? >> > > Why would you want a minimum iterations? Why not set a tolerance and a > max? What would you achieve with a minimum? > > >> 3. The residual is not computed when using direct external solvers. What >> is proper PETSc way of doing this? >> > > Instead of 'preonly', where you do not want a residual, use 'richardson' > or 'gmres' with a max of 1 iterate (-ksp_max_it 1) > > Thanks, > > Matt > > >> -----------------------------------------------------The >> code--------------------------------------- >> #include >> #include >> int main(int argc,char **args) >> { >> Vec x,b; /* approx solution, RHS */ >> Mat A; /* linear system matrix */ >> KSP ksp; /* linear solver context */ >> PetscReal norm; /* norm of solution error */ >> PC pc; >> PetscMPIInt rank, size; >> PetscViewer viewer; >> PetscInt its, i; >> PetscErrorCode ierr; >> PetscScalar *xa; >> PetscBool flg = PETSC_FALSE; >> >> ierr = PetscInitialize(&argc,&args,(char*)0,help);if (ierr) return ierr; >> MPI_Comm_rank(PETSC_COMM_WORLD,&rank); >> MPI_Comm_size(PETSC_COMM_WORLD,&size); >> >> #if !defined(PETSC_USE_COMPLEX) >> SETERRQ(PETSC_COMM_WORLD,1,"This example requires complex numbers"); >> #endif >> /* >> - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >> Compute the matrix and right-hand-side vector that define >> the linear system, Ax = b. 
>> - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >> */ >> ierr = PetscPrintf(PETSC_COMM_WORLD,"reading vector in binary from >> Vector_b.dat ...\n");CHKERRQ(ierr); >> ierr = >> PetscViewerBinaryOpen(PETSC_COMM_WORLD,"../../python/petscpy/Vector_b.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr); >> ierr = VecCreate(PETSC_COMM_WORLD, &b);CHKERRQ(ierr); >> ierr = VecLoad(b,viewer); CHKERRQ(ierr); >> >> ierr = PetscPrintf(PETSC_COMM_WORLD,"reading matrix in binary from >> Matrix_A.dat ...\n");CHKERRQ(ierr); >> ierr = >> PetscViewerBinaryOpen(PETSC_COMM_WORLD,"../../python/petscpy/Matrix_A.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr); >> ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr); >> ierr = MatLoad(A,viewer);CHKERRQ(ierr); >> ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr); >> >> ierr = VecDuplicate(b,&x);CHKERRQ(ierr); >> >> /* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >> Create the linear solver and set various options >> - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >> */ >> PetscPrintf(PETSC_COMM_WORLD, "Creating KSP\n"); >> ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr); >> >> >> PetscPrintf(PETSC_COMM_WORLD, "KSP Operators\n"); >> ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr); >> >> ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr); >> ierr = PCSetType(pc, PCLU);CHKERRQ(ierr); >> ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr); >> >> /* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >> Solve the linear system >> - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >> */ >> ierr = KSPSetUp(ksp);CHKERRQ(ierr); >> ierr = KSPSetUpOnBlocks(ksp);CHKERRQ(ierr); >> ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr); >> PetscPrintf(PETSC_COMM_WORLD, "Solved"); >> >> /* >> Free work space. All PETSc objects should be destroyed when they >> are no longer needed. >> */ >> ierr = KSPDestroy(&ksp);CHKERRQ(ierr); >> ierr = VecDestroy(&x);CHKERRQ(ierr); >> ierr = VecDestroy(&b);CHKERRQ(ierr); >> ierr = MatDestroy(&A);CHKERRQ(ierr); >> ierr = PetscFinalize(); >> return ierr; >> } >> >> >> Kind regards, >> Sal >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Nov 27 04:53:45 2018 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 27 Nov 2018 05:53:45 -0500 Subject: [petsc-users] Solving complex linear sparse matrix in parallel + external library In-Reply-To: References: Message-ID: On Tue, Nov 27, 2018 at 4:29 AM Sal Am wrote: > This can happen if you use an 'mpiexec' which is from a different MPI than >> the one you compiled PETSc with. >> > > that is odd, I tried removing the --download-mpich from the config and > tried --with-mpi=1 (which should be default anyways) and retried it with > --with-mpich=1. > Your MPI is still broken. Some default MPIs, like those installed with Apple, are broken. If you are on Apple I would recommend using --download-mpich, but make sure you use $PETSC_DIR/$PETSC_ARCH/bin/mpiexec. If you reconfigure, you either have to specify a different --PETSC_ARCH, or delete the $PETSC_DIR/$PETSC_ARCH completely. Otherwise, you can get bad interaction with the libraries already sitting there. 
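Concretely, the sequence would look something like this (using the PETSC_ARCH from your configure and an abbreviated flag list; adapt the paths to your setup):

  cd $PETSC_DIR
  rm -rf linux-opt                # wipe the stale build tree, or pick a fresh --PETSC_ARCH
  ./configure PETSC_ARCH=linux-opt --with-scalar-type=complex --download-mpich --download-mumps --download-scalapack
  make all
  $PETSC_DIR/linux-opt/bin/mpiexec -n 2 ./SolveSys -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type mumps

The last line is the important part: always launch with the mpiexec that the PETSc build installed, not whatever 'mpiexec' happens to be first in your PATH.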
Thanks, Matt > Current reconfig file: > '--download-mpich', > '--download-mumps', > '--download-scalapack', > '--download-superlu_dist', > '--with-cc=gcc', > '--with-clanguage=cxx', > '--with-cxx=g++', > '--with-debugging=no', > '--with-fc=gfortran', > '--with-mpi=1', > '--with-mpich=1', > '--with-scalar-type=complex', > 'PETSC_ARCH=linux-opt' > Still does not work though, I tried executing ex11 in > ksp/ksp/example/tutorial/ex11 which should solve a linear system in > parallel by running mpiexec -n 2 but it prints out > Mat Object: 1 MPI processes > ... > .. > twice. > What am I missing? > > Why would you want a minimum iterations? Why not set a tolerance and a >> max? What would you achieve with a minimum? >> > > I thought PETSc might not be iterating far enough, but after having had a > look at KSPSetTolerances it makes more sense. > > Instead of 'preonly', where you do not want a residual, use 'richardson' >> or 'gmres' with a max of 1 iterate (-ksp_max_it 1) >> > > So I tried that by executing: mpiexec -n 4 ./test -ksp_type richardson > -pc_type lu -pc_factor_mat_solver_type mumps -ksp_max_it 1 > on the command line. However, nothing prints out on the terminal (aside > from PetscPrintf if I have them enabled). > > Thank you. > > On Fri, Nov 16, 2018 at 12:00 PM Matthew Knepley > wrote: > >> On Fri, Nov 16, 2018 at 4:23 AM Sal Am via petsc-users < >> petsc-users at mcs.anl.gov> wrote: >> >>> Hi, >>> >>> I have a few questions: >>> >>> 1. The following issue/misunderstanding: >>> My code reads in two files one PETSc vector and one PETSc matrix (b and >>> A from Ax=b, size ~65000x65000). >>> and then calls KSP solver to solve it by running the following in the >>> terminal: >>> >>> mpiexec - n 2 ./SolveSys -ksp_type preonly -pc_type lu >>> -pc_factor_mat_solver mumps >>> >>> Now mumps is supposed to work in parallel and complex, but the code is >>> not solved in parallel it seems. It just prints the result twice. Adding >>> -log_view gives me >>> >>> "./SolveSys on a linux-opt named F8434 with 1 processor..." printed >>> twice. >>> >> >> This can happen if you use an 'mpiexec' which is from a different MPI >> than the one you compiled PETSc with. >> >> >>> 2. Using iterative solvers, I am having difficulty getting convergence. >>> I found that there is a way to set the maximum number of iterations, but is >>> there a minimum I can increase? >>> >> >> Why would you want a minimum iterations? Why not set a tolerance and a >> max? What would you achieve with a minimum? >> >> >>> 3. The residual is not computed when using direct external solvers. What >>> is proper PETSc way of doing this? 
>>> >> >> Instead of 'preonly', where you do not want a residual, use 'richardson' >> or 'gmres' with a max of 1 iterate (-ksp_max_it 1) >> >> Thanks, >> >> Matt >> >> >>> -----------------------------------------------------The >>> code--------------------------------------- >>> #include >>> #include >>> int main(int argc,char **args) >>> { >>> Vec x,b; /* approx solution, RHS */ >>> Mat A; /* linear system matrix */ >>> KSP ksp; /* linear solver context */ >>> PetscReal norm; /* norm of solution error */ >>> PC pc; >>> PetscMPIInt rank, size; >>> PetscViewer viewer; >>> PetscInt its, i; >>> PetscErrorCode ierr; >>> PetscScalar *xa; >>> PetscBool flg = PETSC_FALSE; >>> >>> ierr = PetscInitialize(&argc,&args,(char*)0,help);if (ierr) return >>> ierr; >>> MPI_Comm_rank(PETSC_COMM_WORLD,&rank); >>> MPI_Comm_size(PETSC_COMM_WORLD,&size); >>> >>> #if !defined(PETSC_USE_COMPLEX) >>> SETERRQ(PETSC_COMM_WORLD,1,"This example requires complex numbers"); >>> #endif >>> /* >>> - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >>> Compute the matrix and right-hand-side vector that define >>> the linear system, Ax = b. >>> - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >>> */ >>> ierr = PetscPrintf(PETSC_COMM_WORLD,"reading vector in binary from >>> Vector_b.dat ...\n");CHKERRQ(ierr); >>> ierr = >>> PetscViewerBinaryOpen(PETSC_COMM_WORLD,"../../python/petscpy/Vector_b.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr); >>> ierr = VecCreate(PETSC_COMM_WORLD, &b);CHKERRQ(ierr); >>> ierr = VecLoad(b,viewer); CHKERRQ(ierr); >>> >>> ierr = PetscPrintf(PETSC_COMM_WORLD,"reading matrix in binary from >>> Matrix_A.dat ...\n");CHKERRQ(ierr); >>> ierr = >>> PetscViewerBinaryOpen(PETSC_COMM_WORLD,"../../python/petscpy/Matrix_A.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr); >>> ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr); >>> ierr = MatLoad(A,viewer);CHKERRQ(ierr); >>> ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr); >>> >>> ierr = VecDuplicate(b,&x);CHKERRQ(ierr); >>> >>> /* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >>> Create the linear solver and set various options >>> - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >>> */ >>> PetscPrintf(PETSC_COMM_WORLD, "Creating KSP\n"); >>> ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr); >>> >>> >>> PetscPrintf(PETSC_COMM_WORLD, "KSP Operators\n"); >>> ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr); >>> >>> ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr); >>> ierr = PCSetType(pc, PCLU);CHKERRQ(ierr); >>> ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr); >>> >>> /* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >>> Solve the linear system >>> - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >>> */ >>> ierr = KSPSetUp(ksp);CHKERRQ(ierr); >>> ierr = KSPSetUpOnBlocks(ksp);CHKERRQ(ierr); >>> ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr); >>> PetscPrintf(PETSC_COMM_WORLD, "Solved"); >>> >>> /* >>> Free work space. All PETSc objects should be destroyed when they >>> are no longer needed. >>> */ >>> ierr = KSPDestroy(&ksp);CHKERRQ(ierr); >>> ierr = VecDestroy(&x);CHKERRQ(ierr); >>> ierr = VecDestroy(&b);CHKERRQ(ierr); >>> ierr = MatDestroy(&A);CHKERRQ(ierr); >>> ierr = PetscFinalize(); >>> return ierr; >>> } >>> >>> >>> Kind regards, >>> Sal >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. 
>> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From edoardo.alinovi at gmail.com Tue Nov 27 05:23:48 2018 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Tue, 27 Nov 2018 12:23:48 +0100 Subject: [petsc-users] Compile petsc using intel mpi Message-ID: Dear users, I have installed intel parallel studio on my workstation and thus I would like to take advantage of intel compiler. Before messing up my installation, have you got some guidelines to survive at this attempt? I have found here in the mailing list the following instructions: --with-cc=icc --with-fc=ifort --with-mpi-include=/path-to-intel --with-mpi-lib=/path-to-intel Are they correct? Also I have an already existing and clean installation of petsc using openmpi. I would like to retain this installtion since it is working very well and switching between the two somehow. Any tips on this? I will never stop to say thank you for your precious support! Edoardo ------ Edoardo Alinovi, Ph.D. DICCA, Scuola Politecnica, Universita' degli Studi di Genova, 1, via Montallegro, 16145 Genova, Italy -------------- next part -------------- An HTML attachment was scrubbed... URL: From mailinglists at xgm.de Tue Nov 27 09:16:43 2018 From: mailinglists at xgm.de (Florian Lindner) Date: Tue, 27 Nov 2018 16:16:43 +0100 Subject: [petsc-users] IS Invert Not Permutation Message-ID: <63cd6b99-5b2a-1b61-a006-d9b96bde9830@xgm.de> Hello, I have a range of local input data indices that I want to use for row indexing, say { 3, 4, 6}. For that, I create a matrix with a local number of rows of 3 and map the indices {3, 4, 5} to these rows. I create an index set: ISCreateGeneral(comm, myIndizes.size(), myIndizes.data(), PETSC_COPY_VALUES, &ISlocal); ISSetPermutation(ISlocal); ISInvertPermutation(ISlocal, myIndizes.size(), &ISlocalInv); ISAllGather(ISlocalInv, &ISglobal); // Gather the IS from all processors ISLocalToGlobalMappingCreateIS(ISglobal, &ISmapping); // Make it a mapping MatSetLocalToGlobalMapping(matrix, ISmapping, ISmapping); // Set mapping for rows and cols The InvertPermutation is required because, well, it seems that the direction the mapping stuff works in PETSc. Now I can use MatSetValues local to set rows {3, 4, 5} and actually set the local rows {1, 2, 3}. The problem is that the range of data vertices is not always contiguous, i.e. could be {3, 4, 6}. This seems to be not a permutation of PETSc, therefore ISSetPermutation fails and subsequently ISInvertPermutation. How can I still create such a mapping, so that: 3 -> 1 4 -> 2 6 -> 6 row 5 is unassigned and will never be addressed. Thanks! Florian From natacha.bereux at gmail.com Tue Nov 27 09:28:31 2018 From: natacha.bereux at gmail.com (Natacha BEREUX) Date: Tue, 27 Nov 2018 16:28:31 +0100 Subject: [petsc-users] Fortran interface of some petsc routines seem to be missing Message-ID: Hello, I work on a Fortran software that uses PETSc for linear solvers. Therefore, we have a PETSc interface to convert our matrices to PETSc Mat. I have noticed several compiler warnings ( I use gfortran with -Wimplicit-interface) during compilation. The warnings point out that some (but not all) fortran interfaces are missing. The behaviour is the same when I compile a PETSC example. 
Below is src/vec/vec/examples/tutorials/ex9f.F compiled with gfortran. mpif90 -c -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -Wimplicit-interface -g -O -I/home/H03755/Librairies/petsc-3.10.1/include -I/home/H03755/Librairies/petsc-3.10.1/linux-opt-mumps-ml-hypre-superlu/include -I/home/H03755/dev/codeaster-prerequisites/v14/prerequisites/Mumps-512_consortium_aster3/MPI/include -I/home/H03755/local/petsc/petsc-3.10.1/include -o ex9f.o ex9f.F90 ex9f.F90:34.53: call PetscInitialize(PETSC_NULL_CHARACTER,ierr) 1 Warning: Procedure 'petscinitialize' called with an implicit interface at (1) ex9f.F90:42.91: if (size .ne. 2) then; call PetscError(PETSC_COMM_WORLD,1,0,'Requires 2 processors'); call MPIU_Abort(PETSC_COMM_WORLD,1); endif 1 Warning: Procedure 'petscerror' called with an implicit interface at (1) ex9f.F90:42.128: if (size .ne. 2) then; call PetscError(PETSC_COMM_WORLD,1,0,'Requires 2 processors'); call MPIU_Abort(PETSC_COMM_WORLD,1); endif 1 Warning: Procedure 'mpiu_abort' called with an implicit interface at (1) ex9f.F90:81.56: & PETSC_DECIDE,nghost,ifrom,tarray,gxs,ierr) 1 Warning: Procedure 'veccreateghostwitharray' called with an implicit interface at (1) ex9f.F90:99.53: call VecGetOwnershipRange(gx,rstart,rend,ierr) 1 Warning: Procedure 'vecgetownershiprange' called with an implicit interface at (1) ex9f.F90:115.93: call PetscViewerGetSubViewer(PETSC_VIEWER_STDOUT_WORLD,PETSC_COMM_SELF,subviewer,ierr) 1 Warning: Procedure 'petscviewergetsubviewer' called with an implicit interface at (1) ex9f.F90:117.97: call PetscViewerRestoreSubViewer(PETSC_VIEWER_STDOUT_WORLD,PETSC_COMM_SELF,subviewer,ierr) 1 Warning: Procedure 'petscviewerrestoresubviewer' called with an implicit interface at (1) ex9f.F90:121.31: call PetscFinalize(ierr) 1 Warning: Procedure 'petscfinalize' called with an implicit interface at (1) Why does the compiler complain ? Did I miss something when I compiled PETSc library ? Is there a way to properly generate all the Fortran interfaces in the compiled library ? Or is it normal that PETSc only generates some interfaces but not all ? In this case, is there a way to know which interfaces are explicitly and automatically defined in PETSc library ? So that I can provide manually the missing ones in my code ? Thanks a lot for your help ! Best regards, Natacha -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Nov 27 09:33:56 2018 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 27 Nov 2018 10:33:56 -0500 Subject: [petsc-users] IS Invert Not Permutation In-Reply-To: <63cd6b99-5b2a-1b61-a006-d9b96bde9830@xgm.de> References: <63cd6b99-5b2a-1b61-a006-d9b96bde9830@xgm.de> Message-ID: On Tue, Nov 27, 2018 at 10:17 AM Florian Lindner via petsc-users < petsc-users at mcs.anl.gov> wrote: > Hello, > > I have a range of local input data indices that I want to use for row > indexing, say { 3, 4, 6}. > > For that, I create a matrix with a local number of rows of 3 and map the > indices {3, 4, 5} to these rows. > It seems like you are trying to create a mapping and its inverse. We do that in https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/AO/AOCreateMapping.html Note that this is not really scalable. Usually there is a way to restructure the code to avoid this. 
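For your {3, 4, 6} example, a minimal sketch would be (0-based PETSc rows assumed; error checking omitted):

  AO       ao;
  PetscInt app[]   = {3, 4, 6};  /* your data indices */
  PetscInt petsc[] = {0, 1, 2};  /* the local rows they should map to */
  PetscInt rows[]  = {3, 4, 6};  /* translated in place below */

  AOCreateMapping(PETSC_COMM_WORLD, 3, app, petsc, &ao);
  AOApplicationToPetsc(ao, 3, rows); /* rows becomes {0, 1, 2} */
  AODestroy(&ao);

Unlike ISSetPermutation(), AOCreateMapping() does not require the application indices to form a contiguous permutation, so a hole such as the missing 5 is fine; it is simply never looked up.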
Thanks, Matt > I create an index set: > > ISCreateGeneral(comm, myIndizes.size(), myIndizes.data(), > PETSC_COPY_VALUES, &ISlocal); > ISSetPermutation(ISlocal); > ISInvertPermutation(ISlocal, myIndizes.size(), &ISlocalInv); > ISAllGather(ISlocalInv, &ISglobal); // Gather the IS from all processors > ISLocalToGlobalMappingCreateIS(ISglobal, &ISmapping); // Make it a mapping > > MatSetLocalToGlobalMapping(matrix, ISmapping, ISmapping); // Set mapping > for rows and cols > > The InvertPermutation is required because, well, it seems that the > direction the mapping stuff works in PETSc. > > Now I can use MatSetValues local to set rows {3, 4, 5} and actually set > the local rows {1, 2, 3}. > > The problem is that the range of data vertices is not always contiguous, > i.e. could be {3, 4, 6}. This seems to be not a permutation of PETSc, > therefore ISSetPermutation fails and subsequently ISInvertPermutation. > > How can I still create such a mapping, so that: > > 3 -> 1 > 4 -> 2 > 6 -> 6 > > row 5 is unassigned and will never be addressed. > > Thanks! > Florian > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Jiaoyan.Li at inl.gov Tue Nov 27 09:50:53 2018 From: Jiaoyan.Li at inl.gov (Jiaoyan Li) Date: Tue, 27 Nov 2018 15:50:53 +0000 Subject: [petsc-users] PetscPartitioner is missing for fortran Message-ID: <45FA1B16-AAA5-4A0F-927C-9786A6431F89@inl.gov> Thanks, Chris, for your kind reply. Yes, I believe the mailing list is the most convenient way for Petsc's to communicate with each other. Also, I am wondering if you may happen to use "PetscPartitioner" before? Have you ever tried to fix Petsc-Fortran problem by yourself before? I believe Petsc needs to build an interface for Fortran from the C source code. But, I don't know where Petsc does that. I am just trying to see if I can fix it by myself. Thank you. Jiaoyan ?On 11/27/18, 00:44, "petsc-users on behalf of Klaij, Christiaan via petsc-users" wrote: Hi Jiaoyan, I've been using the fortran interface since 2004. During these years I only found a handful of things missing. After an email to this list the developers are usually quite fast in fixing the problem. Chris Date: Mon, 26 Nov 2018 16:41:29 +0000 From: Jiaoyan Li To: "petsc-users at mcs.anl.gov" Subject: [petsc-users] PetscPartitioner is missing for fortran Message-ID: Content-Type: text/plain; charset="utf-8" Dear Petsc Users: I am developing a Fortran code which uses Petsc APIs. But, seems the interface between Fortran and Petsc is not completed, as replied by Barry. Is there anyone may have some experience on building the Fortran interface for Petsc? Any suggestions or comments are highly appreciated. Thank you. Have a nice day, Jiaoyan ------------------------------------------------ On 11/21/18, 19:25, "Smith, Barry F." > wrote: Matt, PetscPartitioner is missing from lib/petsc/conf/bfort-petsc.txt Barry > On Nov 21, 2018, at 3:33 PM, Jiaoyan Li via petsc-users > wrote: > > Dear Petsc users: > > I am trying to use Petsc APIs for Fortran. One problem that I am facing right now is about the PetscPartitioner, i.e., > > #include ?petsc/finclude/petsc.h? 
> use petscdmplex > > PetscPartitioner :: part > PetscErrorCode :: ierr > > Call PetscPartitionerCreate(PETSC_COMM_WORLD, part, ierr) > > But, I got the error message as follows: > > [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [0]PETSC ERROR: Null argument, when expecting valid pointer > [0]PETSC ERROR: Null Pointer: Parameter # 2 > [0]PETSC ERROR: See https://urldefense.proofpoint.com/v2/url?u=http-3A__www.mcs.anl.gov_petsc_documentation_faq.html&d=DwIGaQ&c=54IZrppPQZKX9mLzcGdPfFD1hxrcB__aEkJFOKJFd00&r=5MMpjBrVOPpVGfIH9op1r4nz1k4YC8LDRnpo_HwMgZU&m=1lomdbavAvxQQpe-IZtEv3xEovYeZ9lxbOzN-sE8CUQ&s=BYCTTDqEflIdLwRAkF4txknqLg0jeyOcdodQkfHj-TA&e= for trouble shooting. > [0]PETSC ERROR: Petsc Release Version 3.10.2, unknown > [0]PETSC ERROR: ./htm3d on a arch-linux2-c-opt named fn607018 by LIJ Wed Nov 21 16:30:35 2018 > [0]PETSC ERROR: Configure options --download-fblaslapack --with-mpi-dir=/opt/mpitch-3.2.1 -download-exodusii --download-hdf5 --download-netcdf --download-zlib --download-pnetcdf > [0]PETSC ERROR: #1 PetscPartitionerCreate() line 601 in /home/lij/packages/petsc/src/dm/impls/plex/plexpartition.c > [1]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [1]PETSC ERROR: Null argument, when expecting valid pointer > [1]PETSC ERROR: Null Pointer: Parameter # 2 > [1]PETSC ERROR: See https://urldefense.proofpoint.com/v2/url?u=http-3A__www.mcs.anl.gov_petsc_documentation_faq.html&d=DwIGaQ&c=54IZrppPQZKX9mLzcGdPfFD1hxrcB__aEkJFOKJFd00&r=5MMpjBrVOPpVGfIH9op1r4nz1k4YC8LDRnpo_HwMgZU&m=1lomdbavAvxQQpe-IZtEv3xEovYeZ9lxbOzN-sE8CUQ&s=BYCTTDqEflIdLwRAkF4txknqLg0jeyOcdodQkfHj-TA&e= for trouble shooting. > [1]PETSC ERROR: Petsc Release Version 3.10.2, unknown > [1]PETSC ERROR: ./htm3d on a arch-linux2-c-opt named fn607018 by LIJ Wed Nov 21 16:30:35 2018 > [1]PETSC ERROR: Configure options --download-fblaslapack --with-mpi-dir=/opt/mpitch-3.2.1 -download-exodusii --download-hdf5 --download-netcdf --download-zlib --download-pnetcdf > [1]PETSC ERROR: #1 PetscPartitionerCreate() line 601 in /home/lij/packages/petsc/src/dm/impls/plex/plexpartition.c > [2]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [2]PETSC ERROR: Null argument, when expecting valid pointer > [2]PETSC ERROR: Null Pointer: Parameter # 2 > [2]PETSC ERROR: See https://urldefense.proofpoint.com/v2/url?u=http-3A__www.mcs.anl.gov_petsc_documentation_faq.html&d=DwIGaQ&c=54IZrppPQZKX9mLzcGdPfFD1hxrcB__aEkJFOKJFd00&r=5MMpjBrVOPpVGfIH9op1r4nz1k4YC8LDRnpo_HwMgZU&m=1lomdbavAvxQQpe-IZtEv3xEovYeZ9lxbOzN-sE8CUQ&s=BYCTTDqEflIdLwRAkF4txknqLg0jeyOcdodQkfHj-TA&e= for trouble shooting. 
> [2]PETSC ERROR: Petsc Release Version 3.10.2, unknown > [2]PETSC ERROR: ./htm3d on a arch-linux2-c-opt named fn607018 by LIJ Wed Nov 21 16:30:35 2018 > [2]PETSC ERROR: Configure options --download-fblaslapack --with-mpi-dir=/opt/mpitch-3.2.1 -download-exodusii --download-hdf5 --download-netcdf --download-zlib --download-pnetcdf > [2]PETSC ERROR: #1 PetscPartitionerCreate() line 601 in /home/lij/packages/petsc/src/dm/impls/plex/plexpartition.c > [3]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [3]PETSC ERROR: Null argument, when expecting valid pointer > [3]PETSC ERROR: Null Pointer: Parameter # 2 > [3]PETSC ERROR: See https://urldefense.proofpoint.com/v2/url?u=http-3A__www.mcs.anl.gov_petsc_documentation_faq.html&d=DwIGaQ&c=54IZrppPQZKX9mLzcGdPfFD1hxrcB__aEkJFOKJFd00&r=5MMpjBrVOPpVGfIH9op1r4nz1k4YC8LDRnpo_HwMgZU&m=1lomdbavAvxQQpe-IZtEv3xEovYeZ9lxbOzN-sE8CUQ&s=BYCTTDqEflIdLwRAkF4txknqLg0jeyOcdodQkfHj-TA&e= for trouble shooting. > [3]PETSC ERROR: Petsc Release Version 3.10.2, unknown > [3]PETSC ERROR: ./htm3d on a arch-linux2-c-opt named fn607018 by LIJ Wed Nov 21 16:30:35 2018 > [3]PETSC ERROR: Configure options --download-fblaslapack --with-mpi-dir=/opt/mpitch-3.2.1 -download-exodusii --download-hdf5 --download-netcdf --download-zlib --download-pnetcdf > [3]PETSC ERROR: #1 PetscPartitionerCreate() line 601 in /home/lij/packages/petsc/src/dm/impls/plex/plexpartition.c > > Is there anyone who may encounter similar problem before? Any suggestions or commends are highly appreciated. Thank you very much. > > Best, > > Jiaoyan -------------- next part -------------- An HTML attachment was scrubbed... URL: dr. ir. Christiaan Klaij | Senior Researcher | Research & Development MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | https://urldefense.proofpoint.com/v2/url?u=http-3A__www.marin.nl&d=DwIGaQ&c=54IZrppPQZKX9mLzcGdPfFD1hxrcB__aEkJFOKJFd00&r=5MMpjBrVOPpVGfIH9op1r4nz1k4YC8LDRnpo_HwMgZU&m=uadiAPfKAyRcJODTiOtF9X7aTXnk7oBkmdranjsZy8o&s=X4nF6ffp6hFk5qLA9Q_KN8PjMOU5-rQMhKxQ4j5gIVo&e= MARIN news: https://urldefense.proofpoint.com/v2/url?u=http-3A__www.marin.nl_web_News_News-2Ditems_Maritime-2Dwith-2Da-2DMission-2DTKI-2DEvent-2Dop-2Dhet-2DSS-2DRotterdam-2D30-2Dnovember.htm&d=DwIGaQ&c=54IZrppPQZKX9mLzcGdPfFD1hxrcB__aEkJFOKJFd00&r=5MMpjBrVOPpVGfIH9op1r4nz1k4YC8LDRnpo_HwMgZU&m=uadiAPfKAyRcJODTiOtF9X7aTXnk7oBkmdranjsZy8o&s=2K4w9VlCutEzg1zz_NVlsKwk8ECfHknpdmXzF4hyBfo&e= From C.Klaij at marin.nl Tue Nov 27 09:59:01 2018 From: C.Klaij at marin.nl (Klaij, Christiaan) Date: Tue, 27 Nov 2018 15:59:01 +0000 Subject: [petsc-users] PetscPartitioner is missing for fortran In-Reply-To: <45FA1B16-AAA5-4A0F-927C-9786A6431F89@inl.gov> References: <45FA1B16-AAA5-4A0F-927C-9786A6431F89@inl.gov> Message-ID: <1543334341162.93707@marin.nl> Personally, I never try to fix these things by myself, it's the job of the petsc developers, they know best and can make the fix available for all users. As a user, I just give feedback whenever I encounter a problem (which isn't often). Chris dr. ir. 
Christiaan Klaij | Senior Researcher | Research & Development MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl MARIN news: http://www.marin.nl/web/News/News-items/DNVGL-certification-for-DOLPHIN-simulator-software.htm ________________________________________ From: Jiaoyan Li Sent: Tuesday, November 27, 2018 4:50 PM To: Klaij, Christiaan; petsc-users at mcs.anl.gov Subject: Re: [petsc-users] PetscPartitioner is missing for fortran Thanks, Chris, for your kind reply. Yes, I believe the mailing list is the most convenient way for Petsc's to communicate with each other. Also, I am wondering if you may happen to use "PetscPartitioner" before? Have you ever tried to fix Petsc-Fortran problem by yourself before? I believe Petsc needs to build an interface for Fortran from the C source code. But, I don't know where Petsc does that. I am just trying to see if I can fix it by myself. Thank you. Jiaoyan ?On 11/27/18, 00:44, "petsc-users on behalf of Klaij, Christiaan via petsc-users" wrote: Hi Jiaoyan, I've been using the fortran interface since 2004. During these years I only found a handful of things missing. After an email to this list the developers are usually quite fast in fixing the problem. Chris Date: Mon, 26 Nov 2018 16:41:29 +0000 From: Jiaoyan Li To: "petsc-users at mcs.anl.gov" Subject: [petsc-users] PetscPartitioner is missing for fortran Message-ID: Content-Type: text/plain; charset="utf-8" Dear Petsc Users: I am developing a Fortran code which uses Petsc APIs. But, seems the interface between Fortran and Petsc is not completed, as replied by Barry. Is there anyone may have some experience on building the Fortran interface for Petsc? Any suggestions or comments are highly appreciated. Thank you. Have a nice day, Jiaoyan ------------------------------------------------ On 11/21/18, 19:25, "Smith, Barry F." > wrote: Matt, PetscPartitioner is missing from lib/petsc/conf/bfort-petsc.txt Barry > On Nov 21, 2018, at 3:33 PM, Jiaoyan Li via petsc-users > wrote: > > Dear Petsc users: > > I am trying to use Petsc APIs for Fortran. One problem that I am facing right now is about the PetscPartitioner, i.e., > > #include ?petsc/finclude/petsc.h? > use petscdmplex > > PetscPartitioner :: part > PetscErrorCode :: ierr > > Call PetscPartitionerCreate(PETSC_COMM_WORLD, part, ierr) > > But, I got the error message as follows: > > [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [0]PETSC ERROR: Null argument, when expecting valid pointer > [0]PETSC ERROR: Null Pointer: Parameter # 2 > [0]PETSC ERROR: See https://urldefense.proofpoint.com/v2/url?u=http-3A__www.mcs.anl.gov_petsc_documentation_faq.html&d=DwIGaQ&c=54IZrppPQZKX9mLzcGdPfFD1hxrcB__aEkJFOKJFd00&r=5MMpjBrVOPpVGfIH9op1r4nz1k4YC8LDRnpo_HwMgZU&m=1lomdbavAvxQQpe-IZtEv3xEovYeZ9lxbOzN-sE8CUQ&s=BYCTTDqEflIdLwRAkF4txknqLg0jeyOcdodQkfHj-TA&e= for trouble shooting. 
> [0]PETSC ERROR: Petsc Release Version 3.10.2, unknown > [0]PETSC ERROR: ./htm3d on a arch-linux2-c-opt named fn607018 by LIJ Wed Nov 21 16:30:35 2018 > [0]PETSC ERROR: Configure options --download-fblaslapack --with-mpi-dir=/opt/mpitch-3.2.1 -download-exodusii --download-hdf5 --download-netcdf --download-zlib --download-pnetcdf > [0]PETSC ERROR: #1 PetscPartitionerCreate() line 601 in /home/lij/packages/petsc/src/dm/impls/plex/plexpartition.c > [1]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [1]PETSC ERROR: Null argument, when expecting valid pointer > [1]PETSC ERROR: Null Pointer: Parameter # 2 > [1]PETSC ERROR: See https://urldefense.proofpoint.com/v2/url?u=http-3A__www.mcs.anl.gov_petsc_documentation_faq.html&d=DwIGaQ&c=54IZrppPQZKX9mLzcGdPfFD1hxrcB__aEkJFOKJFd00&r=5MMpjBrVOPpVGfIH9op1r4nz1k4YC8LDRnpo_HwMgZU&m=1lomdbavAvxQQpe-IZtEv3xEovYeZ9lxbOzN-sE8CUQ&s=BYCTTDqEflIdLwRAkF4txknqLg0jeyOcdodQkfHj-TA&e= for trouble shooting. > [1]PETSC ERROR: Petsc Release Version 3.10.2, unknown > [1]PETSC ERROR: ./htm3d on a arch-linux2-c-opt named fn607018 by LIJ Wed Nov 21 16:30:35 2018 > [1]PETSC ERROR: Configure options --download-fblaslapack --with-mpi-dir=/opt/mpitch-3.2.1 -download-exodusii --download-hdf5 --download-netcdf --download-zlib --download-pnetcdf > [1]PETSC ERROR: #1 PetscPartitionerCreate() line 601 in /home/lij/packages/petsc/src/dm/impls/plex/plexpartition.c > [2]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [2]PETSC ERROR: Null argument, when expecting valid pointer > [2]PETSC ERROR: Null Pointer: Parameter # 2 > [2]PETSC ERROR: See https://urldefense.proofpoint.com/v2/url?u=http-3A__www.mcs.anl.gov_petsc_documentation_faq.html&d=DwIGaQ&c=54IZrppPQZKX9mLzcGdPfFD1hxrcB__aEkJFOKJFd00&r=5MMpjBrVOPpVGfIH9op1r4nz1k4YC8LDRnpo_HwMgZU&m=1lomdbavAvxQQpe-IZtEv3xEovYeZ9lxbOzN-sE8CUQ&s=BYCTTDqEflIdLwRAkF4txknqLg0jeyOcdodQkfHj-TA&e= for trouble shooting. > [2]PETSC ERROR: Petsc Release Version 3.10.2, unknown > [2]PETSC ERROR: ./htm3d on a arch-linux2-c-opt named fn607018 by LIJ Wed Nov 21 16:30:35 2018 > [2]PETSC ERROR: Configure options --download-fblaslapack --with-mpi-dir=/opt/mpitch-3.2.1 -download-exodusii --download-hdf5 --download-netcdf --download-zlib --download-pnetcdf > [2]PETSC ERROR: #1 PetscPartitionerCreate() line 601 in /home/lij/packages/petsc/src/dm/impls/plex/plexpartition.c > [3]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [3]PETSC ERROR: Null argument, when expecting valid pointer > [3]PETSC ERROR: Null Pointer: Parameter # 2 > [3]PETSC ERROR: See https://urldefense.proofpoint.com/v2/url?u=http-3A__www.mcs.anl.gov_petsc_documentation_faq.html&d=DwIGaQ&c=54IZrppPQZKX9mLzcGdPfFD1hxrcB__aEkJFOKJFd00&r=5MMpjBrVOPpVGfIH9op1r4nz1k4YC8LDRnpo_HwMgZU&m=1lomdbavAvxQQpe-IZtEv3xEovYeZ9lxbOzN-sE8CUQ&s=BYCTTDqEflIdLwRAkF4txknqLg0jeyOcdodQkfHj-TA&e= for trouble shooting. > [3]PETSC ERROR: Petsc Release Version 3.10.2, unknown > [3]PETSC ERROR: ./htm3d on a arch-linux2-c-opt named fn607018 by LIJ Wed Nov 21 16:30:35 2018 > [3]PETSC ERROR: Configure options --download-fblaslapack --with-mpi-dir=/opt/mpitch-3.2.1 -download-exodusii --download-hdf5 --download-netcdf --download-zlib --download-pnetcdf > [3]PETSC ERROR: #1 PetscPartitionerCreate() line 601 in /home/lij/packages/petsc/src/dm/impls/plex/plexpartition.c > > Is there anyone who may encounter similar problem before? 
> Any suggestions or comments are highly appreciated. Thank you very much.
>
> Best,
>
> Jiaoyan
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

dr. ir. Christiaan Klaij | Senior Researcher | Research & Development
MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl
MARIN news: http://www.marin.nl/web/News/News-items/Maritime-with-a-Mission-TKI-Event-op-het-SS-Rotterdam-30-november.htm

From knepley at gmail.com Tue Nov 27 10:05:34 2018
From: knepley at gmail.com (Matthew Knepley)
Date: Tue, 27 Nov 2018 11:05:34 -0500
Subject: [petsc-users] PetscPartitioner is missing for fortran
In-Reply-To: <45FA1B16-AAA5-4A0F-927C-9786A6431F89@inl.gov>
References: <45FA1B16-AAA5-4A0F-927C-9786A6431F89@inl.gov>
Message-ID:

On Tue, Nov 27, 2018 at 10:51 AM Jiaoyan Li via petsc-users <petsc-users at mcs.anl.gov> wrote:

> Thanks, Chris, for your kind reply. Yes, I believe the mailing list is the
> most convenient way for PETSc users to communicate with each other.

Hi Jiaoyan,

I will fix this. It's taking me a few days because it's the end of the semester and things get a little compressed. I should have it done this week.

  Thanks,

    Matt

> Also, I am wondering if you happen to have used "PetscPartitioner" before?
> Have you ever tried to fix a PETSc-Fortran problem by yourself? I believe
> PETSc needs to build an interface for Fortran from the C source code, but I
> don't know where PETSc does that. I am just trying to see if I can fix it
> by myself. Thank you.
>
> Jiaoyan
>
> [the rest of the quoted thread repeats the messages above verbatim and is omitted]
--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From jczhang at mcs.anl.gov Tue Nov 27 10:10:22 2018
From: jczhang at mcs.anl.gov (Zhang, Junchao)
Date: Tue, 27 Nov 2018 16:10:22 +0000
Subject: [petsc-users] Compile petsc using intel mpi
In-Reply-To:
References:
Message-ID:

On Tue, Nov 27, 2018 at 5:25 AM Edoardo alinovi via petsc-users <petsc-users at mcs.anl.gov> wrote:

> Dear users,
>
> I have installed intel parallel studio on my workstation and thus I would
> like to take advantage of the intel compiler.
>
> Before messing up my installation, have you got some guidelines to survive
> this attempt? I have found here in the mailing list the following
> instructions:
>
> --with-cc=icc --with-fc=ifort --with-mpi-include=/path-to-intel --with-mpi-lib=/path-to-intel
>
> Are they correct?

I think so. But you may also add PETSC_DIR=/path-to-petsc PETSC_ARCH=name-for-this-build

> Also I have an already existing and clean installation of petsc using
> openmpi. I would like to retain this installation since it is working very
> well, and switch between the two somehow. Any tips on this?

Use PETSC_ARCH in PETSc configure to differentiate different builds, and export PETSC_ARCH in your environment to select the one you want to use. See more at https://www.mcs.anl.gov/petsc/documentation/installation.html

> I will never stop saying thank you for your precious support!
>
> Edoardo
>
> ------
>
> Edoardo Alinovi, Ph.D.
>
> DICCA, Scuola Politecnica,
> Universita' degli Studi di Genova,
> 1, via Montallegro,
> 16145 Genova, Italy
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From balay at mcs.anl.gov Tue Nov 27 10:15:33 2018
From: balay at mcs.anl.gov (Balay, Satish)
Date: Tue, 27 Nov 2018 16:15:33 +0000
Subject: [petsc-users] Compile petsc using intel mpi
In-Reply-To:
References:
Message-ID:

If you already have MPI from intel - use the MPI compiler wrappers instead of icc etc.

./configure --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort PETSC_ARCH=arch-intel-mpi

Satish

On Tue, 27 Nov 2018, Zhang, Junchao via petsc-users wrote:

> [quoted text omitted; it repeats the message above verbatim]
From knepley at gmail.com Tue Nov 27 10:16:09 2018
From: knepley at gmail.com (Matthew Knepley)
Date: Tue, 27 Nov 2018 11:16:09 -0500
Subject: [petsc-users] Compile petsc using intel mpi
In-Reply-To:
References:
Message-ID:

On Tue, Nov 27, 2018 at 6:25 AM Edoardo alinovi via petsc-users <petsc-users at mcs.anl.gov> wrote:

> Dear users,
>
> I have installed intel parallel studio on my workstation and thus I would
> like to take advantage of the intel compiler.
>
> Before messing up my installation, have you got some guidelines to survive
> this attempt? I have found here in the mailing list the following
> instructions:
>
> --with-cc=icc --with-fc=ifort --with-mpi-include=/path-to-intel
> --with-mpi-lib=/path-to-intel

Yes, this looks right.

> Are they correct?
>
> Also I have an already existing and clean installation of petsc using
> openmpi. I would like to retain this installation since it is working very
> well, and switch between the two somehow. Any tips on this?

When you do the Intel configuration, use a different name like --PETSC_ARCH=arch-intelps-opt

  Thanks,

    Matt

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
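Putting the advice from Junchao, Satish, and Matt together as a concrete sketch (the arch names and paths below are placeholders chosen for illustration, not taken from the thread):

    # one-time: configure and build a second arch with the Intel MPI compiler
    # wrappers, next to an existing OpenMPI build (say PETSC_ARCH=arch-openmpi-opt)
    cd $PETSC_DIR
    ./configure --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort \
                PETSC_ARCH=arch-intel-mpi
    make PETSC_DIR=$PETSC_DIR PETSC_ARCH=arch-intel-mpi all

    # per shell (or per application build): select which build to link against
    export PETSC_ARCH=arch-openmpi-opt   # use the OpenMPI build
    # export PETSC_ARCH=arch-intel-mpi   # ...or the Intel MPI build

Both builds live side by side under $PETSC_DIR, one subdirectory per PETSC_ARCH, so switching between them is just a matter of which PETSC_ARCH is exported.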
From Jiaoyan.Li at inl.gov Tue Nov 27 10:46:25 2018
From: Jiaoyan.Li at inl.gov (Jiaoyan Li)
Date: Tue, 27 Nov 2018 16:46:25 +0000
Subject: [petsc-users] PetscPartitioner is missing for fortran
In-Reply-To:
References: <45FA1B16-AAA5-4A0F-927C-9786A6431F89@inl.gov>
Message-ID: <62159B54-1862-4300-8FE6-2C75E9A86915@inl.gov>

Matt,

Thanks a lot for your reply, and I look forward to your further info.

Best,

Jiaoyan

From: Matthew Knepley
Date: Tuesday, November 27, 2018 at 09:06
To: Jiaoyan Li
Cc: "Klaij, Christiaan", PETSc
Subject: Re: [petsc-users] PetscPartitioner is missing for fortran

> [quoted thread omitted; it repeats the messages above verbatim, including the error output]
From bsmith at mcs.anl.gov Tue Nov 27 10:57:38 2018
From: bsmith at mcs.anl.gov (Smith, Barry F.)
Date: Tue, 27 Nov 2018 16:57:38 +0000
Subject: [petsc-users] Example 23 of ksp problems
In-Reply-To:
References:
Message-ID: <3EB7510E-0E3A-4413-8E36-C23855E140B1@anl.gov>

   This is an ad hoc way of checking that the error from the linear solve is "reasonable". Note that PETSC_MACHINE_EPSILON depends on the precision of the floating point used, so for more precise floating point (going from half precision to quad precision) we expect the error to be smaller (depending on the machine epsilon). The 1000 is an arbitrary number; it could be 100 or 10,000, and anything in between could be reasonable. Note that using a factor of 1.0 is probably not reasonable, because it is not reasonable to expect the error to be only a machine epsilon away from the "exact" answer.

   Barry

If you are not familiar with the term machine epsilon you can google it.
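For concreteness, here is the fragment from ex23.c again with comments spelling out Barry's explanation (the numeric value assumes the default build where PetscReal is double, so PETSC_MACHINE_EPSILON is about 2.22e-16 and tol is about 2.2e-13):

    /* tol = "the solution may differ from the exact one by ~1000 rounding errors" */
    tol  = 1000.*PETSC_MACHINE_EPSILON;                    /* ~2.2e-13 in double precision */
    ierr = VecAXPY(x,-1.0,u);CHKERRQ(ierr);                /* x <- x - u, the error vector */
    ierr = VecNorm(x,NORM_2,&norm);CHKERRQ(ierr);          /* norm = ||x - u||_2 */
    ierr = KSPGetIterationNumber(ksp,&its);CHKERRQ(ierr);
    if (norm > tol) {                                      /* report only when the error is suspiciously large */
      ierr = PetscPrintf(PETSC_COMM_WORLD,"Norm of error %g, Iterations %D\n",(double)norm,its);CHKERRQ(ierr);
    }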
> On Nov 26, 2018, at 9:30 PM, Fazlul Huq via petsc-users wrote:
>
> Hello PETSc developers,
>
> I went through ex23.c of the KSP section, attached herewith, but I don't understand the following part:
> ***********************************************
> tol=1000.*PETSC_MACHINE_EPSILON
> ************************************************
> and,
> ************************************************
> ierr = VecAXPY(x,-1.0,u);CHKERRQ(ierr);
> ierr = VecNorm(x,NORM_2,&norm);CHKERRQ(ierr);
> ierr = KSPGetIterationNumber(ksp,&its);CHKERRQ(ierr);
> if (norm > tol) {
>   ierr = PetscPrintf(PETSC_COMM_WORLD,"Norm of error %g, Iterations %D\n", (double)norm,its);CHKERRQ(ierr);
> }
> ************************************************
> I don't understand what "tol" is here, or what "PETSC_MACHINE_EPSILON" is.
> The if condition is also not clear to me.
>
> Thanks.
> Sincerely,
> Huq
> --
> Fazlul Huq
> Graduate Research Assistant
> Department of Nuclear, Plasma & Radiological Engineering (NPRE)
> University of Illinois at Urbana-Champaign (UIUC)
> E-mail: huq2090 at gmail.com

From sajidsyed2021 at u.northwestern.edu Tue Nov 27 15:25:32 2018
From: sajidsyed2021 at u.northwestern.edu (Sajid Ali)
Date: Tue, 27 Nov 2018 15:25:32 -0600
Subject: [petsc-users] Some clarifications about TS ex3.c
Message-ID:

Hi,

I wanted to ask a few questions about the TS example ex3.c when using the ifunc option:

190:   Mat J;

192:   RHSMatrixHeat(ts,0.0,u,A,A,&appctx);
193:   MatDuplicate(A,MAT_DO_NOT_COPY_VALUES,&J);
194:   TSSetIFunction(ts,NULL,IFunctionHeat,&appctx);
195:   TSSetIJacobian(ts,J,J,IJacobianHeat,&appctx);
196:   MatDestroy(&J);

198:   PetscObjectReference((PetscObject)A);
199:   appctx.A = A;
200:   appctx.oshift = PETSC_MIN_REAL;
201: }

1) When TSSetIJacobian is called on line 195, J is empty. Why is J deleted after calling the routine but before TSSolve is called? (Isn't J supposed to hold the Jacobian?)

2) The shift is not calculated before calling TSSetIJacobian. Does this mean that TSSetIJacobian takes care of the shift?

3) What is happening on line 198?

Thank You,
Sajid Ali
Applied Physics
Northwestern University
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From bsmith at mcs.anl.gov Tue Nov 27 15:49:49 2018
From: bsmith at mcs.anl.gov (Smith, Barry F.)
Date: Tue, 27 Nov 2018 21:49:49 +0000
Subject: [petsc-users] Fortran interface of some petsc routines seem to be missing
In-Reply-To:
References:
Message-ID:

   PETSc uses the Sowing package's bfort tool for automatically generating "Fortran stub functions" and interface definitions. There are certain C functions that bfort cannot handle, including (at least)

1) functions with character string arguments
2) functions with function pointer arguments (such as SNESSetFunction())
3) functions that support polymorphism between array and non-array arguments (for example VecSetValues(), where the ix and y arguments can be either scalars or arrays)
4) functions that can take a NULL argument.

   Functions for which bfort can automatically generate the stub and the interface definition are marked with /*@ at the beginning of the comment above the function (see for example src/ksp/ksp/interface/itcreate.c KSPCreate()). If bfort cannot work, the comment begins with /*@C (see for example src/sys/objects/pinit.c PetscInitialize()).

   It is our intention to provide manual Fortran stubs and interfaces for functions that bfort cannot handle. As you note, we do not currently provide all the interface definitions; many, even basic ones, are missing. This is because we wrote the manual stub functions many, many years ago when we wrote the C functions, but only in the past couple of years have we been providing interfaces, and no one went back and added all the required manual interface definitions.

   I did not know about the gfortran option -Wimplicit-interface; this is a great tool for finding missing interfaces. Thanks for letting us know about it.

   The manually generated stub functions are stored in files in the ftn-custom subdirectories below where the C function is defined. The filename is the C file name prepended with a z and appended with an f.
For example, snes.c becomes ftn-custom/zsnesf.c.

   The manually generated interface definitions should go into the file src/XXX/f90-mod/petscYYY.h90; for example, the interfaces for VecSetValues are in src/vec/f90-mod/petscvec.h90.

   If you are ambitious and know about pull requests, we'd be very happy to accept a pull request that provides the missing manual interface definitions. If you are less ambitious, you could send us the interface definitions that you create and we'll add them ourselves to the PETSc repository. Or eventually we'll provide more and more of the interface definitions as time permits.

   Barry
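To make the shape of such an entry concrete, below is a rough, unverified sketch of what a manual interface definition for the PetscPartitionerCreate() discussed earlier in this digest might look like in src/dm/f90-mod/petscdmplex.h90. It is modeled on the VecSetValues entries mentioned above; the exact module and type conventions of the real .h90 files may differ, so treat every name here as an assumption:

      Interface
        Subroutine PetscPartitionerCreate(comm, part, ierr)
          ! assumed: MPI_Comm, PetscPartitioner, and PetscErrorCode are the
          ! types/macros provided by petsc/finclude/petsc.h, as in the user
          ! code quoted earlier in this thread
          MPI_Comm         :: comm
          PetscPartitioner :: part
          PetscErrorCode   :: ierr
        End Subroutine
      End Interface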
> On Nov 27, 2018, at 9:28 AM, Natacha BEREUX via petsc-users wrote:
>
> Hello,
> I work on a Fortran software that uses PETSc for linear solvers. Therefore, we have a PETSc interface to convert our matrices to PETSc Mat.
> I have noticed several compiler warnings (I use gfortran with -Wimplicit-interface) during compilation. The warnings point out that some (but not all) fortran interfaces are missing.
>
> The behaviour is the same when I compile a PETSc example. Below is src/vec/vec/examples/tutorials/ex9f.F compiled with gfortran.
>
> mpif90 -c -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -Wimplicit-interface -g -O -I/home/H03755/Librairies/petsc-3.10.1/include -I/home/H03755/Librairies/petsc-3.10.1/linux-opt-mumps-ml-hypre-superlu/include -I/home/H03755/dev/codeaster-prerequisites/v14/prerequisites/Mumps-512_consortium_aster3/MPI/include -I/home/H03755/local/petsc/petsc-3.10.1/include -o ex9f.o ex9f.F90
>
> ex9f.F90:34.53:
>   call PetscInitialize(PETSC_NULL_CHARACTER,ierr)
> Warning: Procedure 'petscinitialize' called with an implicit interface at (1)
> ex9f.F90:42.91:
>   if (size .ne. 2) then; call PetscError(PETSC_COMM_WORLD,1,0,'Requires 2 processors'); call MPIU_Abort(PETSC_COMM_WORLD,1); endif
> Warning: Procedure 'petscerror' called with an implicit interface at (1)
> Warning: Procedure 'mpiu_abort' called with an implicit interface at (1)
> ex9f.F90:81.56:
>   & PETSC_DECIDE,nghost,ifrom,tarray,gxs,ierr)
> Warning: Procedure 'veccreateghostwitharray' called with an implicit interface at (1)
> ex9f.F90:99.53:
>   call VecGetOwnershipRange(gx,rstart,rend,ierr)
> Warning: Procedure 'vecgetownershiprange' called with an implicit interface at (1)
> ex9f.F90:115.93:
>   call PetscViewerGetSubViewer(PETSC_VIEWER_STDOUT_WORLD,PETSC_COMM_SELF,subviewer,ierr)
> Warning: Procedure 'petscviewergetsubviewer' called with an implicit interface at (1)
> ex9f.F90:117.97:
>   call PetscViewerRestoreSubViewer(PETSC_VIEWER_STDOUT_WORLD,PETSC_COMM_SELF,subviewer,ierr)
> Warning: Procedure 'petscviewerrestoresubviewer' called with an implicit interface at (1)
> ex9f.F90:121.31:
>   call PetscFinalize(ierr)
> Warning: Procedure 'petscfinalize' called with an implicit interface at (1)
>
> Why does the compiler complain? Did I miss something when I compiled the PETSc library? Is there a way to properly generate all the Fortran interfaces in the compiled library?
> Or is it normal that PETSc only generates some interfaces but not all?
> In this case, is there a way to know which interfaces are explicitly and automatically defined in the PETSc library, so that I can provide the missing ones manually in my code?
>
> Thanks a lot for your help!
> Best regards,
> Natacha

From bsmith at mcs.anl.gov Tue Nov 27 16:05:33 2018
From: bsmith at mcs.anl.gov (Smith, Barry F.)
Date: Tue, 27 Nov 2018 22:05:33 +0000
Subject: [petsc-users] Some clarifications about TS ex3.c
In-Reply-To:
References:
Message-ID: <37F3B68A-21BC-4CED-9394-4CD6FC8D4844@anl.gov>

   PETSc uses reference counting to track when an object is no longer needed and can be freed. Thus XXXDestroy(&xxx) may not actually destroy an object; it just decreases the reference count by 1 (and destroys the object if the reference count reaches 0).

1) TSSetIJacobian() increases the reference count of the matrix passed in. Hence the user must destroy the J matrix at some time. Yes, it is confusing to call the destroy here; usually we call it at the end of the program.

2) TS computes the shift as needed inside the particular ODE integrator being used; it is not set by the user.

3) This code is weird (not wrong, just weird). If you look at the end of the function you will see:

  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = MatDestroy(&appctx.A);CHKERRQ(ierr);

that is, the same matrix is destroyed twice. The PetscObjectReference() increases the reference count to two, hence the object needs to be destroyed twice before it is actually destroyed. Note you don't need to do this. You could skip the PetscObjectReference() in the code and remove one of the two MatDestroy() calls at the end of the code.

   Barry

> On Nov 27, 2018, at 3:25 PM, Sajid Ali via petsc-users wrote:
>
> Hi,
>
> I wanted to ask a few questions about the TS example ex3.c when using the ifunc option
>
> [code listing omitted; see the original message above]
>
> 1) When TSSetIJacobian is called on line 195, J is empty. Why is J deleted after calling the routine but before TSSolve is called? (Isn't J supposed to hold the Jacobian?)
>
> 2) The shift is not calculated before calling TSSetIJacobian. Does this mean that TSSetIJacobian takes care of the shift?
>
> 3) What is happening on line 198?
>
> Thank You,
> Sajid Ali
> Applied Physics
> Northwestern University
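To make Barry's reference-counting explanation concrete, here is a minimal, self-contained sketch (not code from ex3.c itself) showing one object reached through two handles, which therefore needs two destroys before it is actually freed:

    #include <petscmat.h>

    int main(int argc,char **argv)
    {
      Mat            A,B;
      PetscErrorCode ierr;

      ierr = PetscInitialize(&argc,&argv,NULL,NULL);if (ierr) return ierr;
      ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);        /* reference count = 1 */
      B    = A;                                                   /* second handle to the same object,
                                                                     like appctx.A = A in ex3.c */
      ierr = PetscObjectReference((PetscObject)A);CHKERRQ(ierr);  /* reference count = 2 */
      ierr = MatDestroy(&A);CHKERRQ(ierr);  /* count 2 -> 1: A is set to NULL, object still alive */
      ierr = MatDestroy(&B);CHKERRQ(ierr);  /* count 1 -> 0: the object is actually freed */
      ierr = PetscFinalize();
      return ierr;
    }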
Message-ID: https://aelmoy.com/?aelmoy aelmoy http://begilmnx.com/?begilmnx begilmnx http://bcegrtxz.ua/?bcegrtxz bcegrtxz https://bhkqs.com/?bhkqs bhkqs https://cefgpstu.ua/?cefgpstu cefgpstu http://gmntv.com/?gmntv gmntv http://afjnstuy.ua/?afjnstuy afjnstuy https://aegqwy.biz/?aegqwy aegqwy http://ioswz.biz/?ioswz ioswz https://gorvw.info/?gorvw gorvw http://cgnouwy.ua/?cgnouwy cgnouwy http://bchkpvy.info/?bchkpvy bchkpvy http://egijl.ua/?egijl egijl https://gorstuxz.net/?gorstuxz gorstuxz https://cdhjnopu.biz/?cdhjnopu cdhjnopu http://bceijoy.ru/?bceijoy bceijoy https://abfgjqw.ua/?abfgjqw abfgjqw http://bdeltw.net/?bdeltw bdeltw http://jktwz.net/?jktwz jktwz https://bgikloqy.info/?bgikloqy bgikloqy https://eimqrsuw.biz/?eimqrsuw eimqrsuw http://cfgkpu.net/?cfgkpu cfgkpu http://efmqrtv.com/?efmqrtv efmqrtv http://cegikmru.info/?cegikmru cegikmru http://hiprsv.ua/?hiprsv hiprsv https://efiorw.com/?efiorw efiorw https://ehkpy.ru/?ehkpy ehkpy http://acdjps.biz/?acdjps acdjps http://lmnopvxy.net/?lmnopvxy lmnopvxy http://abeptuy.net/?abeptuy abeptuy http://fimnoq.ua/?fimnoq fimnoq http://adefky.com/?adefky adefky https://cdfijlrz.ru/?cdfijlrz cdfijlrz https://dimnv.ru/?dimnv dimnv http://bcehz.com/?bcehz bcehz http://ahkpruvw.ru/?ahkpruvw ahkpruvw https://afnrw.com/?afnrw afnrw https://bclnpqtv.biz/?bclnpqtv bclnpqtv https://hmptuy.ua/?hmptuy hmptuy https://adfkx.com/?adfkx adfkx https://cdfgikp.com/?cdfgikp cdfgikp https://hjovwxz.net/?hjovwxz hjovwxz http://enoprsxz.ru/?enoprsxz enoprsxz https://dmryz.info/?dmryz dmryz http://ijopuwx.info/?ijopuwx ijopuwx https://dghoryz.ua/?dghoryz dghoryz https://gnpqsvy.biz/?gnpqsvy gnpqsvy https://jlmtwxy.ua/?jlmtwxy jlmtwxy http://clmpuvz.com/?clmpuvz clmpuvz https://gprtvx.ua/?gprtvx gprtvx http://bklptuxz.net/?bklptuxz bklptuxz https://bfknrvz.com/?bfknrvz bfknrvz http://dgkqrv.ua/?dgkqrv dgkqrv https://adkprstv.net/?adkprstv adkprstv https://afiqx.com/?afiqx afiqx https://bchiknpt.com/?bchiknpt bchiknpt http://hjklprx.biz/?hjklprx hjklprx http://bejoruw.net/?bejoruw bejoruw http://cegmnvz.biz/?cegmnvz cegmnvz http://gmnotyz.ua/?gmnotyz gmnotyz https://dfhikntw.net/?dfhikntw dfhikntw https://dfouv.biz/?dfouv dfouv https://cjlnotz.info/?cjlnotz cjlnotz https://denqtvwz.ua/?denqtvwz denqtvwz http://behlrwy.com/?behlrwy behlrwy http://drsuvxz.biz/?drsuvxz drsuvxz https://esuwyz.biz/?esuwyz esuwyz http://fhjntu.biz/?fhjntu fhjntu https://gqrswyz.biz/?gqrswyz gqrswyz https://mpswxz.biz/?mpswxz mpswxz http://dhjpsuv.net/?dhjpsuv dhjpsuv http://dfopruy.ru/?dfopruy dfopruy https://ehkloqsz.info/?ehkloqsz ehkloqsz https://adgjqru.biz/?adgjqru adgjqru https://ejmtvwz.biz/?ejmtvwz ejmtvwz http://fgoquw.net/?fgoquw fgoquw https://fhikqry.com/?fhikqry fhikqry https://cemqy.com/?cemqy cemqy http://otuxz.com/?otuxz otuxz http://fijlmnoq.info/?fijlmnoq fijlmnoq http://cgiouw.ru/?cgiouw cgiouw https://cdgijlnw.biz/?cdgijlnw cdgijlnw https://gmuvw.biz/?gmuvw gmuvw https://abdefgiu.info/?abdefgiu abdefgiu https://gkmqwy.info/?gkmqwy gkmqwy http://bfgkqstw.com/?bfgkqstw bfgkqstw https://bfgmnovx.net/?bfgmnovx bfgmnovx https://bkmnrsy.com/?bkmnrsy bkmnrsy http://chiuwx.ua/?chiuwx chiuwx http://chlvw.ua/?chlvw chlvw https://acfgilp.ru/?acfgilp acfgilp http://bghosv.com/?bghosv bghosv https://befgkpst.net/?befgkpst befgkpst http://jklmnwz.net/?jklmnwz jklmnwz https://dktxz.com/?dktxz dktxz http://aeknuwyz.ru/?aeknuwyz aeknuwyz https://aqtuv.ua/?aqtuv aqtuv http://bfjorw.info/?bfjorw bfjorw http://cfijuz.biz/?cfijuz cfijuz https://efjlnuv.net/?efjlnuv efjlnuv 
https://hkruvy.net/?hkruvy hkruvy http://defnswxy.biz/?defnswxy defnswxy https://fgjmqxy.biz/?fgjmqxy fgjmqxy https://deflmtu.ru/?deflmtu deflmtu https://cgmnr.info/?cgmnr cgmnr https://dgilnsvy.ua/?dgilnsvy dgilnsvy https://chjmosxy.biz/?chjmosxy chjmosxy https://abcdhln.net/?abcdhln abcdhln https://elstuyz.biz/?elstuyz elstuyz https://bjoptw.com/?bjoptw bjoptw https://bcdgquv.ru/?bcdgquv bcdgquv http://afiklst.com/?afiklst afiklst https://agnvwz.info/?agnvwz agnvwz http://bcfgkrvx.biz/?bcfgkrvx bcfgkrvx https://bdjmnuz.biz/?bdjmnuz bdjmnuz https://emqrwz.com/?emqrwz emqrwz https://egklqt.biz/?egklqt egklqt http://bimqr.com/?bimqr bimqr https://dhklnsv.net/?dhklnsv dhklnsv https://dvwxz.com/?dvwxz dvwxz https://dfghimu.com/?dfghimu dfghimu https://fostx.ru/?fostx fostx https://fjklnqux.ua/?fjklnqux fjklnqux https://cfhijnqu.info/?cfhijnqu cfhijnqu http://nptvy.ua/?nptvy nptvy http://bclnu.com/?bclnu bclnu http://eflntvyz.ru/?eflntvyz eflntvyz https://abcdpsx.ru/?abcdpsx abcdpsx https://cfipqx.info/?cfipqx cfipqx http://adforst.biz/?adforst adforst https://bfgmpqtv.biz/?bfgmpqtv bfgmpqtv http://cfkmnorv.ua/?cfkmnorv cfkmnorv http://dhjkuv.info/?dhjkuv dhjkuv https://bcejlsyz.info/?bcejlsyz bcejlsyz https://efinrsy.ua/?efinrsy efinrsy https://kmosw.net/?kmosw kmosw http://jmoxy.net/?jmoxy jmoxy http://agjloqy.biz/?agjloqy agjloqy https://dglmnpqy.com/?dglmnpqy dglmnpqy https://dijnqrsw.ua/?dijnqrsw dijnqrsw http://ijmnr.biz/?ijmnr ijmnr https://gjouwxy.ru/?gjouwxy gjouwxy https://klpuz.biz/?klpuz klpuz http://hijoux.net/?hijoux hijoux http://bikmn.net/?bikmn bikmn http://abcovwy.ua/?abcovwy abcovwy https://adfgioqx.ua/?adfgioqx adfgioqx http://beijknpq.info/?beijknpq beijknpq https://equvx.com/?equvx equvx https://cgmpr.net/?cgmpr cgmpr http://dfknor.com/?dfknor dfknor https://gopsvyz.ru/?gopsvyz gopsvyz http://chijkqsu.net/?chijkqsu chijkqsu https://bfmtx.biz/?bfmtx bfmtx http://cgijqrux.ua/?cgijqrux cgijqrux https://fklnovy.biz/?fklnovy fklnovy http://chkmosux.net/?chkmosux chkmosux https://cenprvw.info/?cenprvw cenprvw https://cfuxyz.ua/?cfuxyz cfuxyz https://bdgopy.ua/?bdgopy bdgopy https://ampwx.info/?ampwx ampwx http://cefmoqtx.info/?cefmoqtx cefmoqtx https://bcdflny.ua/?bcdflny bcdflny https://abqtux.ua/?abqtux abqtux https://bpstw.com/?bpstw bpstw http://cegmoqry.info/?cegmoqry cegmoqry http://aeghjlw.net/?aeghjlw aeghjlw https://egjswz.ru/?egjswz egjswz https://agiluv.info/?agiluv agiluv https://befimsu.biz/?befimsu befimsu https://abcdeiry.info/?abcdeiry abcdeiry http://bcqrxz.info/?bcqrxz bcqrxz https://eilntvxz.net/?eilntvxz eilntvxz https://abcjpt.ua/?abcjpt abcjpt https://adglox.ru/?adglox adglox https://adjnptw.ru/?adjnptw adjnptw http://aglrxz.biz/?aglrxz aglrxz https://abeqru.info/?abeqru abeqru https://cnoprsw.biz/?cnoprsw cnoprsw http://akoqx.com/?akoqx akoqx https://ijlmnptz.net/?ijlmnptz ijlmnptz https://ghimnuyz.com/?ghimnuyz ghimnuyz https://ghprs.ua/?ghprs ghprs https://lmntwxz.biz/?lmntwxz lmntwxz https://blopqxz.ua/?blopqxz blopqxz https://cqrxz.ua/?cqrxz cqrxz http://befmvwz.net/?befmvwz befmvwz https://abjlmnpr.com/?abjlmnpr abjlmnpr https://filotz.net/?filotz filotz http://hinsw.biz/?hinsw hinsw http://abfjklsy.net/?abfjklsy abfjklsy https://bhimnrwx.biz/?bhimnrwx bhimnrwx http://dejnotz.ru/?dejnotz dejnotz https://bcflsxz.info/?bcflsxz bcflsxz http://hklmsux.net/?hklmsux hklmsux https://ahknouz.ru/?ahknouz ahknouz https://cfknxy.ru/?cfknxy cfknxy http://acfntvx.net/?acfntvx acfntvx https://cdfksuvy.info/?cdfksuvy cdfksuvy https://abjkorx.ru/?abjkorx abjkorx 
https://afiklqvw.com/?afiklqvw afiklqvw http://adnovx.biz/?adnovx adnovx https://fglstyz.info/?fglstyz fglstyz https://ghikruy.biz/?ghikruy ghikruy http://ghnopsvz.biz/?ghnopsvz ghnopsvz http://abcgjnrw.com/?abcgjnrw abcgjnrw https://bijlnyz.ua/?bijlnyz bijlnyz http://ahiqrv.ru/?ahiqrv ahiqrv https://ajotw.biz/?ajotw ajotw http://bfhipuw.info/?bfhipuw bfhipuw http://acjmqry.ua/?acjmqry acjmqry http://efjnoqrz.info/?efjnoqrz efjnoqrz https://achkqst.info/?achkqst achkqst http://bcfpruyz.ua/?bcfpruyz bcfpruyz http://bcelv.com/?bcelv bcelv https://dfkoqsu.ua/?dfkoqsu dfkoqsu http://ekopq.ru/?ekopq ekopq https://abfkrsz.ua/?abfkrsz abfkrsz https://abenxy.net/?abenxy abenxy http://agiotu.biz/?agiotu agiotu https://bkpqr.info/?bkpqr bkpqr http://ahijq.ua/?ahijq ahijq http://eimps.ru/?eimps eimps https://abdstwz.net/?abdstwz abdstwz https://cdhjlu.biz/?cdhjlu cdhjlu http://gmrstyz.info/?gmrstyz gmrstyz https://djsvwyz.ua/?djsvwyz djsvwyz http://begjmorv.net/?begjmorv begjmorv http://ahlmoptz.biz/?ahlmoptz ahlmoptz http://dekoqx.ru/?dekoqx dekoqx http://fiprs.net/?fiprs fiprs http://cjklmr.biz/?cjklmr cjklmr https://aefgpquv.biz/?aefgpquv aefgpquv https://jlmopswx.net/?jlmopswx jlmopswx https://bfhjlsw.biz/?bfhjlsw bfhjlsw https://iknortux.com/?iknortux iknortux http://abhkoprv.biz/?abhkoprv abhkoprv https://cjnrwyz.ua/?cjnrwyz cjnrwyz http://djmnrvwz.info/?djmnrvwz djmnrvwz http://adgkrtyz.info/?adgkrtyz adgkrtyz https://bdhjsuv.ua/?bdhjsuv bdhjsuv http://acfgjor.biz/?acfgjor acfgjor http://cfilrtw.info/?cfilrtw cfilrtw https://fhijz.com/?fhijz fhijz http://hkmntxy.ru/?hkmntxy hkmntxy http://fhilsuy.ua/?fhilsuy fhilsuy http://jmnpwx.ru/?jmnpwx jmnpwx http://gikltvwy.ru/?gikltvwy gikltvwy http://jopqy.biz/?jopqy jopqy http://bdlqry.info/?bdlqry bdlqry https://ejlvwz.com/?ejlvwz ejlvwz http://ceijvz.net/?ceijvz ceijvz http://cfijntyz.info/?cfijntyz cfijntyz http://efilotuv.com/?efilotuv efilotuv http://cfmpyz.com/?cfmpyz cfmpyz http://dhpqz.com/?dhpqz dhpqz http://dlpxz.com/?dlpxz dlpxz http://aefjlyz.net/?aefjlyz aefjlyz https://ceijorz.info/?ceijorz ceijorz http://dkmnuvx.info/?dkmnuvx dkmnuvx https://kprswx.net/?kprswx kprswx https://dgjpwz.ru/?dgjpwz dgjpwz https://abfjlnt.ru/?abfjlnt abfjlnt http://efgips.com/?efgips efgips http://cehnpu.ru/?cehnpu cehnpu https://fstuw.net/?fstuw fstuw http://dmorz.net/?dmorz dmorz https://celmrwz.info/?celmrwz celmrwz https://jprvx.com/?jprvx jprvx https://cefjkxz.biz/?cefjkxz cefjkxz http://behjlv.biz/?behjlv behjlv http://bdiku.net/?bdiku bdiku https://ghnpxz.ru/?ghnpxz ghnpxz https://bcpqvwz.ua/?bcpqvwz bcpqvwz https://gijlnqrs.biz/?gijlnqrs gijlnqrs http://adfjlrsz.info/?adfjlrsz adfjlrsz https://afiptvz.biz/?afiptvz afiptvz http://acikors.com/?acikors acikors https://dfmpq.ru/?dfmpq dfmpq http://abhjmsyz.ru/?abhjmsyz abhjmsyz https://dejkq.com/?dejkq dejkq http://abcltx.com/?abcltx abcltx https://bdfjopu.net/?bdfjopu bdfjopu http://bipvy.com/?bipvy bipvy http://dlqsx.biz/?dlqsx dlqsx http://inqrstv.info/?inqrstv inqrstv https://dknwz.ru/?dknwz dknwz http://cdgimqsu.ru/?cdgimqsu cdgimqsu https://akmotuv.info/?akmotuv akmotuv http://kmntvyz.ru/?kmntvyz kmntvyz https://bimxz.net/?bimxz bimxz http://afjoqyz.ru/?afjoqyz afjoqyz https://eikuwy.ua/?eikuwy eikuwy https://bmptw.biz/?bmptw bmptw http://fgltwx.com/?fgltwx fgltwx http://chlnpqrs.com/?chlnpqrs chlnpqrs https://abhux.net/?abhux abhux http://adfhmw.info/?adfhmw adfhmw http://bgilt.com/?bgilt bgilt https://cefotu.biz/?cefotu cefotu https://adfruvwy.biz/?adfruvwy adfruvwy https://fgoqstu.info/?fgoqstu 
fgoqstu https://cdilnqvy.info/?cdilnqvy cdilnqvy https://adghruw.ua/?adghruw adghruw https://ehlnrsuy.biz/?ehlnrsuy ehlnrsuy http://abfijlnr.info/?abfijlnr abfijlnr https://jkpty.biz/?jkpty jkpty https://cdlpv.biz/?cdlpv cdlpv https://ginoxy.ua/?ginoxy ginoxy https://ciknpquy.net/?ciknpquy ciknpquy https://egijstw.biz/?egijstw egijstw http://cdikopvz.info/?cdikopvz cdikopvz https://egjklnvy.info/?egjklnvy egjklnvy https://fjkmvw.info/?fjkmvw fjkmvw http://dfkmv.com/?dfkmv dfkmv https://efgloptu.ru/?efgloptu efgloptu https://fgotz.ua/?fgotz fgotz http://efhklv.info/?efhklv efhklv https://befmrwy.info/?befmrwy befmrwy https://bklnsvy.biz/?bklnsvy bklnsvy http://flmoz.ru/?flmoz flmoz http://aiopqrtu.biz/?aiopqrtu aiopqrtu https://fjrsty.info/?fjrsty fjrsty http://behimuwy.ru/?behimuwy behimuwy https://iknqvz.com/?iknqvz iknqvz http://cdemp.ru/?cdemp cdemp http://eglpqsvz.net/?eglpqsvz eglpqsvz https://dluvz.net/?dluvz dluvz http://beghsuy.net/?beghsuy beghsuy http://ekmqsvz.net/?ekmqsvz ekmqsvz http://ehinz.net/?ehinz ehinz http://gipvz.com/?gipvz gipvz http://efhlwxyz.biz/?efhlwxyz efhlwxyz http://blors.ru/?blors blors http://dnqry.info/?dnqry dnqry https://cghnrux.biz/?cghnrux cghnrux http://dfinqtwx.ru/?dfinqtwx dfinqtwx http://efotxz.info/?efotxz efotxz http://aceglpqu.net/?aceglpqu aceglpqu https://abilr.ua/?abilr abilr http://bcgqw.ru/?bcgqw bcgqw http://cdinqv.biz/?cdinqv cdinqv https://cdfkp.com/?cdfkp cdfkp https://aelouvz.ua/?aelouvz aelouvz http://dklstuz.net/?dklstuz dklstuz https://cejqw.ru/?cejqw cejqw https://bdgkuy.com/?bdgkuy bdgkuy https://hkoprvxy.ru/?hkoprvxy hkoprvxy https://filntx.net/?filntx filntx http://cfgsty.ua/?cfgsty cfgsty https://bfgptx.net/?bfgptx bfgptx http://belpsu.biz/?belpsu belpsu http://deilmoy.biz/?deilmoy deilmoy https://kqruv.info/?kqruv kqruv https://acdst.net/?acdst acdst http://glmvy.ru/?glmvy glmvy https://abcmnu.net/?abcmnu abcmnu http://adehinxy.ru/?adehinxy adehinxy http://ahmnwy.biz/?ahmnwy ahmnwy http://morst.ru/?morst morst http://bfils.ua/?bfils bfils http://abcmnvw.biz/?abcmnvw abcmnvw https://bchlqsux.net/?bchlqsux bchlqsux http://fhikpqvy.net/?fhikpqvy fhikpqvy https://bejkqu.info/?bejkqu bejkqu https://cdehnsxy.ru/?cdehnsxy cdehnsxy http://bcdhnopx.info/?bcdhnopx bcdhnopx https://elpuz.ua/?elpuz elpuz https://bdeftvw.info/?bdeftvw bdeftvw http://dgmnou.ru/?dgmnou dgmnou https://aiklnvx.com/?aiklnvx aiklnvx http://ijoqrsvw.com/?ijoqrsvw ijoqrsvw http://cdehmquy.biz/?cdehmquy cdehmquy https://gjoqx.net/?gjoqx gjoqx https://bhlnpqvw.net/?bhlnpqvw bhlnpqvw http://bdfiuwyz.net/?bdfiuwyz bdfiuwyz https://abesty.info/?abesty abesty https://gjlmtuv.net/?gjlmtuv gjlmtuv http://ivxyz.info/?ivxyz ivxyz http://aeops.info/?aeops aeops http://acdlv.ru/?acdlv acdlv http://fgimqw.net/?fgimqw fgimqw https://fgkrxz.info/?fgkrxz fgkrxz http://abfhnvxy.ua/?abfhnvxy abfhnvxy https://bcdeipsy.ru/?bcdeipsy bcdeipsy http://bdimvwy.ru/?bdimvwy bdimvwy http://fnpqry.info/?fnpqry fnpqry https://aklstwx.ru/?aklstwx aklstwx http://bjloq.com/?bjloq bjloq http://dkmsuv.info/?dkmsuv dkmsuv http://afinptuw.ru/?afinptuw afinptuw https://bijnqvx.biz/?bijnqvx bijnqvx https://behlxz.info/?behlxz behlxz http://bgjkquyz.biz/?bgjkquyz bgjkquyz http://aehnpqw.net/?aehnpqw aehnpqw http://cghjmpqu.com/?cghjmpqu cghjmpqu http://aboruy.com/?aboruy aboruy http://fjryz.ru/?fjryz fjryz https://cdnqy.info/?cdnqy cdnqy http://bfgjmost.net/?bfgjmost bfgjmost https://bhnou.ua/?bhnou bhnou https://dhilm.com/?dhilm dhilm http://abijmrtx.net/?abijmrtx abijmrtx http://acefnv.com/?acefnv 
acefnv http://gilmtu.com/?gilmtu gilmtu https://egjltuz.biz/?egjltuz egjltuz http://afghpsv.net/?afghpsv afghpsv http://chkltvx.ua/?chkltvx chkltvx http://befkyz.com/?befkyz befkyz https://bcfhmprw.com/?bcfhmprw bcfhmprw https://bjklpqs.info/?bjklpqs bjklpqs http://cegpt.ua/?cegpt cegpt https://klntvw.biz/?klntvw klntvw https://ejkmq.ua/?ejkmq ejkmq https://bgqvx.info/?bgqvx bgqvx https://dqrsuvw.com/?dqrsuvw dqrsuvw http://bcekrsu.net/?bcekrsu bcekrsu https://dlmqwy.com/?dlmqwy dlmqwy http://abhnsvz.ru/?abhnsvz abhnsvz https://bdfhjoqr.info/?bdfhjoqr bdfhjoqr http://ceijlprw.net/?ceijlprw ceijlprw https://ijlqsw.net/?ijlqsw ijlqsw https://einrvxz.net/?einrvxz einrvxz https://ikmtxz.net/?ikmtxz ikmtxz http://adkmqsz.biz/?adkmqsz adkmqsz http://bchnox.com/?bchnox bchnox https://ckptvw.com/?ckptvw ckptvw http://dgops.com/?dgops dgops https://bfnrtxy.info/?bfnrtxy bfnrtxy http://bdnpstu.info/?bdnpstu bdnpstu http://dlnprsx.ua/?dlnprsx dlnprsx https://gjnrx.com/?gjnrx gjnrx http://abcsvz.com/?abcsvz abcsvz http://gknrsv.ru/?gknrsv gknrsv https://fkmouxz.com/?fkmouxz fkmouxz http://bcpsx.com/?bcpsx bcpsx http://bckoqx.info/?bckoqx bckoqx https://bcdepq.ru/?bcdepq bcdepq http://kqswy.net/?kqswy kqswy https://bdgjnuy.ru/?bdgjnuy bdgjnuy https://cgiklsvz.info/?cgiklsvz cgiklsvz https://bfgnprwy.net/?bfgnprwy bfgnprwy https://bejmnqrw.com/?bejmnqrw bejmnqrw http://gilopqwy.ua/?gilopqwy gilopqwy https://fjopry.ru/?fjopry fjopry http://kmrvwy.biz/?kmrvwy kmrvwy https://eglotuv.net/?eglotuv eglotuv http://cdhnsvwx.com/?cdhnsvwx cdhnsvwx http://ahmqv.com/?ahmqv ahmqv https://fkqxy.ua/?fkqxy fkqxy https://gjoquvwx.net/?gjoquvwx gjoquvwx https://abcfouw.biz/?abcfouw abcfouw https://abcekn.ru/?abcekn abcekn http://ijnruvz.com/?ijnruvz ijnruvz http://mqruv.ua/?mqruv mqruv https://afklmu.biz/?afklmu afklmu https://abcgptyz.biz/?abcgptyz abcgptyz https://bfjsuz.ua/?bfjsuz bfjsuz http://ahmpsx.ua/?ahmpsx ahmpsx https://iklns.com/?iklns iklns https://hlnvx.ua/?hlnvx hlnvx http://adiklmp.com/?adiklmp adiklmp https://fghprs.biz/?fghprs fghprs http://acmopvw.ru/?acmopvw acmopvw http://cdmuw.ua/?cdmuw cdmuw http://bfkpwxz.biz/?bfkpwxz bfkpwxz http://lqrtwy.ru/?lqrtwy lqrtwy https://aclmruvw.biz/?aclmruvw aclmruvw http://bcfhmsz.net/?bcfhmsz bcfhmsz http://bcgknpry.biz/?bcgknpry bcgknpry https://bflmnoq.info/?bflmnoq bflmnoq https://bepsy.info/?bepsy bepsy http://gknosuw.net/?gknosuw gknosuw http://ajoruvz.com/?ajoruvz ajoruvz http://cotxz.ru/?cotxz cotxz https://deintwx.net/?deintwx deintwx https://eghijkuy.com/?eghijkuy eghijkuy http://dginwx.ua/?dginwx dginwx http://aenoxy.net/?aenoxy aenoxy https://dghlpuvx.ru/?dghlpuvx dghlpuvx http://cefor.info/?cefor cefor https://afioz.biz/?afioz afioz https://deklmuvw.com/?deklmuvw deklmuvw http://cfgimox.biz/?cfgimox cfgimox http://cinquvz.ua/?cinquvz cinquvz https://acilsy.ua/?acilsy acilsy http://bfhlnop.net/?bfhlnop bfhlnop https://fijluz.net/?fijluz fijluz https://dfjnostv.ua/?dfjnostv dfjnostv https://aginsty.net/?aginsty aginsty http://iknxyz.biz/?iknxyz iknxyz http://agopu.net/?agopu agopu http://bdkopst.ua/?bdkopst bdkopst https://aghlox.net/?aghlox aghlox https://cdgpu.com/?cdgpu cdgpu https://jkmpuwyz.net/?jkmpuwyz jkmpuwyz http://ijlmuvw.com/?ijlmuvw ijlmuvw https://gkmrtvwy.biz/?gkmrtvwy gkmrtvwy http://aefhty.info/?aefhty aefhty http://afjklsux.net/?afjklsux afjklsux http://mprvyz.biz/?mprvyz mprvyz https://bdjkpx.ua/?bdjkpx bdjkpx http://aciknovw.net/?aciknovw aciknovw https://aftuxy.ua/?aftuxy aftuxy https://ajnpvxyz.ua/?ajnpvxyz ajnpvxyz 
http://fijqrv.com/?fijqrv fijqrv https://ahmquvw.info/?ahmquvw ahmquvw https://ghklmqvy.com/?ghklmqvy ghklmqvy http://aehkty.com/?aehkty aehkty https://aclpsuz.ru/?aclpsuz aclpsuz https://ghoqstvy.com/?ghoqstvy ghoqstvy https://dghnqrty.ru/?dghnqrty dghnqrty https://agorx.com/?agorx agorx https://adejrv.ru/?adejrv adejrv http://hkopuvwz.net/?hkopuvwz hkopuvwz http://bcdeghvx.ru/?bcdeghvx bcdeghvx http://bdfgruvy.com/?bdfgruvy bdfgruvy https://cmuvy.biz/?cmuvy cmuvy https://abjkmp.ua/?abjkmp abjkmp http://abcfirvz.ru/?abcfirvz abcfirvz http://clqruwxy.com/?clqruwxy clqruwxy https://fhikoq.net/?fhikoq fhikoq https://bfgnsvwz.biz/?bfgnsvwz bfgnsvwz http://fhjkor.biz/?fhjkor fhjkor https://dfgkouyz.com/?dfgkouyz dfgkouyz https://acfhmqtz.info/?acfhmqtz acfhmqtz https://ejmnpt.biz/?ejmnpt ejmnpt http://dfgnrtu.ru/?dfgnrtu dfgnrtu https://diknswx.biz/?diknswx diknswx https://hjnsz.info/?hjnsz hjnsz https://cdjlq.ua/?cdjlq cdjlq https://iowxy.ru/?iowxy iowxy https://fhjltux.net/?fhjltux fhjltux http://ablmv.com/?ablmv ablmv http://bdiorswz.ru/?bdiorswz bdiorswz http://cnopr.ru/?cnopr cnopr http://adgopz.ru/?adgopz adgopz http://bdefimsu.ru/?bdefimsu bdefimsu http://ghmqr.net/?ghmqr ghmqr https://cdetv.ru/?cdetv cdetv https://bdkortu.biz/?bdkortu bdkortu http://djopv.biz/?djopv djopv http://aiknorw.biz/?aiknorw aiknorw https://bfmqr.net/?bfmqr bfmqr http://klmox.com/?klmox klmox https://fimqsv.biz/?fimqsv fimqsv https://fkopq.ru/?fkopq fkopq http://aglpx.com/?aglpx aglpx http://dfhquv.biz/?dfhquv dfhquv https://bfgjoq.com/?bfgjoq bfgjoq https://cklnz.ru/?cklnz cklnz http://cfijqrv.net/?cfijqrv cfijqrv https://aekuvxz.net/?aekuvxz aekuvxz https://ghijkstx.info/?ghijkstx ghijkstx http://abilmsz.info/?abilmsz abilmsz https://bfhsvyz.ru/?bfhsvyz bfhsvyz http://eqsuv.ua/?eqsuv eqsuv http://ghinqsx.biz/?ghinqsx ghinqsx https://acgkmnpu.com/?acgkmnpu acgkmnpu http://elrty.ru/?elrty elrty https://cdgkoz.net/?cdgkoz cdgkoz http://dimnosw.info/?dimnosw dimnosw http://abegnwx.biz/?abegnwx abegnwx http://abejnsx.net/?abejnsx abejnsx https://emnorxz.ru/?emnorxz emnorxz http://abfsz.ua/?abfsz abfsz https://imqrvy.net/?imqrvy imqrvy https://cdgjmqrv.com/?cdgjmqrv cdgjmqrv http://bdfostyz.net/?bdfostyz bdfostyz https://lnorxz.biz/?lnorxz lnorxz http://afgruvw.ru/?afgruvw afgruvw https://cefivyz.com/?cefivyz cefivyz https://gmqrt.com/?gmqrt gmqrt https://acdimqrw.info/?acdimqrw acdimqrw http://abchjm.ru/?abchjm abchjm http://alnqs.biz/?alnqs alnqs https://ijqwz.ua/?ijqwz ijqwz https://adfjstx.biz/?adfjstx adfjstx https://cghlqt.net/?cghlqt cghlqt http://ijrvyz.ua/?ijrvyz ijrvyz https://ahluv.net/?ahluv ahluv http://acdfgn.net/?acdfgn acdfgn https://hijnru.ua/?hijnru hijnru https://ghnuvy.ru/?ghnuvy ghnuvy http://aeiky.com/?aeiky aeiky https://bdilnov.info/?bdilnov bdilnov https://cjlnpqru.biz/?cjlnpqru cjlnpqru https://cgjqtv.biz/?cgjqtv cgjqtv https://abcnsvz.biz/?abcnsvz abcnsvz http://abegopsu.biz/?abegopsu abegopsu https://bcmstuvy.net/?bcmstuvy bcmstuvy https://bhjknsuz.net/?bhjknsuz bhjknsuz http://aceklno.info/?aceklno aceklno http://diotvyz.ru/?diotvyz diotvyz https://aefkmruw.info/?aefkmruw aefkmruw https://cdghlmnz.info/?cdghlmnz cdghlmnz https://bcgijmow.net/?bcgijmow bcgijmow http://moruwx.ua/?moruwx moruwx https://bhjnrt.ru/?bhjnrt bhjnrt https://eimoptu.ua/?eimoptu eimoptu https://achptv.info/?achptv achptv https://boptux.biz/?boptux boptux http://acmovz.ua/?acmovz acmovz https://abfmnpw.info/?abfmnpw abfmnpw https://afiknvz.biz/?afiknvz afiknvz http://ehnps.ua/?ehnps ehnps https://dnpuw.biz/?dnpuw 
dnpuw https://abflmns.biz/?abflmns abflmns https://fmuyz.biz/?fmuyz fmuyz http://bejkvz.info/?bejkvz bejkvz https://dfkmnyz.ua/?dfkmnyz dfkmnyz http://cfgnsvx.biz/?cfgnsvx cfgnsvx http://abdjsw.ru/?abdjsw abdjsw https://abelrvx.biz/?abelrvx abelrvx https://abjmoy.ru/?abjmoy abjmoy http://hmnqz.net/?hmnqz hmnqz https://dhkntv.ru/?dhkntv dhkntv http://bgqsyz.biz/?bgqsyz bgqsyz http://eghkmsux.net/?eghkmsux eghkmsux https://acehlrsv.info/?acehlrsv acehlrsv https://lnpruvy.com/?lnpruvy lnpruvy http://cefmtu.ua/?cefmtu cefmtu https://hjnqrt.net/?hjnqrt hjnqrt http://ceiktvw.ru/?ceiktvw ceiktvw http://cfjmu.info/?cfjmu cfjmu https://cfhmn.com/?cfhmn cfhmn http://cdfqr.net/?cdfqr cdfqr http://adelmvz.ru/?adelmvz adelmvz http://abflnsvx.com/?abflnsvx abflnsvx http://bjpqv.biz/?bjpqv bjpqv https://fjpqux.biz/?fjpqux fjpqux http://fpqtvyz.net/?fpqtvyz fpqtvyz http://cdflstwz.net/?cdflstwz cdflstwz https://egjostux.ua/?egjostux egjostux https://cjlnpy.ua/?cjlnpy cjlnpy https://bcfgmz.com/?bcfgmz bcfgmz http://dfnouz.ua/?dfnouz dfnouz http://cfikqv.net/?cfikqv cfikqv https://dehkqw.net/?dehkqw dehkqw http://fhkqs.com/?fhkqs fhkqs http://dijkl.com/?dijkl dijkl http://aeglu.com/?aeglu aeglu http://dgikmpy.info/?dgikmpy dgikmpy https://brsuz.info/?brsuz brsuz http://acfglopu.ru/?acfglopu acfglopu https://cdjqx.net/?cdjqx cdjqx https://aefimst.biz/?aefimst aefimst http://bfikmuw.net/?bfikmuw bfikmuw http://abhioqvw.ua/?abhioqvw abhioqvw http://abdilmo.net/?abdilmo abdilmo https://bhlovx.info/?bhlovx bhlovx https://fgvwxz.info/?fgvwxz fgvwxz http://afhjpqsv.biz/?afhjpqsv afhjpqsv http://deilrsv.com/?deilrsv deilrsv http://cijlmnt.ru/?cijlmnt cijlmnt https://bfkmtw.ua/?bfkmtw bfkmtw https://adkxy.ru/?adkxy adkxy https://blpqxy.com/?blpqxy blpqxy https://efgilor.net/?efgilor efgilor https://abcdky.info/?abcdky abcdky https://befgikqu.ru/?befgikqu befgikqu https://ipsuvwy.com/?ipsuvwy ipsuvwy https://fjlsu.biz/?fjlsu fjlsu https://kpquw.com/?kpquw kpquw https://bfjkoqvy.net/?bfjkoqvy bfjkoqvy http://dijopsz.biz/?dijopsz dijopsz http://acfhrsz.ua/?acfhrsz acfhrsz https://bfhrstvy.ua/?bfhrstvy bfhrstvy http://afglw.info/?afglw afglw https://bcjqrvw.net/?bcjqrvw bcjqrvw https://hiotz.ru/?hiotz hiotz https://abefmswx.biz/?abefmswx abefmswx http://acgkopyz.ua/?acgkopyz acgkopyz https://fijln.com/?fijln fijln http://befmquwz.info/?befmquwz befmquwz https://hijory.ua/?hijory hijory http://cdfhr.info/?cdfhr cdfhr https://ahiksx.ua/?ahiksx ahiksx https://aghmxy.ru/?aghmxy aghmxy https://hnopqt.net/?hnopqt hnopqt https://fiklnrsw.ru/?fiklnrsw fiklnrsw http://hknpv.ru/?hknpv hknpv https://abdfmotw.net/?abdfmotw abdfmotw http://bcfhpvx.biz/?bcfhpvx bcfhpvx http://beimrty.net/?beimrty beimrty http://dgnsvwy.ua/?dgnsvwy dgnsvwy https://aghjstvw.com/?aghjstvw aghjstvw http://gpqsuxy.ru/?gpqsuxy gpqsuxy https://finop.info/?finop finop http://bcdhsy.biz/?bcdhsy bcdhsy http://cfhmnvy.net/?cfhmnvy cfhmnvy https://hjopuw.net/?hjopuw hjopuw https://akmou.biz/?akmou akmou https://fgkntuw.biz/?fgkntuw fgkntuw https://adfjsw.com/?adfjsw adfjsw https://bdjmrsty.biz/?bdjmrsty bdjmrsty http://grtuz.ru/?grtuz grtuz https://cgiqw.info/?cgiqw cgiqw http://acenpvw.ru/?acenpvw acenpvw https://dmnrz.ru/?dmnrz dmnrz http://deglqtu.com/?deglqtu deglqtu http://ilpqrtuw.com/?ilpqrtuw ilpqrtuw https://gnoqvxz.com/?gnoqvxz gnoqvxz https://hijkmsu.ru/?hijkmsu hijkmsu http://cistxyz.biz/?cistxyz cistxyz https://adklmns.ru/?adklmns adklmns https://hmqrwy.ua/?hmqrwy hmqrwy https://crsuv.ru/?crsuv crsuv http://abejkmoz.ua/?abejkmoz abejkmoz 
http://abjpqu.ua/?abjpqu abjpqu https://ehjkopuz.info/?ehjkopuz ehjkopuz http://iovwz.ua/?iovwz iovwz https://emntvwx.net/?emntvwx emntvwx http://eghksx.info/?eghksx eghksx https://bmnps.com/?bmnps bmnps https://cinopuv.net/?cinopuv cinopuv http://efijnptz.biz/?efijnptz efijnptz http://cehilnp.net/?cehilnp cehilnp http://denprw.ru/?denprw denprw http://adglpwz.net/?adglpwz adglpwz http://dehimosu.biz/?dehimosu dehimosu https://abchprwy.com/?abchprwy abchprwy http://dlnory.info/?dlnory dlnory http://cehnwyz.com/?cehnwyz cehnwyz http://aghlopy.ua/?aghlopy aghlopy http://fhjksuvz.com/?fhjksuvz fhjksuvz http://bijmy.com/?bijmy bijmy https://adhjw.com/?adhjw adhjw https://emnprwyz.info/?emnprwyz emnprwyz http://dgmnw.ru/?dgmnw dgmnw https://cdgimruv.com/?cdgimruv cdgimruv http://bcdjqst.biz/?bcdjqst bcdjqst https://lmquv.ru/?lmquv lmquv http://degiptv.info/?degiptv degiptv http://hlnvy.ru/?hlnvy hlnvy http://gklmvx.info/?gklmvx gklmvx http://behlpstz.com/?behlpstz behlpstz https://efginz.com/?efginz efginz https://deilmnru.biz/?deilmnru deilmnru http://bdehoqux.ru/?bdehoqux bdehoqux http://doqvy.info/?doqvy doqvy http://jlorw.info/?jlorw jlorw https://cfmtvy.net/?cfmtvy cfmtvy http://agkmqvxy.biz/?agkmqvxy agkmqvxy https://ehnrux.ru/?ehnrux ehnrux https://ahlmpyz.biz/?ahlmpyz ahlmpyz https://dhilqxyz.com/?dhilqxyz dhilqxyz http://beosz.biz/?beosz beosz http://jkmtvx.ru/?jkmtvx jkmtvx http://aefiuw.ua/?aefiuw aefiuw http://bgimy.info/?bgimy bgimy https://acdfhimy.info/?acdfhimy acdfhimy http://bdfpv.info/?bdfpv bdfpv https://lnopux.info/?lnopux lnopux https://iknqstux.ua/?iknqstux iknqstux https://binxy.ru/?binxy binxy http://abflsv.net/?abflsv abflsv http://cghkmtvz.info/?cghkmtvz cghkmtvz http://flpqrxz.biz/?flpqrxz flpqrxz https://cfgnvz.ua/?cfgnvz cfgnvz https://jtuvx.com/?jtuvx jtuvx https://fgikl.ru/?fgikl fgikl http://ehikrs.ru/?ehikrs ehikrs http://beghqx.ru/?beghqx beghqx http://dhijlrsw.com/?dhijlrsw dhijlrsw http://elqst.ru/?elqst elqst https://abfklmns.biz/?abfklmns abfklmns https://cilnorw.ua/?cilnorw cilnorw https://anoqu.ua/?anoqu anoqu http://adikquz.com/?adikquz adikquz http://abcpwyz.ua/?abcpwyz abcpwyz https://fiqwxz.ru/?fiqwxz fiqwxz http://bcivw.ru/?bcivw bcivw https://acdfnrz.ua/?acdfnrz acdfnrz https://jmopqtw.info/?jmopqtw jmopqtw https://abdeghtv.biz/?abdeghtv abdeghtv http://aceoprw.net/?aceoprw aceoprw http://forvy.ua/?forvy forvy https://abgjknpv.com/?abgjknpv abgjknpv http://bcdgimr.info/?bcdgimr bcdgimr https://flntvw.biz/?flntvw flntvw http://bcdnqvx.net/?bcdnqvx bcdnqvx http://defjnqvw.com/?defjnqvw defjnqvw http://ekmouvwy.biz/?ekmouvwy ekmouvwy http://cijpqsxz.biz/?cijpqsxz cijpqsxz http://dopstyz.ua/?dopstyz dopstyz http://cinow.com/?cinow cinow http://cdejmrs.info/?cdejmrs cdejmrs http://fgknprwz.info/?fgknprwz fgknprwz https://binpquxy.com/?binpquxy binpquxy https://afhqrtu.biz/?afhqrtu afhqrtu http://cemqtv.ru/?cemqtv cemqtv http://bgnqsz.info/?bgnqsz bgnqsz http://cdfjprsv.ua/?cdfjprsv cdfjprsv https://jmqstz.biz/?jmqstz jmqstz http://adhuw.info/?adhuw adhuw http://deswz.ua/?deswz deswz https://ehkmnw.com/?ehkmnw ehkmnw http://kmsxy.ua/?kmsxy kmsxy http://benotvw.info/?benotvw benotvw https://dhinops.biz/?dhinops dhinops https://hklnpqtv.ru/?hklnpqtv hklnpqtv https://gjnqrtwx.info/?gjnqrtwx gjnqrtwx https://dortvw.biz/?dortvw dortvw http://cdghjp.com/?cdghjp cdghjp http://abchnoqx.com/?abchnoqx abchnoqx http://abnsz.biz/?abnsz abnsz https://acdnqwy.info/?acdnqwy acdnqwy http://befilpv.com/?befilpv befilpv http://fijrux.biz/?fijrux fijrux 
https://aknqrtuz.ru/?aknqrtuz aknqrtuz http://abfoqwz.ru/?abfoqwz abfoqwz https://defikv.info/?defikv defikv http://abcoqtz.com/?abcoqtz abcoqtz https://dfirxz.net/?dfirxz dfirxz http://bdfipqux.net/?bdfipqux bdfipqux http://bcgqx.net/?bcgqx bcgqx https://bdfhl.ua/?bdfhl bdfhl http://ahikmpqt.info/?ahikmpqt ahikmpqt https://aghqrt.com/?aghqrt aghqrt https://afhosxz.ru/?afhosxz afhosxz https://ehsvx.ru/?ehsvx ehsvx http://gknquvw.info/?gknquvw gknquvw https://gilmpqw.ru/?gilmpqw gilmpqw http://fhopy.biz/?fhopy fhopy https://bcmpux.net/?bcmpux bcmpux https://adfmrsuy.info/?adfmrsuy adfmrsuy https://fmtuvx.net/?fmtuvx fmtuvx https://defjmswy.net/?defjmswy defjmswy https://ahijmrsz.info/?ahijmrsz ahijmrsz https://cerwx.biz/?cerwx cerwx https://behqz.biz/?behqz behqz https://behioqyz.ru/?behioqyz behioqyz https://admrsvxz.com/?admrsvxz admrsvxz https://cefhipqw.com/?cefhipqw cefhipqw http://deopsx.ru/?deopsx deopsx http://acfgksz.info/?acfgksz acfgksz http://hjlmqst.info/?hjlmqst hjlmqst http://efgnrsux.net/?efgnrsux efgnrsux http://adflovz.info/?adflovz adflovz http://acopu.ua/?acopu acopu https://chilnqrs.ru/?chilnqrs chilnqrs https://blosy.net/?blosy blosy https://gijnpuxy.com/?gijnpuxy gijnpuxy https://bcemuvwz.ua/?bcemuvwz bcemuvwz http://bdfqwx.net/?bdfqwx bdfqwx http://dfghilr.net/?dfghilr dfghilr http://bdlqstv.com/?bdlqstv bdlqstv http://adhjnrsz.com/?adhjnrsz adhjnrsz http://adiopqx.com/?adiopqx adiopqx http://ehnoqw.info/?ehnoqw ehnoqw https://dgpsx.net/?dgpsx dgpsx http://afgrt.ru/?afgrt afgrt http://bghipuvz.ua/?bghipuvz bghipuvz http://egjmvw.com/?egjmvw egjmvw https://eginoswz.com/?eginoswz eginoswz http://bekuv.info/?bekuv bekuv http://bglno.info/?bglno bglno http://hkoptxz.com/?hkoptxz hkoptxz https://aikqz.biz/?aikqz aikqz https://oqvxy.net/?oqvxy oqvxy http://kruwy.net/?kruwy kruwy http://fhlqty.net/?fhlqty fhlqty http://chpuvyz.ru/?chpuvyz chpuvyz https://aciloqt.ua/?aciloqt aciloqt http://cklsw.info/?cklsw cklsw https://fghkmvy.info/?fghkmvy fghkmvy http://belmpvz.biz/?belmpvz belmpvz http://eimpxz.ru/?eimpxz eimpxz http://cdjstuz.info/?cdjstuz cdjstuz https://adivy.com/?adivy adivy https://alnrty.ru/?alnrty alnrty https://hkmntz.info/?hkmntz hkmntz https://dkpqr.com/?dkpqr dkpqr https://bfntv.ru/?bfntv bfntv https://inuvwyz.ru/?inuvwyz inuvwyz https://jkmrvyz.ua/?jkmrvyz jkmrvyz http://bfipruw.biz/?bfipruw bfipruw http://cjklos.ua/?cjklos cjklos http://afglru.ru/?afglru afglru http://defrt.ua/?defrt defrt https://afghsvy.ua/?afghsvy afghsvy http://cehiou.info/?cehiou cehiou https://mnrvz.com/?mnrvz mnrvz http://fgtuxz.biz/?fgtuxz fgtuxz http://egiqtuy.ua/?egiqtuy egiqtuy https://dilnv.ru/?dilnv dilnv http://abeny.biz/?abeny abeny http://aijvxz.biz/?aijvxz aijvxz https://abdhrsx.net/?abdhrsx abdhrsx http://bfgmrxy.ru/?bfgmrxy bfgmrxy https://fjkow.info/?fjkow fjkow https://fkruvx.biz/?fkruvx fkruvx http://efhjsw.biz/?efhjsw efhjsw http://djkmops.info/?djkmops djkmops https://cequyz.ru/?cequyz cequyz https://hmswz.com/?hmswz hmswz http://lnotu.info/?lnotu lnotu http://bdjqu.info/?bdjqu bdjqu http://adejrxy.ua/?adejrxy adejrxy http://cdgikmps.ru/?cdgikmps cdgikmps http://dhmostwy.net/?dhmostwy dhmostwy https://cdhlquz.ua/?cdhlquz cdhlquz https://bilnor.ua/?bilnor bilnor https://fnqxz.com/?fnqxz fnqxz https://bijnuwz.com/?bijnuwz bijnuwz http://abcgikwy.ua/?abcgikwy abcgikwy https://begimrv.ua/?begimrv begimrv https://jlprv.net/?jlprv jlprv http://efhsz.ua/?efhsz efhsz https://dgiqs.ua/?dgiqs dgiqs http://bgiprx.net/?bgiprx bgiprx https://cdklmqv.biz/?cdklmqv cdklmqv 
https://adgnowz.ua/?adgnowz adgnowz http://cfjnorty.ru/?cfjnorty cfjnorty https://bcfhpqvy.ru/?bcfhpqvy bcfhpqvy http://beiqv.biz/?beiqv beiqv https://bjmnu.com/?bjmnu bjmnu https://fgjoquvz.ru/?fgjoquvz fgjoquvz http://fkpwx.info/?fkpwx fkpwx http://dfghty.com/?dfghty dfghty https://dhlmtvxy.biz/?dhlmtvxy dhlmtvxy https://bejuy.com/?bejuy bejuy http://cdikry.com/?cdikry cdikry https://bfhkrty.info/?bfhkrty bfhkrty https://cekpst.com/?cekpst cekpst https://ceknpx.ru/?ceknpx ceknpx https://gjpwy.com/?gjpwy gjpwy https://ceouwxyz.ru/?ceouwxyz ceouwxyz https://ijkopstu.ua/?ijkopstu ijkopstu https://acegjlmo.net/?acegjlmo acegjlmo http://bflnpv.info/?bflnpv bflnpv http://fijkotz.ua/?fijkotz fijkotz http://ajkptwyz.com/?ajkptwyz ajkptwyz http://adjpvw.net/?adjpvw adjpvw https://dpqrsv.biz/?dpqrsv dpqrsv https://acehklps.ua/?acehklps acehklps http://aghsuvyz.biz/?aghsuvyz aghsuvyz https://akotu.ru/?akotu akotu https://dkqrt.biz/?dkqrt dkqrt http://dlrtuxz.info/?dlrtuxz dlrtuxz http://fstvx.ua/?fstvx fstvx http://ejmsv.ua/?ejmsv ejmsv http://abfmyz.com/?abfmyz abfmyz http://cdiqrsvx.info/?cdiqrsvx cdiqrsvx https://fqsuz.info/?fqsuz fqsuz https://cilsu.info/?cilsu cilsu https://bjklnsxz.net/?bjklnsxz bjklnsxz https://demnpst.biz/?demnpst demnpst https://bghikms.biz/?bghikms bghikms http://achjtu.biz/?achjtu achjtu http://npqrwz.net/?npqrwz npqrwz http://adehilpq.net/?adehilpq adehilpq http://bnortwy.com/?bnortwy bnortwy http://cegpsuz.ru/?cegpsuz cegpsuz http://aksyz.ua/?aksyz aksyz http://egkmosux.biz/?egkmosux egkmosux https://dimoruy.com/?dimoruy dimoruy https://ehopyz.net/?ehopyz ehopyz https://ehijptuy.com/?ehijptuy ehijptuy http://jklmprtv.ua/?jklmprtv jklmprtv http://abmnrw.ru/?abmnrw abmnrw http://abcfqu.ru/?abcfqu abcfqu https://degijyz.ru/?degijyz degijyz https://cikortuy.net/?cikortuy cikortuy http://abgnrw.ua/?abgnrw abgnrw http://bfgknpqt.com/?bfgknpqt bfgknpqt https://afmqtv.ru/?afmqtv afmqtv https://bertuvwz.info/?bertuvwz bertuvwz https://eipuvwxy.com/?eipuvwxy eipuvwxy http://ahjory.info/?ahjory ahjory http://acefvx.com/?acefvx acefvx http://acdlo.com/?acdlo acdlo http://cdjqs.biz/?cdjqs cdjqs http://bdexy.net/?bdexy bdexy http://bktxz.info/?bktxz bktxz http://adegmovy.biz/?adegmovy adegmovy http://fkxyz.ua/?fkxyz fkxyz http://bdikny.biz/?bdikny bdikny https://cistv.biz/?cistv cistv https://behknqtz.biz/?behknqtz behknqtz http://efhjquwx.biz/?efhjquwx efhjquwx https://djmnosz.com/?djmnosz djmnosz http://adino.com/?adino adino http://djlmouv.ua/?djlmouv djlmouv http://dehrvy.biz/?dehrvy dehrvy http://cefijlu.com/?cefijlu cefijlu http://bfgitz.ua/?bfgitz bfgitz http://ilmqsz.ua/?ilmqsz ilmqsz http://aejnrwy.ru/?aejnrwy aejnrwy https://lnuwz.ru/?lnuwz lnuwz http://jlnrv.com/?jlnrv jlnrv https://bjstxz.ru/?bjstxz bjstxz https://fhjnw.com/?fhjnw fhjnw https://ajmnuz.com/?ajmnuz ajmnuz https://ghptxy.net/?ghptxy ghptxy https://aceirsxy.ua/?aceirsxy aceirsxy http://bcemosuw.info/?bcemosuw bcemosuw https://cgklpxy.ua/?cgklpxy cgklpxy https://deghijrs.net/?deghijrs deghijrs http://cgjknvz.ru/?cgjknvz cgjknvz https://bcikmswz.net/?bcikmswz bcikmswz https://cginosuy.com/?cginosuy cginosuy http://bestz.ua/?bestz bestz https://dkmpry.net/?dkmpry dkmpry https://bfmpyz.ru/?bfmpyz bfmpyz https://dfglmp.ru/?dfglmp dfglmp https://cfgmnpru.ua/?cfgmnpru cfgmnpru https://cefhlns.ru/?cefhlns cefhlns http://bcelnosz.ua/?bcelnosz bcelnosz https://cehnsy.net/?cehnsy cehnsy https://ipuxyz.info/?ipuxyz ipuxyz https://dfghintv.net/?dfghintv dfghintv https://abdhkuxy.com/?abdhkuxy abdhkuxy 
https://abrsu.ua/?abrsu abrsu http://fimrstw.ru/?fimrstw fimrstw https://bhkmpvz.ua/?bhkmpvz bhkmpvz http://cefkqrx.info/?cefkqrx cefkqrx http://bdjkstvz.biz/?bdjkstvz bdjkstvz https://abegks.ua/?abegks abegks https://cfhostz.ru/?cfhostz cfhostz http://jnqsv.ua/?jnqsv jnqsv http://abcfjmps.net/?abcfjmps abcfjmps https://jlnptu.com/?jlnptu jlnptu https://bdghklns.ua/?bdghklns bdghklns https://bdikpu.ua/?bdikpu bdikpu http://hklpqs.ru/?hklpqs hklpqs https://bgjmpsy.ua/?bgjmpsy bgjmpsy http://cemnsvz.net/?cemnsvz cemnsvz http://cfhknu.info/?cfhknu cfhknu http://bcjmqsuz.ru/?bcjmqsuz bcjmqsuz http://fghjloqu.com/?fghjloqu fghjloqu https://lmryz.net/?lmryz lmryz http://imvwyz.net/?imvwyz imvwyz http://dhklntw.com/?dhklntw dhklntw http://adijnouy.info/?adijnouy adijnouy https://cdervx.info/?cdervx cdervx https://ajkqrtuw.ru/?ajkqrtuw ajkqrtuw http://diknqtu.biz/?diknqtu diknqtu http://hjnrux.ru/?hjnrux hjnrux https://ilmouxy.info/?ilmouxy ilmouxy http://adfijm.ua/?adfijm adfijm http://efgptx.ru/?efgptx efgptx https://dmprtvwy.com/?dmprtvwy dmprtvwy http://bgksuv.ua/?bgksuv bgksuv https://cfhsv.com/?cfhsv cfhsv http://adgoqr.biz/?adgoqr adgoqr http://efilmx.net/?efilmx efilmx http://eglpqrx.ua/?eglpqrx eglpqrx http://acequxz.com/?acequxz acequxz http://lmsvz.ua/?lmsvz lmsvz http://aefhprtw.com/?aefhprtw aefhprtw https://bglnq.info/?bglnq bglnq http://befhirv.ua/?befhirv befhirv https://achkryz.com/?achkryz achkryz https://bmpsvy.info/?bmpsvy bmpsvy http://ejloqy.net/?ejloqy ejloqy https://biopqz.net/?biopqz biopqz https://bfhklotx.biz/?bfhklotx bfhklotx https://dgjkqwy.ua/?dgjkqwy dgjkqwy https://afnpqrxz.net/?afnpqrxz afnpqrxz https://hjnpqrxz.com/?hjnpqrxz hjnpqrxz https://cimpuyz.net/?cimpuyz cimpuyz http://cruwz.biz/?cruwz cruwz https://bknovz.net/?bknovz bknovz http://dhilrtv.ua/?dhilrtv dhilrtv http://chjpqvyz.info/?chjpqvyz chjpqvyz https://achijmr.com/?achijmr achijmr https://chknosty.ru/?chknosty chknosty https://acgqv.com/?acgqv acgqv http://defklosy.info/?defklosy defklosy http://kmqrw.com/?kmqrw kmqrw https://npqtvz.ru/?npqtvz npqtvz http://cefkrux.ua/?cefkrux cefkrux https://aefhivx.ua/?aefhivx aefhivx https://dijlrtuy.ua/?dijlrtuy dijlrtuy https://aoqtw.com/?aoqtw aoqtw https://cehisuwz.net/?cehisuwz cehisuwz http://bfghijmr.biz/?bfghijmr bfghijmr http://beimquxy.ua/?beimquxy beimquxy http://dmsuw.info/?dmsuw dmsuw http://imnty.net/?imnty imnty https://detwz.net/?detwz detwz http://glnvx.net/?glnvx glnvx https://himnopw.ru/?himnopw himnopw https://aklnw.com/?aklnw aklnw http://bjnrwx.net/?bjnrwx bjnrwx http://defjlqvy.com/?defjlqvy defjlqvy http://dfimopvz.ua/?dfimopvz dfimopvz http://hmoquv.ru/?hmoquv hmoquv http://afkmq.net/?afkmq afkmq https://blnrwyz.info/?blnrwyz blnrwyz http://fklnrtvx.info/?fklnrtvx fklnrtvx https://fghnruz.biz/?fghnruz fghnruz https://bijqsu.ru/?bijqsu bijqsu http://ghoruxyz.ru/?ghoruxyz ghoruxyz http://acejknry.ru/?acejknry acejknry http://cehquvxy.net/?cehquvxy cehquvxy https://bfgkmoqs.com/?bfgkmoqs bfgkmoqs https://bchjqx.net/?bchjqx bchjqx http://bmoqux.biz/?bmoqux bmoqux https://aijklm.com/?aijklm aijklm https://clmptx.info/?clmptx clmptx https://dghmpqtz.net/?dghmpqtz dghmpqtz https://bfkrt.net/?bfkrt bfkrt http://egkqrtuw.biz/?egkqrtuw egkqrtuw http://hjlstvx.com/?hjlstvx hjlstvx http://jkoprv.com/?jkoprv jkoprv https://chnpstv.com/?chnpstv chnpstv https://eknptuw.ru/?eknptuw eknptuw https://jkmrst.net/?jkmrst jkmrst http://bfmnqrvz.net/?bfmnqrvz bfmnqrvz http://bcdhnrwy.ua/?bcdhnrwy bcdhnrwy https://bcdfgkty.ru/?bcdfgkty bcdfgkty 
http://cmpqv.info/?cmpqv cmpqv https://dhilsz.ua/?dhilsz dhilsz https://acekpsy.ru/?acekpsy acekpsy http://gmnpswx.biz/?gmnpswx gmnpswx https://cdinpr.ua/?cdinpr cdinpr http://cdhjnorz.info/?cdhjnorz cdhjnorz https://behjk.com/?behjk behjk https://lmpstv.com/?lmpstv lmpstv https://ilmory.ua/?ilmory ilmory https://abiortz.ua/?abiortz abiortz http://bceotz.biz/?bceotz bceotz https://ghjlqt.info/?ghjlqt ghjlqt https://hmosyz.ua/?hmosyz hmosyz https://behmoqru.net/?behmoqru behmoqru https://chinpx.info/?chinpx chinpx https://fkntuvyz.info/?fkntuvyz fkntuvyz http://cfghimtw.ua/?cfghimtw cfghimtw https://fqstwz.com/?fqstwz fqstwz http://gimvw.biz/?gimvw gimvw https://degixy.ua/?degixy degixy http://fnvwy.com/?fnvwy fnvwy https://fikmstyz.ru/?fikmstyz fikmstyz http://chmopuwy.info/?chmopuwy chmopuwy http://cdfovy.ua/?cdfovy cdfovy http://acdfhjkr.ua/?acdfhjkr acdfhjkr https://bcfktw.biz/?bcfktw bcfktw https://fkmoqux.biz/?fkmoqux fkmoqux http://adghjox.ru/?adghjox adghjox https://dlnortv.info/?dlnortv dlnortv http://ijlpsuw.ru/?ijlpsuw ijlpsuw https://cgijqruw.info/?cgijqruw cgijqruw https://eisxy.ru/?eisxy eisxy http://dhjkmqvw.net/?dhjkmqvw dhjkmqvw https://bfgklnu.info/?bfgklnu bfgklnu http://bcfil.info/?bcfil bcfil https://degitv.com/?degitv degitv https://ghjlmr.net/?ghjlmr ghjlmr https://ceilv.ru/?ceilv ceilv http://gjknrtw.ua/?gjknrtw gjknrtw http://fotvwyz.net/?fotvwyz fotvwyz http://dgklquv.ua/?dgklquv dgklquv http://cghijnv.info/?cghijnv cghijnv https://aelmoy.com/?aelmoy aelmoy http://begilmnx.com/?begilmnx begilmnx http://bcegrtxz.ua/?bcegrtxz bcegrtxz https://bhkqs.com/?bhkqs bhkqs https://cefgpstu.ua/?cefgpstu cefgpstu http://gmntv.com/?gmntv gmntv http://afjnstuy.ua/?afjnstuy afjnstuy https://aegqwy.biz/?aegqwy aegqwy http://ioswz.biz/?ioswz ioswz https://gorvw.info/?gorvw gorvw http://cgnouwy.ua/?cgnouwy cgnouwy http://bchkpvy.info/?bchkpvy bchkpvy http://egijl.ua/?egijl egijl https://gorstuxz.net/?gorstuxz gorstuxz https://cdhjnopu.biz/?cdhjnopu cdhjnopu http://bceijoy.ru/?bceijoy bceijoy https://abfgjqw.ua/?abfgjqw abfgjqw http://bdeltw.net/?bdeltw bdeltw http://jktwz.net/?jktwz jktwz https://bgikloqy.info/?bgikloqy bgikloqy https://eimqrsuw.biz/?eimqrsuw eimqrsuw http://cfgkpu.net/?cfgkpu cfgkpu http://efmqrtv.com/?efmqrtv efmqrtv http://cegikmru.info/?cegikmru cegikmru http://hiprsv.ua/?hiprsv hiprsv https://efiorw.com/?efiorw efiorw https://ehkpy.ru/?ehkpy ehkpy http://acdjps.biz/?acdjps acdjps http://lmnopvxy.net/?lmnopvxy lmnopvxy http://abeptuy.net/?abeptuy abeptuy http://fimnoq.ua/?fimnoq fimnoq http://adefky.com/?adefky adefky https://cdfijlrz.ru/?cdfijlrz cdfijlrz https://dimnv.ru/?dimnv dimnv http://bcehz.com/?bcehz bcehz http://ahkpruvw.ru/?ahkpruvw ahkpruvw https://afnrw.com/?afnrw afnrw https://bclnpqtv.biz/?bclnpqtv bclnpqtv https://hmptuy.ua/?hmptuy hmptuy https://adfkx.com/?adfkx adfkx https://cdfgikp.com/?cdfgikp cdfgikp https://hjovwxz.net/?hjovwxz hjovwxz http://enoprsxz.ru/?enoprsxz enoprsxz https://dmryz.info/?dmryz dmryz http://ijopuwx.info/?ijopuwx ijopuwx https://dghoryz.ua/?dghoryz dghoryz https://gnpqsvy.biz/?gnpqsvy gnpqsvy https://jlmtwxy.ua/?jlmtwxy jlmtwxy http://clmpuvz.com/?clmpuvz clmpuvz https://gprtvx.ua/?gprtvx gprtvx http://bklptuxz.net/?bklptuxz bklptuxz https://bfknrvz.com/?bfknrvz bfknrvz http://dgkqrv.ua/?dgkqrv dgkqrv https://adkprstv.net/?adkprstv adkprstv https://afiqx.com/?afiqx afiqx https://bchiknpt.com/?bchiknpt bchiknpt http://hjklprx.biz/?hjklprx hjklprx http://bejoruw.net/?bejoruw bejoruw http://cegmnvz.biz/?cegmnvz cegmnvz 
http://gmnotyz.ua/?gmnotyz gmnotyz https://dfhikntw.net/?dfhikntw dfhikntw https://dfouv.biz/?dfouv dfouv https://cjlnotz.info/?cjlnotz cjlnotz https://denqtvwz.ua/?denqtvwz denqtvwz http://behlrwy.com/?behlrwy behlrwy http://drsuvxz.biz/?drsuvxz drsuvxz https://esuwyz.biz/?esuwyz esuwyz http://fhjntu.biz/?fhjntu fhjntu https://gqrswyz.biz/?gqrswyz gqrswyz https://mpswxz.biz/?mpswxz mpswxz http://dhjpsuv.net/?dhjpsuv dhjpsuv http://dfopruy.ru/?dfopruy dfopruy https://ehkloqsz.info/?ehkloqsz ehkloqsz https://adgjqru.biz/?adgjqru adgjqru https://ejmtvwz.biz/?ejmtvwz ejmtvwz http://fgoquw.net/?fgoquw fgoquw https://fhikqry.com/?fhikqry fhikqry https://cemqy.com/?cemqy cemqy http://otuxz.com/?otuxz otuxz http://fijlmnoq.info/?fijlmnoq fijlmnoq http://cgiouw.ru/?cgiouw cgiouw https://cdgijlnw.biz/?cdgijlnw cdgijlnw https://gmuvw.biz/?gmuvw gmuvw https://abdefgiu.info/?abdefgiu abdefgiu https://gkmqwy.info/?gkmqwy gkmqwy http://bfgkqstw.com/?bfgkqstw bfgkqstw https://bfgmnovx.net/?bfgmnovx bfgmnovx https://bkmnrsy.com/?bkmnrsy bkmnrsy http://chiuwx.ua/?chiuwx chiuwx http://chlvw.ua/?chlvw chlvw https://acfgilp.ru/?acfgilp acfgilp http://bghosv.com/?bghosv bghosv https://befgkpst.net/?befgkpst befgkpst http://jklmnwz.net/?jklmnwz jklmnwz https://dktxz.com/?dktxz dktxz http://aeknuwyz.ru/?aeknuwyz aeknuwyz https://aqtuv.ua/?aqtuv aqtuv http://bfjorw.info/?bfjorw bfjorw http://cfijuz.biz/?cfijuz cfijuz https://efjlnuv.net/?efjlnuv efjlnuv https://hkruvy.net/?hkruvy hkruvy http://defnswxy.biz/?defnswxy defnswxy https://fgjmqxy.biz/?fgjmqxy fgjmqxy https://deflmtu.ru/?deflmtu deflmtu https://cgmnr.info/?cgmnr cgmnr https://dgilnsvy.ua/?dgilnsvy dgilnsvy https://chjmosxy.biz/?chjmosxy chjmosxy https://abcdhln.net/?abcdhln abcdhln https://elstuyz.biz/?elstuyz elstuyz https://bjoptw.com/?bjoptw bjoptw https://bcdgquv.ru/?bcdgquv bcdgquv http://afiklst.com/?afiklst afiklst https://agnvwz.info/?agnvwz agnvwz http://bcfgkrvx.biz/?bcfgkrvx bcfgkrvx https://bdjmnuz.biz/?bdjmnuz bdjmnuz https://emqrwz.com/?emqrwz emqrwz https://egklqt.biz/?egklqt egklqt http://bimqr.com/?bimqr bimqr https://dhklnsv.net/?dhklnsv dhklnsv https://dvwxz.com/?dvwxz dvwxz https://dfghimu.com/?dfghimu dfghimu https://fostx.ru/?fostx fostx https://fjklnqux.ua/?fjklnqux fjklnqux https://cfhijnqu.info/?cfhijnqu cfhijnqu http://nptvy.ua/?nptvy nptvy http://bclnu.com/?bclnu bclnu http://eflntvyz.ru/?eflntvyz eflntvyz https://abcdpsx.ru/?abcdpsx abcdpsx https://cfipqx.info/?cfipqx cfipqx http://adforst.biz/?adforst adforst https://bfgmpqtv.biz/?bfgmpqtv bfgmpqtv http://cfkmnorv.ua/?cfkmnorv cfkmnorv http://dhjkuv.info/?dhjkuv dhjkuv https://bcejlsyz.info/?bcejlsyz bcejlsyz https://efinrsy.ua/?efinrsy efinrsy https://kmosw.net/?kmosw kmosw http://jmoxy.net/?jmoxy jmoxy http://agjloqy.biz/?agjloqy agjloqy https://dglmnpqy.com/?dglmnpqy dglmnpqy https://dijnqrsw.ua/?dijnqrsw dijnqrsw http://ijmnr.biz/?ijmnr ijmnr https://gjouwxy.ru/?gjouwxy gjouwxy https://klpuz.biz/?klpuz klpuz http://hijoux.net/?hijoux hijoux http://bikmn.net/?bikmn bikmn http://abcovwy.ua/?abcovwy abcovwy https://adfgioqx.ua/?adfgioqx adfgioqx http://beijknpq.info/?beijknpq beijknpq https://equvx.com/?equvx equvx https://cgmpr.net/?cgmpr cgmpr http://dfknor.com/?dfknor dfknor https://gopsvyz.ru/?gopsvyz gopsvyz http://chijkqsu.net/?chijkqsu chijkqsu https://bfmtx.biz/?bfmtx bfmtx http://cgijqrux.ua/?cgijqrux cgijqrux https://fklnovy.biz/?fklnovy fklnovy http://chkmosux.net/?chkmosux chkmosux https://cenprvw.info/?cenprvw cenprvw https://cfuxyz.ua/?cfuxyz cfuxyz 
https://bdgopy.ua/?bdgopy bdgopy https://ampwx.info/?ampwx ampwx http://cefmoqtx.info/?cefmoqtx cefmoqtx https://bcdflny.ua/?bcdflny bcdflny https://abqtux.ua/?abqtux abqtux https://bpstw.com/?bpstw bpstw http://cegmoqry.info/?cegmoqry cegmoqry http://aeghjlw.net/?aeghjlw aeghjlw https://egjswz.ru/?egjswz egjswz https://agiluv.info/?agiluv agiluv https://befimsu.biz/?befimsu befimsu https://abcdeiry.info/?abcdeiry abcdeiry http://bcqrxz.info/?bcqrxz bcqrxz https://eilntvxz.net/?eilntvxz eilntvxz https://abcjpt.ua/?abcjpt abcjpt https://adglox.ru/?adglox adglox https://adjnptw.ru/?adjnptw adjnptw http://aglrxz.biz/?aglrxz aglrxz https://abeqru.info/?abeqru abeqru https://cnoprsw.biz/?cnoprsw cnoprsw http://akoqx.com/?akoqx akoqx https://ijlmnptz.net/?ijlmnptz ijlmnptz https://ghimnuyz.com/?ghimnuyz ghimnuyz https://ghprs.ua/?ghprs ghprs https://lmntwxz.biz/?lmntwxz lmntwxz https://blopqxz.ua/?blopqxz blopqxz https://cqrxz.ua/?cqrxz cqrxz http://befmvwz.net/?befmvwz befmvwz https://abjlmnpr.com/?abjlmnpr abjlmnpr https://filotz.net/?filotz filotz http://hinsw.biz/?hinsw hinsw http://abfjklsy.net/?abfjklsy abfjklsy https://bhimnrwx.biz/?bhimnrwx bhimnrwx http://dejnotz.ru/?dejnotz dejnotz https://bcflsxz.info/?bcflsxz bcflsxz http://hklmsux.net/?hklmsux hklmsux https://ahknouz.ru/?ahknouz ahknouz https://cfknxy.ru/?cfknxy cfknxy http://acfntvx.net/?acfntvx acfntvx https://cdfksuvy.info/?cdfksuvy cdfksuvy https://abjkorx.ru/?abjkorx abjkorx https://afiklqvw.com/?afiklqvw afiklqvw http://adnovx.biz/?adnovx adnovx https://fglstyz.info/?fglstyz fglstyz https://ghikruy.biz/?ghikruy ghikruy http://ghnopsvz.biz/?ghnopsvz ghnopsvz http://abcgjnrw.com/?abcgjnrw abcgjnrw https://bijlnyz.ua/?bijlnyz bijlnyz http://ahiqrv.ru/?ahiqrv ahiqrv https://ajotw.biz/?ajotw ajotw http://bfhipuw.info/?bfhipuw bfhipuw http://acjmqry.ua/?acjmqry acjmqry http://efjnoqrz.info/?efjnoqrz efjnoqrz https://achkqst.info/?achkqst achkqst http://bcfpruyz.ua/?bcfpruyz bcfpruyz http://bcelv.com/?bcelv bcelv https://dfkoqsu.ua/?dfkoqsu dfkoqsu http://ekopq.ru/?ekopq ekopq https://abfkrsz.ua/?abfkrsz abfkrsz https://abenxy.net/?abenxy abenxy http://agiotu.biz/?agiotu agiotu https://bkpqr.info/?bkpqr bkpqr http://ahijq.ua/?ahijq ahijq http://eimps.ru/?eimps eimps https://abdstwz.net/?abdstwz abdstwz https://cdhjlu.biz/?cdhjlu cdhjlu http://gmrstyz.info/?gmrstyz gmrstyz https://djsvwyz.ua/?djsvwyz djsvwyz http://begjmorv.net/?begjmorv begjmorv http://ahlmoptz.biz/?ahlmoptz ahlmoptz http://dekoqx.ru/?dekoqx dekoqx http://fiprs.net/?fiprs fiprs http://cjklmr.biz/?cjklmr cjklmr https://aefgpquv.biz/?aefgpquv aefgpquv https://jlmopswx.net/?jlmopswx jlmopswx https://bfhjlsw.biz/?bfhjlsw bfhjlsw https://iknortux.com/?iknortux iknortux http://abhkoprv.biz/?abhkoprv abhkoprv https://cjnrwyz.ua/?cjnrwyz cjnrwyz http://djmnrvwz.info/?djmnrvwz djmnrvwz http://adgkrtyz.info/?adgkrtyz adgkrtyz https://bdhjsuv.ua/?bdhjsuv bdhjsuv http://acfgjor.biz/?acfgjor acfgjor http://cfilrtw.info/?cfilrtw cfilrtw https://fhijz.com/?fhijz fhijz http://hkmntxy.ru/?hkmntxy hkmntxy http://fhilsuy.ua/?fhilsuy fhilsuy http://jmnpwx.ru/?jmnpwx jmnpwx http://gikltvwy.ru/?gikltvwy gikltvwy http://jopqy.biz/?jopqy jopqy http://bdlqry.info/?bdlqry bdlqry https://ejlvwz.com/?ejlvwz ejlvwz http://ceijvz.net/?ceijvz ceijvz http://cfijntyz.info/?cfijntyz cfijntyz http://efilotuv.com/?efilotuv efilotuv http://cfmpyz.com/?cfmpyz cfmpyz http://dhpqz.com/?dhpqz dhpqz http://dlpxz.com/?dlpxz dlpxz http://aefjlyz.net/?aefjlyz aefjlyz https://ceijorz.info/?ceijorz ceijorz 
http://dkmnuvx.info/?dkmnuvx dkmnuvx https://kprswx.net/?kprswx kprswx https://dgjpwz.ru/?dgjpwz dgjpwz https://abfjlnt.ru/?abfjlnt abfjlnt http://efgips.com/?efgips efgips http://cehnpu.ru/?cehnpu cehnpu https://fstuw.net/?fstuw fstuw http://dmorz.net/?dmorz dmorz https://celmrwz.info/?celmrwz celmrwz https://jprvx.com/?jprvx jprvx https://cefjkxz.biz/?cefjkxz cefjkxz http://behjlv.biz/?behjlv behjlv http://bdiku.net/?bdiku bdiku https://ghnpxz.ru/?ghnpxz ghnpxz https://bcpqvwz.ua/?bcpqvwz bcpqvwz https://gijlnqrs.biz/?gijlnqrs gijlnqrs http://adfjlrsz.info/?adfjlrsz adfjlrsz https://afiptvz.biz/?afiptvz afiptvz http://acikors.com/?acikors acikors https://dfmpq.ru/?dfmpq dfmpq http://abhjmsyz.ru/?abhjmsyz abhjmsyz https://dejkq.com/?dejkq dejkq http://abcltx.com/?abcltx abcltx https://bdfjopu.net/?bdfjopu bdfjopu http://bipvy.com/?bipvy bipvy http://dlqsx.biz/?dlqsx dlqsx http://inqrstv.info/?inqrstv inqrstv https://dknwz.ru/?dknwz dknwz http://cdgimqsu.ru/?cdgimqsu cdgimqsu https://akmotuv.info/?akmotuv akmotuv http://kmntvyz.ru/?kmntvyz kmntvyz https://bimxz.net/?bimxz bimxz http://afjoqyz.ru/?afjoqyz afjoqyz https://eikuwy.ua/?eikuwy eikuwy https://bmptw.biz/?bmptw bmptw http://fgltwx.com/?fgltwx fgltwx http://chlnpqrs.com/?chlnpqrs chlnpqrs https://abhux.net/?abhux abhux http://adfhmw.info/?adfhmw adfhmw http://bgilt.com/?bgilt bgilt https://cefotu.biz/?cefotu cefotu https://adfruvwy.biz/?adfruvwy adfruvwy https://fgoqstu.info/?fgoqstu fgoqstu https://cdilnqvy.info/?cdilnqvy cdilnqvy https://adghruw.ua/?adghruw adghruw https://ehlnrsuy.biz/?ehlnrsuy ehlnrsuy http://abfijlnr.info/?abfijlnr abfijlnr https://jkpty.biz/?jkpty jkpty https://cdlpv.biz/?cdlpv cdlpv https://ginoxy.ua/?ginoxy ginoxy https://ciknpquy.net/?ciknpquy ciknpquy https://egijstw.biz/?egijstw egijstw http://cdikopvz.info/?cdikopvz cdikopvz https://egjklnvy.info/?egjklnvy egjklnvy https://fjkmvw.info/?fjkmvw fjkmvw http://dfkmv.com/?dfkmv dfkmv https://efgloptu.ru/?efgloptu efgloptu https://fgotz.ua/?fgotz fgotz http://efhklv.info/?efhklv efhklv https://befmrwy.info/?befmrwy befmrwy https://bklnsvy.biz/?bklnsvy bklnsvy http://flmoz.ru/?flmoz flmoz http://aiopqrtu.biz/?aiopqrtu aiopqrtu https://fjrsty.info/?fjrsty fjrsty http://behimuwy.ru/?behimuwy behimuwy https://iknqvz.com/?iknqvz iknqvz http://cdemp.ru/?cdemp cdemp http://eglpqsvz.net/?eglpqsvz eglpqsvz https://dluvz.net/?dluvz dluvz http://beghsuy.net/?beghsuy beghsuy http://ekmqsvz.net/?ekmqsvz ekmqsvz http://ehinz.net/?ehinz ehinz http://gipvz.com/?gipvz gipvz http://efhlwxyz.biz/?efhlwxyz efhlwxyz http://blors.ru/?blors blors http://dnqry.info/?dnqry dnqry https://cghnrux.biz/?cghnrux cghnrux http://dfinqtwx.ru/?dfinqtwx dfinqtwx http://efotxz.info/?efotxz efotxz http://aceglpqu.net/?aceglpqu aceglpqu https://abilr.ua/?abilr abilr http://bcgqw.ru/?bcgqw bcgqw http://cdinqv.biz/?cdinqv cdinqv https://cdfkp.com/?cdfkp cdfkp https://aelouvz.ua/?aelouvz aelouvz http://dklstuz.net/?dklstuz dklstuz https://cejqw.ru/?cejqw cejqw https://bdgkuy.com/?bdgkuy bdgkuy https://hkoprvxy.ru/?hkoprvxy hkoprvxy https://filntx.net/?filntx filntx http://cfgsty.ua/?cfgsty cfgsty https://bfgptx.net/?bfgptx bfgptx http://belpsu.biz/?belpsu belpsu http://deilmoy.biz/?deilmoy deilmoy https://kqruv.info/?kqruv kqruv https://acdst.net/?acdst acdst http://glmvy.ru/?glmvy glmvy https://abcmnu.net/?abcmnu abcmnu http://adehinxy.ru/?adehinxy adehinxy http://ahmnwy.biz/?ahmnwy ahmnwy http://morst.ru/?morst morst http://bfils.ua/?bfils bfils http://abcmnvw.biz/?abcmnvw abcmnvw 
https://bchlqsux.net/?bchlqsux bchlqsux http://fhikpqvy.net/?fhikpqvy fhikpqvy https://bejkqu.info/?bejkqu bejkqu https://cdehnsxy.ru/?cdehnsxy cdehnsxy http://bcdhnopx.info/?bcdhnopx bcdhnopx https://elpuz.ua/?elpuz elpuz https://bdeftvw.info/?bdeftvw bdeftvw http://dgmnou.ru/?dgmnou dgmnou https://aiklnvx.com/?aiklnvx aiklnvx http://ijoqrsvw.com/?ijoqrsvw ijoqrsvw http://cdehmquy.biz/?cdehmquy cdehmquy https://gjoqx.net/?gjoqx gjoqx https://bhlnpqvw.net/?bhlnpqvw bhlnpqvw http://bdfiuwyz.net/?bdfiuwyz bdfiuwyz https://abesty.info/?abesty abesty https://gjlmtuv.net/?gjlmtuv gjlmtuv http://ivxyz.info/?ivxyz ivxyz http://aeops.info/?aeops aeops http://acdlv.ru/?acdlv acdlv http://fgimqw.net/?fgimqw fgimqw https://fgkrxz.info/?fgkrxz fgkrxz http://abfhnvxy.ua/?abfhnvxy abfhnvxy https://bcdeipsy.ru/?bcdeipsy bcdeipsy http://bdimvwy.ru/?bdimvwy bdimvwy http://fnpqry.info/?fnpqry fnpqry https://aklstwx.ru/?aklstwx aklstwx http://bjloq.com/?bjloq bjloq http://dkmsuv.info/?dkmsuv dkmsuv http://afinptuw.ru/?afinptuw afinptuw https://bijnqvx.biz/?bijnqvx bijnqvx https://behlxz.info/?behlxz behlxz http://bgjkquyz.biz/?bgjkquyz bgjkquyz http://aehnpqw.net/?aehnpqw aehnpqw http://cghjmpqu.com/?cghjmpqu cghjmpqu http://aboruy.com/?aboruy aboruy http://fjryz.ru/?fjryz fjryz https://cdnqy.info/?cdnqy cdnqy http://bfgjmost.net/?bfgjmost bfgjmost https://bhnou.ua/?bhnou bhnou https://dhilm.com/?dhilm dhilm http://abijmrtx.net/?abijmrtx abijmrtx http://acefnv.com/?acefnv acefnv http://gilmtu.com/?gilmtu gilmtu https://egjltuz.biz/?egjltuz egjltuz http://afghpsv.net/?afghpsv afghpsv http://chkltvx.ua/?chkltvx chkltvx http://befkyz.com/?befkyz befkyz https://bcfhmprw.com/?bcfhmprw bcfhmprw https://bjklpqs.info/?bjklpqs bjklpqs http://cegpt.ua/?cegpt cegpt https://klntvw.biz/?klntvw klntvw https://ejkmq.ua/?ejkmq ejkmq https://bgqvx.info/?bgqvx bgqvx https://dqrsuvw.com/?dqrsuvw dqrsuvw http://bcekrsu.net/?bcekrsu bcekrsu https://dlmqwy.com/?dlmqwy dlmqwy http://abhnsvz.ru/?abhnsvz abhnsvz https://bdfhjoqr.info/?bdfhjoqr bdfhjoqr http://ceijlprw.net/?ceijlprw ceijlprw https://ijlqsw.net/?ijlqsw ijlqsw https://einrvxz.net/?einrvxz einrvxz https://ikmtxz.net/?ikmtxz ikmtxz http://adkmqsz.biz/?adkmqsz adkmqsz http://bchnox.com/?bchnox bchnox https://ckptvw.com/?ckptvw ckptvw http://dgops.com/?dgops dgops https://bfnrtxy.info/?bfnrtxy bfnrtxy http://bdnpstu.info/?bdnpstu bdnpstu http://dlnprsx.ua/?dlnprsx dlnprsx https://gjnrx.com/?gjnrx gjnrx http://abcsvz.com/?abcsvz abcsvz http://gknrsv.ru/?gknrsv gknrsv https://fkmouxz.com/?fkmouxz fkmouxz http://bcpsx.com/?bcpsx bcpsx http://bckoqx.info/?bckoqx bckoqx https://bcdepq.ru/?bcdepq bcdepq http://kqswy.net/?kqswy kqswy https://bdgjnuy.ru/?bdgjnuy bdgjnuy https://cgiklsvz.info/?cgiklsvz cgiklsvz https://bfgnprwy.net/?bfgnprwy bfgnprwy https://bejmnqrw.com/?bejmnqrw bejmnqrw http://gilopqwy.ua/?gilopqwy gilopqwy https://fjopry.ru/?fjopry fjopry http://kmrvwy.biz/?kmrvwy kmrvwy https://eglotuv.net/?eglotuv eglotuv http://cdhnsvwx.com/?cdhnsvwx cdhnsvwx http://ahmqv.com/?ahmqv ahmqv https://fkqxy.ua/?fkqxy fkqxy https://gjoquvwx.net/?gjoquvwx gjoquvwx https://abcfouw.biz/?abcfouw abcfouw https://abcekn.ru/?abcekn abcekn http://ijnruvz.com/?ijnruvz ijnruvz http://mqruv.ua/?mqruv mqruv https://afklmu.biz/?afklmu afklmu https://abcgptyz.biz/?abcgptyz abcgptyz https://bfjsuz.ua/?bfjsuz bfjsuz http://ahmpsx.ua/?ahmpsx ahmpsx https://iklns.com/?iklns iklns https://hlnvx.ua/?hlnvx hlnvx http://adiklmp.com/?adiklmp adiklmp https://fghprs.biz/?fghprs fghprs 
http://acmopvw.ru/?acmopvw acmopvw http://cdmuw.ua/?cdmuw cdmuw http://bfkpwxz.biz/?bfkpwxz bfkpwxz http://lqrtwy.ru/?lqrtwy lqrtwy https://aclmruvw.biz/?aclmruvw aclmruvw http://bcfhmsz.net/?bcfhmsz bcfhmsz http://bcgknpry.biz/?bcgknpry bcgknpry https://bflmnoq.info/?bflmnoq bflmnoq https://bepsy.info/?bepsy bepsy http://gknosuw.net/?gknosuw gknosuw http://ajoruvz.com/?ajoruvz ajoruvz http://cotxz.ru/?cotxz cotxz https://deintwx.net/?deintwx deintwx https://eghijkuy.com/?eghijkuy eghijkuy http://dginwx.ua/?dginwx dginwx http://aenoxy.net/?aenoxy aenoxy https://dghlpuvx.ru/?dghlpuvx dghlpuvx http://cefor.info/?cefor cefor https://afioz.biz/?afioz afioz https://deklmuvw.com/?deklmuvw deklmuvw http://cfgimox.biz/?cfgimox cfgimox http://cinquvz.ua/?cinquvz cinquvz https://acilsy.ua/?acilsy acilsy http://bfhlnop.net/?bfhlnop bfhlnop https://fijluz.net/?fijluz fijluz https://dfjnostv.ua/?dfjnostv dfjnostv https://aginsty.net/?aginsty aginsty http://iknxyz.biz/?iknxyz iknxyz http://agopu.net/?agopu agopu http://bdkopst.ua/?bdkopst bdkopst https://aghlox.net/?aghlox aghlox https://cdgpu.com/?cdgpu cdgpu https://jkmpuwyz.net/?jkmpuwyz jkmpuwyz http://ijlmuvw.com/?ijlmuvw ijlmuvw https://gkmrtvwy.biz/?gkmrtvwy gkmrtvwy http://aefhty.info/?aefhty aefhty http://afjklsux.net/?afjklsux afjklsux http://mprvyz.biz/?mprvyz mprvyz https://bdjkpx.ua/?bdjkpx bdjkpx http://aciknovw.net/?aciknovw aciknovw https://aftuxy.ua/?aftuxy aftuxy https://ajnpvxyz.ua/?ajnpvxyz ajnpvxyz http://fijqrv.com/?fijqrv fijqrv https://ahmquvw.info/?ahmquvw ahmquvw https://ghklmqvy.com/?ghklmqvy ghklmqvy http://aehkty.com/?aehkty aehkty https://aclpsuz.ru/?aclpsuz aclpsuz https://ghoqstvy.com/?ghoqstvy ghoqstvy https://dghnqrty.ru/?dghnqrty dghnqrty https://agorx.com/?agorx agorx https://adejrv.ru/?adejrv adejrv http://hkopuvwz.net/?hkopuvwz hkopuvwz http://bcdeghvx.ru/?bcdeghvx bcdeghvx http://bdfgruvy.com/?bdfgruvy bdfgruvy https://cmuvy.biz/?cmuvy cmuvy https://abjkmp.ua/?abjkmp abjkmp http://abcfirvz.ru/?abcfirvz abcfirvz http://clqruwxy.com/?clqruwxy clqruwxy https://fhikoq.net/?fhikoq fhikoq https://bfgnsvwz.biz/?bfgnsvwz bfgnsvwz http://fhjkor.biz/?fhjkor fhjkor https://dfgkouyz.com/?dfgkouyz dfgkouyz https://acfhmqtz.info/?acfhmqtz acfhmqtz https://ejmnpt.biz/?ejmnpt ejmnpt http://dfgnrtu.ru/?dfgnrtu dfgnrtu https://diknswx.biz/?diknswx diknswx https://hjnsz.info/?hjnsz hjnsz https://cdjlq.ua/?cdjlq cdjlq https://iowxy.ru/?iowxy iowxy https://fhjltux.net/?fhjltux fhjltux http://ablmv.com/?ablmv ablmv http://bdiorswz.ru/?bdiorswz bdiorswz http://cnopr.ru/?cnopr cnopr http://adgopz.ru/?adgopz adgopz http://bdefimsu.ru/?bdefimsu bdefimsu http://ghmqr.net/?ghmqr ghmqr https://cdetv.ru/?cdetv cdetv https://bdkortu.biz/?bdkortu bdkortu http://djopv.biz/?djopv djopv http://aiknorw.biz/?aiknorw aiknorw https://bfmqr.net/?bfmqr bfmqr http://klmox.com/?klmox klmox https://fimqsv.biz/?fimqsv fimqsv https://fkopq.ru/?fkopq fkopq http://aglpx.com/?aglpx aglpx http://dfhquv.biz/?dfhquv dfhquv https://bfgjoq.com/?bfgjoq bfgjoq https://cklnz.ru/?cklnz cklnz http://cfijqrv.net/?cfijqrv cfijqrv https://aekuvxz.net/?aekuvxz aekuvxz https://ghijkstx.info/?ghijkstx ghijkstx http://abilmsz.info/?abilmsz abilmsz https://bfhsvyz.ru/?bfhsvyz bfhsvyz http://eqsuv.ua/?eqsuv eqsuv http://ghinqsx.biz/?ghinqsx ghinqsx https://acgkmnpu.com/?acgkmnpu acgkmnpu http://elrty.ru/?elrty elrty https://cdgkoz.net/?cdgkoz cdgkoz http://dimnosw.info/?dimnosw dimnosw http://abegnwx.biz/?abegnwx abegnwx http://abejnsx.net/?abejnsx abejnsx 
https://emnorxz.ru/?emnorxz emnorxz http://abfsz.ua/?abfsz abfsz https://imqrvy.net/?imqrvy imqrvy https://cdgjmqrv.com/?cdgjmqrv cdgjmqrv http://bdfostyz.net/?bdfostyz bdfostyz https://lnorxz.biz/?lnorxz lnorxz http://afgruvw.ru/?afgruvw afgruvw https://cefivyz.com/?cefivyz cefivyz https://gmqrt.com/?gmqrt gmqrt https://acdimqrw.info/?acdimqrw acdimqrw http://abchjm.ru/?abchjm abchjm http://alnqs.biz/?alnqs alnqs https://ijqwz.ua/?ijqwz ijqwz https://adfjstx.biz/?adfjstx adfjstx https://cghlqt.net/?cghlqt cghlqt http://ijrvyz.ua/?ijrvyz ijrvyz https://ahluv.net/?ahluv ahluv http://acdfgn.net/?acdfgn acdfgn https://hijnru.ua/?hijnru hijnru https://ghnuvy.ru/?ghnuvy ghnuvy http://aeiky.com/?aeiky aeiky https://bdilnov.info/?bdilnov bdilnov https://cjlnpqru.biz/?cjlnpqru cjlnpqru https://cgjqtv.biz/?cgjqtv cgjqtv https://abcnsvz.biz/?abcnsvz abcnsvz http://abegopsu.biz/?abegopsu abegopsu https://bcmstuvy.net/?bcmstuvy bcmstuvy https://bhjknsuz.net/?bhjknsuz bhjknsuz http://aceklno.info/?aceklno aceklno http://diotvyz.ru/?diotvyz diotvyz https://aefkmruw.info/?aefkmruw aefkmruw https://cdghlmnz.info/?cdghlmnz cdghlmnz https://bcgijmow.net/?bcgijmow bcgijmow http://moruwx.ua/?moruwx moruwx https://bhjnrt.ru/?bhjnrt bhjnrt https://eimoptu.ua/?eimoptu eimoptu https://achptv.info/?achptv achptv https://boptux.biz/?boptux boptux http://acmovz.ua/?acmovz acmovz https://abfmnpw.info/?abfmnpw abfmnpw https://afiknvz.biz/?afiknvz afiknvz http://ehnps.ua/?ehnps ehnps https://dnpuw.biz/?dnpuw dnpuw https://abflmns.biz/?abflmns abflmns https://fmuyz.biz/?fmuyz fmuyz http://bejkvz.info/?bejkvz bejkvz https://dfkmnyz.ua/?dfkmnyz dfkmnyz http://cfgnsvx.biz/?cfgnsvx cfgnsvx http://abdjsw.ru/?abdjsw abdjsw https://abelrvx.biz/?abelrvx abelrvx https://abjmoy.ru/?abjmoy abjmoy http://hmnqz.net/?hmnqz hmnqz https://dhkntv.ru/?dhkntv dhkntv http://bgqsyz.biz/?bgqsyz bgqsyz http://eghkmsux.net/?eghkmsux eghkmsux https://acehlrsv.info/?acehlrsv acehlrsv https://lnpruvy.com/?lnpruvy lnpruvy http://cefmtu.ua/?cefmtu cefmtu https://hjnqrt.net/?hjnqrt hjnqrt http://ceiktvw.ru/?ceiktvw ceiktvw http://cfjmu.info/?cfjmu cfjmu https://cfhmn.com/?cfhmn cfhmn http://cdfqr.net/?cdfqr cdfqr http://adelmvz.ru/?adelmvz adelmvz http://abflnsvx.com/?abflnsvx abflnsvx http://bjpqv.biz/?bjpqv bjpqv https://fjpqux.biz/?fjpqux fjpqux http://fpqtvyz.net/?fpqtvyz fpqtvyz http://cdflstwz.net/?cdflstwz cdflstwz https://egjostux.ua/?egjostux egjostux https://cjlnpy.ua/?cjlnpy cjlnpy https://bcfgmz.com/?bcfgmz bcfgmz http://dfnouz.ua/?dfnouz dfnouz https://aelmoy.com/?aelmoy aelmoy http://begilmnx.com/?begilmnx begilmnx http://bcegrtxz.ua/?bcegrtxz bcegrtxz https://bhkqs.com/?bhkqs bhkqs https://cefgpstu.ua/?cefgpstu cefgpstu http://gmntv.com/?gmntv gmntv http://afjnstuy.ua/?afjnstuy afjnstuy https://aegqwy.biz/?aegqwy aegqwy http://ioswz.biz/?ioswz ioswz https://gorvw.info/?gorvw gorvw http://cgnouwy.ua/?cgnouwy cgnouwy http://bchkpvy.info/?bchkpvy bchkpvy http://egijl.ua/?egijl egijl https://gorstuxz.net/?gorstuxz gorstuxz https://cdhjnopu.biz/?cdhjnopu cdhjnopu http://bceijoy.ru/?bceijoy bceijoy https://abfgjqw.ua/?abfgjqw abfgjqw http://bdeltw.net/?bdeltw bdeltw http://jktwz.net/?jktwz jktwz https://bgikloqy.info/?bgikloqy bgikloqy https://eimqrsuw.biz/?eimqrsuw eimqrsuw http://cfgkpu.net/?cfgkpu cfgkpu http://efmqrtv.com/?efmqrtv efmqrtv http://cegikmru.info/?cegikmru cegikmru http://hiprsv.ua/?hiprsv hiprsv https://efiorw.com/?efiorw efiorw https://ehkpy.ru/?ehkpy ehkpy http://acdjps.biz/?acdjps acdjps http://lmnopvxy.net/?lmnopvxy 
lmnopvxy http://abeptuy.net/?abeptuy abeptuy http://fimnoq.ua/?fimnoq fimnoq http://adefky.com/?adefky adefky https://cdfijlrz.ru/?cdfijlrz cdfijlrz https://dimnv.ru/?dimnv dimnv http://bcehz.com/?bcehz bcehz http://ahkpruvw.ru/?ahkpruvw ahkpruvw https://afnrw.com/?afnrw afnrw https://bclnpqtv.biz/?bclnpqtv bclnpqtv https://hmptuy.ua/?hmptuy hmptuy https://adfkx.com/?adfkx adfkx https://cdfgikp.com/?cdfgikp cdfgikp https://hjovwxz.net/?hjovwxz hjovwxz http://enoprsxz.ru/?enoprsxz enoprsxz https://dmryz.info/?dmryz dmryz http://ijopuwx.info/?ijopuwx ijopuwx https://dghoryz.ua/?dghoryz dghoryz https://gnpqsvy.biz/?gnpqsvy gnpqsvy https://jlmtwxy.ua/?jlmtwxy jlmtwxy http://clmpuvz.com/?clmpuvz clmpuvz https://gprtvx.ua/?gprtvx gprtvx http://bklptuxz.net/?bklptuxz bklptuxz https://bfknrvz.com/?bfknrvz bfknrvz http://dgkqrv.ua/?dgkqrv dgkqrv https://adkprstv.net/?adkprstv adkprstv https://afiqx.com/?afiqx afiqx https://bchiknpt.com/?bchiknpt bchiknpt http://hjklprx.biz/?hjklprx hjklprx http://bejoruw.net/?bejoruw bejoruw http://cegmnvz.biz/?cegmnvz cegmnvz http://gmnotyz.ua/?gmnotyz gmnotyz https://dfhikntw.net/?dfhikntw dfhikntw https://dfouv.biz/?dfouv dfouv https://cjlnotz.info/?cjlnotz cjlnotz https://denqtvwz.ua/?denqtvwz denqtvwz http://behlrwy.com/?behlrwy behlrwy http://drsuvxz.biz/?drsuvxz drsuvxz https://esuwyz.biz/?esuwyz esuwyz http://fhjntu.biz/?fhjntu fhjntu https://gqrswyz.biz/?gqrswyz gqrswyz https://mpswxz.biz/?mpswxz mpswxz http://dhjpsuv.net/?dhjpsuv dhjpsuv http://dfopruy.ru/?dfopruy dfopruy https://ehkloqsz.info/?ehkloqsz ehkloqsz https://adgjqru.biz/?adgjqru adgjqru https://ejmtvwz.biz/?ejmtvwz ejmtvwz http://fgoquw.net/?fgoquw fgoquw https://fhikqry.com/?fhikqry fhikqry https://cemqy.com/?cemqy cemqy http://otuxz.com/?otuxz otuxz http://fijlmnoq.info/?fijlmnoq fijlmnoq http://cgiouw.ru/?cgiouw cgiouw https://cdgijlnw.biz/?cdgijlnw cdgijlnw https://gmuvw.biz/?gmuvw gmuvw https://abdefgiu.info/?abdefgiu abdefgiu https://gkmqwy.info/?gkmqwy gkmqwy http://bfgkqstw.com/?bfgkqstw bfgkqstw https://bfgmnovx.net/?bfgmnovx bfgmnovx https://bkmnrsy.com/?bkmnrsy bkmnrsy http://chiuwx.ua/?chiuwx chiuwx http://chlvw.ua/?chlvw chlvw https://acfgilp.ru/?acfgilp acfgilp http://bghosv.com/?bghosv bghosv https://befgkpst.net/?befgkpst befgkpst http://jklmnwz.net/?jklmnwz jklmnwz https://dktxz.com/?dktxz dktxz http://aeknuwyz.ru/?aeknuwyz aeknuwyz https://aqtuv.ua/?aqtuv aqtuv http://bfjorw.info/?bfjorw bfjorw http://cfijuz.biz/?cfijuz cfijuz https://efjlnuv.net/?efjlnuv efjlnuv https://hkruvy.net/?hkruvy hkruvy http://defnswxy.biz/?defnswxy defnswxy https://fgjmqxy.biz/?fgjmqxy fgjmqxy https://deflmtu.ru/?deflmtu deflmtu https://cgmnr.info/?cgmnr cgmnr https://dgilnsvy.ua/?dgilnsvy dgilnsvy https://chjmosxy.biz/?chjmosxy chjmosxy https://abcdhln.net/?abcdhln abcdhln https://elstuyz.biz/?elstuyz elstuyz https://bjoptw.com/?bjoptw bjoptw https://bcdgquv.ru/?bcdgquv bcdgquv http://afiklst.com/?afiklst afiklst https://agnvwz.info/?agnvwz agnvwz http://bcfgkrvx.biz/?bcfgkrvx bcfgkrvx https://bdjmnuz.biz/?bdjmnuz bdjmnuz https://emqrwz.com/?emqrwz emqrwz https://egklqt.biz/?egklqt egklqt http://bimqr.com/?bimqr bimqr https://dhklnsv.net/?dhklnsv dhklnsv https://dvwxz.com/?dvwxz dvwxz https://dfghimu.com/?dfghimu dfghimu https://fostx.ru/?fostx fostx https://fjklnqux.ua/?fjklnqux fjklnqux https://cfhijnqu.info/?cfhijnqu cfhijnqu http://nptvy.ua/?nptvy nptvy http://bclnu.com/?bclnu bclnu http://eflntvyz.ru/?eflntvyz eflntvyz https://abcdpsx.ru/?abcdpsx abcdpsx https://cfipqx.info/?cfipqx cfipqx 
http://adforst.biz/?adforst adforst https://bfgmpqtv.biz/?bfgmpqtv bfgmpqtv http://cfkmnorv.ua/?cfkmnorv cfkmnorv http://dhjkuv.info/?dhjkuv dhjkuv https://bcejlsyz.info/?bcejlsyz bcejlsyz https://efinrsy.ua/?efinrsy efinrsy https://kmosw.net/?kmosw kmosw http://jmoxy.net/?jmoxy jmoxy http://agjloqy.biz/?agjloqy agjloqy https://dglmnpqy.com/?dglmnpqy dglmnpqy https://dijnqrsw.ua/?dijnqrsw dijnqrsw http://ijmnr.biz/?ijmnr ijmnr https://gjouwxy.ru/?gjouwxy gjouwxy https://klpuz.biz/?klpuz klpuz http://hijoux.net/?hijoux hijoux http://bikmn.net/?bikmn bikmn http://abcovwy.ua/?abcovwy abcovwy https://adfgioqx.ua/?adfgioqx adfgioqx http://beijknpq.info/?beijknpq beijknpq https://equvx.com/?equvx equvx https://cgmpr.net/?cgmpr cgmpr http://dfknor.com/?dfknor dfknor https://gopsvyz.ru/?gopsvyz gopsvyz http://chijkqsu.net/?chijkqsu chijkqsu https://bfmtx.biz/?bfmtx bfmtx http://cgijqrux.ua/?cgijqrux cgijqrux https://fklnovy.biz/?fklnovy fklnovy http://chkmosux.net/?chkmosux chkmosux https://cenprvw.info/?cenprvw cenprvw https://cfuxyz.ua/?cfuxyz cfuxyz https://bdgopy.ua/?bdgopy bdgopy https://ampwx.info/?ampwx ampwx http://cefmoqtx.info/?cefmoqtx cefmoqtx https://bcdflny.ua/?bcdflny bcdflny https://abqtux.ua/?abqtux abqtux https://bpstw.com/?bpstw bpstw http://cegmoqry.info/?cegmoqry cegmoqry http://aeghjlw.net/?aeghjlw aeghjlw https://egjswz.ru/?egjswz egjswz https://agiluv.info/?agiluv agiluv https://befimsu.biz/?befimsu befimsu https://abcdeiry.info/?abcdeiry abcdeiry http://bcqrxz.info/?bcqrxz bcqrxz https://eilntvxz.net/?eilntvxz eilntvxz https://abcjpt.ua/?abcjpt abcjpt https://adglox.ru/?adglox adglox https://adjnptw.ru/?adjnptw adjnptw http://aglrxz.biz/?aglrxz aglrxz https://abeqru.info/?abeqru abeqru https://cnoprsw.biz/?cnoprsw cnoprsw http://akoqx.com/?akoqx akoqx https://ijlmnptz.net/?ijlmnptz ijlmnptz https://ghimnuyz.com/?ghimnuyz ghimnuyz https://ghprs.ua/?ghprs ghprs https://lmntwxz.biz/?lmntwxz lmntwxz https://blopqxz.ua/?blopqxz blopqxz https://cqrxz.ua/?cqrxz cqrxz http://befmvwz.net/?befmvwz befmvwz https://abjlmnpr.com/?abjlmnpr abjlmnpr https://filotz.net/?filotz filotz http://hinsw.biz/?hinsw hinsw http://abfjklsy.net/?abfjklsy abfjklsy https://bhimnrwx.biz/?bhimnrwx bhimnrwx http://dejnotz.ru/?dejnotz dejnotz https://bcflsxz.info/?bcflsxz bcflsxz http://hklmsux.net/?hklmsux hklmsux https://ahknouz.ru/?ahknouz ahknouz https://cfknxy.ru/?cfknxy cfknxy http://acfntvx.net/?acfntvx acfntvx https://cdfksuvy.info/?cdfksuvy cdfksuvy https://abjkorx.ru/?abjkorx abjkorx https://afiklqvw.com/?afiklqvw afiklqvw http://adnovx.biz/?adnovx adnovx https://fglstyz.info/?fglstyz fglstyz https://ghikruy.biz/?ghikruy ghikruy http://ghnopsvz.biz/?ghnopsvz ghnopsvz http://abcgjnrw.com/?abcgjnrw abcgjnrw https://bijlnyz.ua/?bijlnyz bijlnyz http://ahiqrv.ru/?ahiqrv ahiqrv https://ajotw.biz/?ajotw ajotw http://bfhipuw.info/?bfhipuw bfhipuw http://acjmqry.ua/?acjmqry acjmqry http://efjnoqrz.info/?efjnoqrz efjnoqrz https://achkqst.info/?achkqst achkqst http://bcfpruyz.ua/?bcfpruyz bcfpruyz http://bcelv.com/?bcelv bcelv https://dfkoqsu.ua/?dfkoqsu dfkoqsu http://ekopq.ru/?ekopq ekopq https://abfkrsz.ua/?abfkrsz abfkrsz https://abenxy.net/?abenxy abenxy http://agiotu.biz/?agiotu agiotu https://bkpqr.info/?bkpqr bkpqr http://ahijq.ua/?ahijq ahijq http://eimps.ru/?eimps eimps https://abdstwz.net/?abdstwz abdstwz https://cdhjlu.biz/?cdhjlu cdhjlu http://gmrstyz.info/?gmrstyz gmrstyz https://djsvwyz.ua/?djsvwyz djsvwyz http://begjmorv.net/?begjmorv begjmorv http://ahlmoptz.biz/?ahlmoptz ahlmoptz 
http://dekoqx.ru/?dekoqx dekoqx http://fiprs.net/?fiprs fiprs http://cjklmr.biz/?cjklmr cjklmr https://aefgpquv.biz/?aefgpquv aefgpquv https://jlmopswx.net/?jlmopswx jlmopswx https://bfhjlsw.biz/?bfhjlsw bfhjlsw https://iknortux.com/?iknortux iknortux http://abhkoprv.biz/?abhkoprv abhkoprv https://cjnrwyz.ua/?cjnrwyz cjnrwyz http://djmnrvwz.info/?djmnrvwz djmnrvwz http://adgkrtyz.info/?adgkrtyz adgkrtyz https://bdhjsuv.ua/?bdhjsuv bdhjsuv http://acfgjor.biz/?acfgjor acfgjor http://cfilrtw.info/?cfilrtw cfilrtw https://fhijz.com/?fhijz fhijz http://hkmntxy.ru/?hkmntxy hkmntxy http://fhilsuy.ua/?fhilsuy fhilsuy http://jmnpwx.ru/?jmnpwx jmnpwx http://gikltvwy.ru/?gikltvwy gikltvwy http://jopqy.biz/?jopqy jopqy http://bdlqry.info/?bdlqry bdlqry https://ejlvwz.com/?ejlvwz ejlvwz http://ceijvz.net/?ceijvz ceijvz http://cfijntyz.info/?cfijntyz cfijntyz http://efilotuv.com/?efilotuv efilotuv http://cfmpyz.com/?cfmpyz cfmpyz http://dhpqz.com/?dhpqz dhpqz http://dlpxz.com/?dlpxz dlpxz http://aefjlyz.net/?aefjlyz aefjlyz https://ceijorz.info/?ceijorz ceijorz http://dkmnuvx.info/?dkmnuvx dkmnuvx https://kprswx.net/?kprswx kprswx https://dgjpwz.ru/?dgjpwz dgjpwz https://abfjlnt.ru/?abfjlnt abfjlnt http://efgips.com/?efgips efgips http://cehnpu.ru/?cehnpu cehnpu https://fstuw.net/?fstuw fstuw http://dmorz.net/?dmorz dmorz https://celmrwz.info/?celmrwz celmrwz https://jprvx.com/?jprvx jprvx https://cefjkxz.biz/?cefjkxz cefjkxz http://behjlv.biz/?behjlv behjlv http://bdiku.net/?bdiku bdiku https://ghnpxz.ru/?ghnpxz ghnpxz https://bcpqvwz.ua/?bcpqvwz bcpqvwz https://gijlnqrs.biz/?gijlnqrs gijlnqrs http://adfjlrsz.info/?adfjlrsz adfjlrsz https://afiptvz.biz/?afiptvz afiptvz http://acikors.com/?acikors acikors https://dfmpq.ru/?dfmpq dfmpq http://abhjmsyz.ru/?abhjmsyz abhjmsyz https://dejkq.com/?dejkq dejkq http://abcltx.com/?abcltx abcltx https://bdfjopu.net/?bdfjopu bdfjopu http://bipvy.com/?bipvy bipvy http://dlqsx.biz/?dlqsx dlqsx http://inqrstv.info/?inqrstv inqrstv https://dknwz.ru/?dknwz dknwz http://cdgimqsu.ru/?cdgimqsu cdgimqsu https://akmotuv.info/?akmotuv akmotuv http://kmntvyz.ru/?kmntvyz kmntvyz https://bimxz.net/?bimxz bimxz http://afjoqyz.ru/?afjoqyz afjoqyz https://eikuwy.ua/?eikuwy eikuwy https://bmptw.biz/?bmptw bmptw http://fgltwx.com/?fgltwx fgltwx http://chlnpqrs.com/?chlnpqrs chlnpqrs https://abhux.net/?abhux abhux http://adfhmw.info/?adfhmw adfhmw http://bgilt.com/?bgilt bgilt https://cefotu.biz/?cefotu cefotu https://adfruvwy.biz/?adfruvwy adfruvwy https://fgoqstu.info/?fgoqstu fgoqstu https://cdilnqvy.info/?cdilnqvy cdilnqvy https://adghruw.ua/?adghruw adghruw https://ehlnrsuy.biz/?ehlnrsuy ehlnrsuy http://abfijlnr.info/?abfijlnr abfijlnr https://jkpty.biz/?jkpty jkpty https://cdlpv.biz/?cdlpv cdlpv https://ginoxy.ua/?ginoxy ginoxy https://ciknpquy.net/?ciknpquy ciknpquy https://egijstw.biz/?egijstw egijstw http://cdikopvz.info/?cdikopvz cdikopvz https://egjklnvy.info/?egjklnvy egjklnvy https://fjkmvw.info/?fjkmvw fjkmvw http://dfkmv.com/?dfkmv dfkmv https://efgloptu.ru/?efgloptu efgloptu https://fgotz.ua/?fgotz fgotz http://efhklv.info/?efhklv efhklv https://befmrwy.info/?befmrwy befmrwy https://bklnsvy.biz/?bklnsvy bklnsvy http://flmoz.ru/?flmoz flmoz http://aiopqrtu.biz/?aiopqrtu aiopqrtu https://fjrsty.info/?fjrsty fjrsty http://behimuwy.ru/?behimuwy behimuwy https://iknqvz.com/?iknqvz iknqvz http://cdemp.ru/?cdemp cdemp http://eglpqsvz.net/?eglpqsvz eglpqsvz https://dluvz.net/?dluvz dluvz http://beghsuy.net/?beghsuy beghsuy http://ekmqsvz.net/?ekmqsvz ekmqsvz 
http://ehinz.net/?ehinz ehinz http://gipvz.com/?gipvz gipvz http://efhlwxyz.biz/?efhlwxyz efhlwxyz http://blors.ru/?blors blors http://dnqry.info/?dnqry dnqry https://cghnrux.biz/?cghnrux cghnrux http://dfinqtwx.ru/?dfinqtwx dfinqtwx http://efotxz.info/?efotxz efotxz http://aceglpqu.net/?aceglpqu aceglpqu https://abilr.ua/?abilr abilr http://bcgqw.ru/?bcgqw bcgqw http://cdinqv.biz/?cdinqv cdinqv https://cdfkp.com/?cdfkp cdfkp https://aelouvz.ua/?aelouvz aelouvz http://dklstuz.net/?dklstuz dklstuz https://cejqw.ru/?cejqw cejqw https://bdgkuy.com/?bdgkuy bdgkuy https://hkoprvxy.ru/?hkoprvxy hkoprvxy https://filntx.net/?filntx filntx http://cfgsty.ua/?cfgsty cfgsty https://bfgptx.net/?bfgptx bfgptx http://belpsu.biz/?belpsu belpsu http://deilmoy.biz/?deilmoy deilmoy https://kqruv.info/?kqruv kqruv https://acdst.net/?acdst acdst http://glmvy.ru/?glmvy glmvy https://abcmnu.net/?abcmnu abcmnu http://adehinxy.ru/?adehinxy adehinxy http://ahmnwy.biz/?ahmnwy ahmnwy http://morst.ru/?morst morst http://bfils.ua/?bfils bfils http://abcmnvw.biz/?abcmnvw abcmnvw https://bchlqsux.net/?bchlqsux bchlqsux http://fhikpqvy.net/?fhikpqvy fhikpqvy https://bejkqu.info/?bejkqu bejkqu https://cdehnsxy.ru/?cdehnsxy cdehnsxy http://bcdhnopx.info/?bcdhnopx bcdhnopx https://elpuz.ua/?elpuz elpuz https://bdeftvw.info/?bdeftvw bdeftvw http://dgmnou.ru/?dgmnou dgmnou https://aiklnvx.com/?aiklnvx aiklnvx http://ijoqrsvw.com/?ijoqrsvw ijoqrsvw http://cdehmquy.biz/?cdehmquy cdehmquy https://gjoqx.net/?gjoqx gjoqx https://bhlnpqvw.net/?bhlnpqvw bhlnpqvw http://bdfiuwyz.net/?bdfiuwyz bdfiuwyz https://abesty.info/?abesty abesty https://gjlmtuv.net/?gjlmtuv gjlmtuv http://ivxyz.info/?ivxyz ivxyz http://aeops.info/?aeops aeops http://acdlv.ru/?acdlv acdlv http://fgimqw.net/?fgimqw fgimqw https://fgkrxz.info/?fgkrxz fgkrxz http://abfhnvxy.ua/?abfhnvxy abfhnvxy https://bcdeipsy.ru/?bcdeipsy bcdeipsy http://bdimvwy.ru/?bdimvwy bdimvwy http://fnpqry.info/?fnpqry fnpqry https://aklstwx.ru/?aklstwx aklstwx http://bjloq.com/?bjloq bjloq http://dkmsuv.info/?dkmsuv dkmsuv http://afinptuw.ru/?afinptuw afinptuw https://bijnqvx.biz/?bijnqvx bijnqvx https://behlxz.info/?behlxz behlxz http://bgjkquyz.biz/?bgjkquyz bgjkquyz http://aehnpqw.net/?aehnpqw aehnpqw http://cghjmpqu.com/?cghjmpqu cghjmpqu http://aboruy.com/?aboruy aboruy http://fjryz.ru/?fjryz fjryz https://cdnqy.info/?cdnqy cdnqy http://bfgjmost.net/?bfgjmost bfgjmost https://bhnou.ua/?bhnou bhnou https://dhilm.com/?dhilm dhilm http://abijmrtx.net/?abijmrtx abijmrtx http://acefnv.com/?acefnv acefnv http://gilmtu.com/?gilmtu gilmtu https://egjltuz.biz/?egjltuz egjltuz http://afghpsv.net/?afghpsv afghpsv http://chkltvx.ua/?chkltvx chkltvx http://befkyz.com/?befkyz befkyz https://bcfhmprw.com/?bcfhmprw bcfhmprw https://bjklpqs.info/?bjklpqs bjklpqs http://cegpt.ua/?cegpt cegpt https://klntvw.biz/?klntvw klntvw https://ejkmq.ua/?ejkmq ejkmq https://bgqvx.info/?bgqvx bgqvx https://dqrsuvw.com/?dqrsuvw dqrsuvw http://bcekrsu.net/?bcekrsu bcekrsu https://dlmqwy.com/?dlmqwy dlmqwy http://abhnsvz.ru/?abhnsvz abhnsvz https://bdfhjoqr.info/?bdfhjoqr bdfhjoqr http://ceijlprw.net/?ceijlprw ceijlprw https://ijlqsw.net/?ijlqsw ijlqsw https://einrvxz.net/?einrvxz einrvxz https://ikmtxz.net/?ikmtxz ikmtxz http://adkmqsz.biz/?adkmqsz adkmqsz http://bchnox.com/?bchnox bchnox https://ckptvw.com/?ckptvw ckptvw http://dgops.com/?dgops dgops https://bfnrtxy.info/?bfnrtxy bfnrtxy http://bdnpstu.info/?bdnpstu bdnpstu http://dlnprsx.ua/?dlnprsx dlnprsx https://gjnrx.com/?gjnrx gjnrx http://abcsvz.com/?abcsvz 
From tempohoper at gmail.com  Wed Nov 28 09:26:33 2018
From: tempohoper at gmail.com (Sal Am)
Date: Wed, 28 Nov 2018 15:26:33 +0000
Subject: [petsc-users] Solving complex linear sparse matrix in parallel + external library
In-Reply-To: References: Message-ID:

Thank you indeed, --download-mpich and using PETSC_ARCH/bin/mpiexec seems to work.

Now I am wondering about the other problem, namely getting the residual: is the residual only computed when using iterative solvers? Running richardson or gmres with 1 iteration while using MUMPS prints nothing.

On Tue, Nov 27, 2018 at 10:53 AM Matthew Knepley wrote:

> On Tue, Nov 27, 2018 at 4:29 AM Sal Am wrote:
>
>> This can happen if you use an 'mpiexec' which is from a different MPI
>>> than the one you compiled PETSc with.
>>>
>>
>> That is odd; I tried removing --download-mpich from the config, tried
>> --with-mpi=1 (which should be the default anyway), and retried it with
>> --with-mpich=1.
>>
>
> Your MPI is still broken. Some default MPIs, like those installed with
> Apple, are broken. If you are on Apple,
> I would recommend using --download-mpich, but make sure you use
> $PETSC_DIR/$PETSC_ARCH/bin/mpiexec.
>
> If you reconfigure, you either have to specify a different --PETSC_ARCH
> or delete the $PETSC_DIR/$PETSC_ARCH directory completely. Otherwise, you
> can get bad interactions with the libraries already sitting there.
>
> Thanks,
>
>    Matt
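For reference, a quick way to confirm that the mpiexec being used matches the MPI that PETSc was compiled with is to print the communicator size from the program itself. A minimal sketch in the same style as the code later in this thread (the message text is illustrative):

    PetscMPIInt size;
    ierr = MPI_Comm_size(PETSC_COMM_WORLD,&size);CHKERRQ(ierr);
    /* With a matching launcher and mpiexec -n 2 this prints once and reports 2;
       with a mismatched launcher every process is its own "world", so the
       line prints once per process and reports 1. */
    ierr = PetscPrintf(PETSC_COMM_WORLD,"Running on %d MPI ranks\n",size);CHKERRQ(ierr);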
>
>> Current reconfig file:
>> '--download-mpich',
>> '--download-mumps',
>> '--download-scalapack',
>> '--download-superlu_dist',
>> '--with-cc=gcc',
>> '--with-clanguage=cxx',
>> '--with-cxx=g++',
>> '--with-debugging=no',
>> '--with-fc=gfortran',
>> '--with-mpi=1',
>> '--with-mpich=1',
>> '--with-scalar-type=complex',
>> 'PETSC_ARCH=linux-opt'
>> Still does not work, though. I tried executing ex11 in
>> ksp/ksp/examples/tutorials/ex11, which should solve a linear system in
>> parallel, by running mpiexec -n 2, but it prints out
>> Mat Object: 1 MPI processes
>> ...
>> ..
>> twice.
>> What am I missing?
>>
>> Why would you want a minimum number of iterations? Why not set a tolerance and a
>>> max? What would you achieve with a minimum?
>>>
>>
>> I thought PETSc might not be iterating far enough, but after having had a
>> look at KSPSetTolerances it makes more sense.
>>
>> Instead of 'preonly', where you do not want a residual, use 'richardson'
>>> or 'gmres' with a max of 1 iterate (-ksp_max_it 1)
>>>
>>
>> So I tried that by executing: mpiexec -n 4 ./test -ksp_type richardson
>> -pc_type lu -pc_factor_mat_solver_type mumps -ksp_max_it 1
>> on the command line. However, nothing prints out on the terminal (aside
>> from PetscPrintf if I have them enabled).
>>
>> Thank you.
>>
>> On Fri, Nov 16, 2018 at 12:00 PM Matthew Knepley wrote:
>>
>>> On Fri, Nov 16, 2018 at 4:23 AM Sal Am via petsc-users <
>>> petsc-users at mcs.anl.gov> wrote:
>>>
>>>> Hi,
>>>>
>>>> I have a few questions:
>>>>
>>>> 1. The following issue/misunderstanding:
>>>> My code reads in two files, one PETSc vector and one PETSc matrix (b and
>>>> A from Ax=b, size ~65000x65000),
>>>> and then calls the KSP solver to solve it by running the following in the
>>>> terminal:
>>>>
>>>> mpiexec -n 2 ./SolveSys -ksp_type preonly -pc_type lu
>>>> -pc_factor_mat_solver mumps
>>>>
>>>> Now MUMPS is supposed to work in parallel and complex, but the code is
>>>> not solved in parallel, it seems. It just prints the result twice. Adding
>>>> -log_view gives me
>>>>
>>>> "./SolveSys on a linux-opt named F8434 with 1 processor..." printed
>>>> twice.
>>>>
>>>
>>> This can happen if you use an 'mpiexec' which is from a different MPI
>>> than the one you compiled PETSc with.
>>>
>>>
>>>> 2. Using iterative solvers, I am having difficulty getting convergence.
>>>> I found that there is a way to set the maximum number of iterations, but is
>>>> there a minimum I can increase?
>>>>
>>>
>>> Why would you want a minimum number of iterations? Why not set a tolerance and a
>>> max? What would you achieve with a minimum?
>>>
>>>
>>>> 3. The residual is not computed when using direct external solvers.
>>>> What is the proper PETSc way of doing this?
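For reference, the true residual ||b - Ax|| can also be computed explicitly after KSPSolve, independent of the solver or preconditioner type; a minimal sketch in the style of the code quoted below, where the vector r is introduced only for this check:

    Vec       r;
    PetscReal rnorm;
    ierr = VecDuplicate(b,&r);CHKERRQ(ierr);
    ierr = MatMult(A,x,r);CHKERRQ(ierr);     /* r = A*x     */
    ierr = VecAYPX(r,-1.0,b);CHKERRQ(ierr);  /* r = b - A*x */
    ierr = VecNorm(r,NORM_2,&rnorm);CHKERRQ(ierr);
    ierr = PetscPrintf(PETSC_COMM_WORLD,"True residual norm %g\n",(double)rnorm);CHKERRQ(ierr);
    ierr = VecDestroy(&r);CHKERRQ(ierr);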
>>>
>>> Instead of 'preonly', where you do not want a residual, use 'richardson'
>>> or 'gmres' with a max of 1 iterate (-ksp_max_it 1)
>>>
>>> Thanks,
>>>
>>>    Matt
>>>
>>>
>>>> -----------------------------------------------------The
>>>> code---------------------------------------
>>>> #include <petscksp.h>
>>>> #include <petscviewer.h>
>>>> static char help[] = "Reads a matrix and RHS vector in PETSc binary format and solves Ax=b.\n";
>>>> int main(int argc,char **args)
>>>> {
>>>>   Vec            x,b;      /* approx solution, RHS */
>>>>   Mat            A;        /* linear system matrix */
>>>>   KSP            ksp;      /* linear solver context */
>>>>   PetscReal      norm;     /* norm of solution error */
>>>>   PC             pc;
>>>>   PetscMPIInt    rank, size;
>>>>   PetscViewer    viewer;
>>>>   PetscInt       its, i;
>>>>   PetscErrorCode ierr;
>>>>   PetscScalar    *xa;
>>>>   PetscBool      flg = PETSC_FALSE;
>>>>
>>>>   ierr = PetscInitialize(&argc,&args,(char*)0,help);if (ierr) return ierr;
>>>>   MPI_Comm_rank(PETSC_COMM_WORLD,&rank);
>>>>   MPI_Comm_size(PETSC_COMM_WORLD,&size);
>>>>
>>>> #if !defined(PETSC_USE_COMPLEX)
>>>>   SETERRQ(PETSC_COMM_WORLD,1,"This example requires complex numbers");
>>>> #endif
>>>>   /*
>>>>    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
>>>>      Compute the matrix and right-hand-side vector that define
>>>>      the linear system, Ax = b.
>>>>    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
>>>>   */
>>>>   ierr = PetscPrintf(PETSC_COMM_WORLD,"reading vector in binary from Vector_b.dat ...\n");CHKERRQ(ierr);
>>>>   ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"../../python/petscpy/Vector_b.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr);
>>>>   ierr = VecCreate(PETSC_COMM_WORLD, &b);CHKERRQ(ierr);
>>>>   ierr = VecLoad(b,viewer); CHKERRQ(ierr);
>>>>
>>>>   ierr = PetscPrintf(PETSC_COMM_WORLD,"reading matrix in binary from Matrix_A.dat ...\n");CHKERRQ(ierr);
>>>>   ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"../../python/petscpy/Matrix_A.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr);
>>>>   ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
>>>>   ierr = MatLoad(A,viewer);CHKERRQ(ierr);
>>>>   ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
>>>>
>>>>   ierr = VecDuplicate(b,&x);CHKERRQ(ierr);
>>>>
>>>>   /* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
>>>>        Create the linear solver and set various options
>>>>      - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
>>>>   */
>>>>   PetscPrintf(PETSC_COMM_WORLD, "Creating KSP\n");
>>>>   ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
>>>>
>>>>   PetscPrintf(PETSC_COMM_WORLD, "KSP Operators\n");
>>>>   ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);
>>>>
>>>>   ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
>>>>   ierr = PCSetType(pc, PCLU);CHKERRQ(ierr);
>>>>   ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
>>>>
>>>>   /* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
>>>>        Solve the linear system
>>>>      - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - */
>>>>   ierr = KSPSetUp(ksp);CHKERRQ(ierr);
>>>>   ierr = KSPSetUpOnBlocks(ksp);CHKERRQ(ierr);
>>>>   ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
>>>>   PetscPrintf(PETSC_COMM_WORLD, "Solved");
>>>>
>>>>   /*
>>>>      Free work space. All PETSc objects should be destroyed when they
>>>>      are no longer needed.
>>>>   */
>>>>   ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
>>>>   ierr = VecDestroy(&x);CHKERRQ(ierr);
>>>>   ierr = VecDestroy(&b);CHKERRQ(ierr);
>>>>   ierr = MatDestroy(&A);CHKERRQ(ierr);
>>>>   ierr = PetscFinalize();
>>>>   return ierr;
>>>> }
>>>>
>>>> Kind regards,
>>>> Sal
>>>
>>> --
>>> What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>> -- Norbert Wiener
>>>
>>> https://www.cse.buffalo.edu/~knepley/
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From mfadams at lbl.gov  Wed Nov 28 09:34:32 2018
From: mfadams at lbl.gov (Mark Adams)
Date: Wed, 28 Nov 2018 10:34:32 -0500
Subject: [petsc-users] Solving complex linear sparse matrix in parallel + external library
In-Reply-To: References: Message-ID:

On Wed, Nov 28, 2018 at 10:27 AM Sal Am via petsc-users <petsc-users at mcs.anl.gov> wrote:

> Thank you indeed, --download-mpich and using PETSC_ARCH/bin/mpiexec seems
> to work.
>
> Now I am wondering about the other problem, namely getting the residual: is
> the residual only computed when using iterative solvers? Running richardson
> or gmres with 1 iteration while using MUMPS prints nothing.
>

This should work. Please send us all the output and your arguments, etc.
You can also add -ksp_view to print info about the solver configuration.

> On Tue, Nov 27, 2018 at 10:53 AM Matthew Knepley wrote:
>
>> On Tue, Nov 27, 2018 at 4:29 AM Sal Am wrote:
>>
>>> This can happen if you use an 'mpiexec' which is from a different MPI
>>>> than the one you compiled PETSc with.
>>>>
>>>
>>> That is odd; I tried removing --download-mpich from the config, tried
>>> --with-mpi=1 (which should be the default anyway), and retried it with
>>> --with-mpich=1.
>>>
>>
>> Your MPI is still broken. Some default MPIs, like those installed with
>> Apple, are broken. If you are on Apple,
>> I would recommend using --download-mpich, but make sure you use
>> $PETSC_DIR/$PETSC_ARCH/bin/mpiexec.
>>
>> If you reconfigure, you either have to specify a different --PETSC_ARCH
>> or delete the $PETSC_DIR/$PETSC_ARCH directory completely. Otherwise, you
>> can get bad interactions with the libraries already sitting there.
>>
>> Thanks,
>>
>>    Matt
>>
>>
>>> Current reconfig file:
>>> '--download-mpich',
>>> '--download-mumps',
>>> '--download-scalapack',
>>> '--download-superlu_dist',
>>> '--with-cc=gcc',
>>> '--with-clanguage=cxx',
>>> '--with-cxx=g++',
>>> '--with-debugging=no',
>>> '--with-fc=gfortran',
>>> '--with-mpi=1',
>>> '--with-mpich=1',
>>> '--with-scalar-type=complex',
>>> 'PETSC_ARCH=linux-opt'
>>> Still does not work, though. I tried executing ex11 in
>>> ksp/ksp/examples/tutorials/ex11, which should solve a linear system in
>>> parallel, by running mpiexec -n 2, but it prints out
>>> Mat Object: 1 MPI processes
>>> ...
>>> ..
>>> twice.
>>> What am I missing?
>>>
>>> Why would you want a minimum number of iterations? Why not set a tolerance and a
>>>> max? What would you achieve with a minimum?
>>>>
>>>
>>> I thought PETSc might not be iterating far enough, but after having had
>>> a look at KSPSetTolerances it makes more sense.
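For reference, KSPSetTolerances takes the relative tolerance, absolute tolerance, divergence tolerance, and maximum iteration count, in that order; a typical call (the specific values here are only illustrative) looks like:

    ierr = KSPSetTolerances(ksp,1.e-8,PETSC_DEFAULT,PETSC_DEFAULT,500);CHKERRQ(ierr);

Any position can be left at PETSC_DEFAULT, and the same settings are available on the command line as -ksp_rtol, -ksp_atol, -ksp_divtol and -ksp_max_it.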
>>>
>>> Instead of 'preonly', where you do not want a residual, use 'richardson'
>>>> or 'gmres' with a max of 1 iterate (-ksp_max_it 1)
>>>>
>>>
>>> So I tried that by executing: mpiexec -n 4 ./test -ksp_type richardson
>>> -pc_type lu -pc_factor_mat_solver_type mumps -ksp_max_it 1
>>> on the command line. However, nothing prints out on the terminal (aside
>>> from PetscPrintf if I have them enabled).
>>>
>>> Thank you.
>>>
>>> On Fri, Nov 16, 2018 at 12:00 PM Matthew Knepley wrote:
>>>
>>>> On Fri, Nov 16, 2018 at 4:23 AM Sal Am via petsc-users <
>>>> petsc-users at mcs.anl.gov> wrote:
>>>>
>>>>> Hi,
>>>>>
>>>>> I have a few questions:
>>>>>
>>>>> 1. The following issue/misunderstanding:
>>>>> My code reads in two files, one PETSc vector and one PETSc matrix (b
>>>>> and A from Ax=b, size ~65000x65000),
>>>>> and then calls the KSP solver to solve it by running the following in the
>>>>> terminal:
>>>>>
>>>>> mpiexec -n 2 ./SolveSys -ksp_type preonly -pc_type lu
>>>>> -pc_factor_mat_solver mumps
>>>>>
>>>>> Now MUMPS is supposed to work in parallel and complex, but the code is
>>>>> not solved in parallel, it seems. It just prints the result twice. Adding
>>>>> -log_view gives me
>>>>>
>>>>> "./SolveSys on a linux-opt named F8434 with 1 processor..." printed
>>>>> twice.
>>>>>
>>>>
>>>> This can happen if you use an 'mpiexec' which is from a different MPI
>>>> than the one you compiled PETSc with.
>>>>
>>>>
>>>>> 2. Using iterative solvers, I am having difficulty getting
>>>>> convergence. I found that there is a way to set the maximum number of
>>>>> iterations, but is there a minimum I can increase?
>>>>>
>>>>
>>>> Why would you want a minimum number of iterations? Why not set a tolerance and a
>>>> max? What would you achieve with a minimum?
>>>>
>>>>
>>>>> 3. The residual is not computed when using direct external solvers.
>>>>> What is the proper PETSc way of doing this?
>>>>>
>>>>
>>>> Instead of 'preonly', where you do not want a residual, use
>>>> 'richardson' or 'gmres' with a max of 1 iterate (-ksp_max_it 1)
>>>>
>>>> Thanks,
>>>>
>>>>    Matt
>>>>
>>>>
>>>>> -----------------------------------------------------The
>>>>> code---------------------------------------
>>>>> #include <petscksp.h>
>>>>> #include <petscviewer.h>
>>>>> static char help[] = "Reads a matrix and RHS vector in PETSc binary format and solves Ax=b.\n";
>>>>> int main(int argc,char **args)
>>>>> {
>>>>>   Vec            x,b;      /* approx solution, RHS */
>>>>>   Mat            A;        /* linear system matrix */
>>>>>   KSP            ksp;      /* linear solver context */
>>>>>   PetscReal      norm;     /* norm of solution error */
>>>>>   PC             pc;
>>>>>   PetscMPIInt    rank, size;
>>>>>   PetscViewer    viewer;
>>>>>   PetscInt       its, i;
>>>>>   PetscErrorCode ierr;
>>>>>   PetscScalar    *xa;
>>>>>   PetscBool      flg = PETSC_FALSE;
>>>>>
>>>>>   ierr = PetscInitialize(&argc,&args,(char*)0,help);if (ierr) return ierr;
>>>>>   MPI_Comm_rank(PETSC_COMM_WORLD,&rank);
>>>>>   MPI_Comm_size(PETSC_COMM_WORLD,&size);
>>>>>
>>>>> #if !defined(PETSC_USE_COMPLEX)
>>>>>   SETERRQ(PETSC_COMM_WORLD,1,"This example requires complex numbers");
>>>>> #endif
>>>>>   /*
>>>>>    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
>>>>>      Compute the matrix and right-hand-side vector that define
>>>>>      the linear system, Ax = b.
>>>>>    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
>>>>>   */
>>>>>   ierr = PetscPrintf(PETSC_COMM_WORLD,"reading vector in binary from Vector_b.dat ...\n");CHKERRQ(ierr);
>>>>>   ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"../../python/petscpy/Vector_b.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr);
>>>>>   ierr = VecCreate(PETSC_COMM_WORLD, &b);CHKERRQ(ierr);
>>>>>   ierr = VecLoad(b,viewer); CHKERRQ(ierr);
>>>>>
>>>>>   ierr = PetscPrintf(PETSC_COMM_WORLD,"reading matrix in binary from Matrix_A.dat ...\n");CHKERRQ(ierr);
>>>>>   ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"../../python/petscpy/Matrix_A.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr);
>>>>>   ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
>>>>>   ierr = MatLoad(A,viewer);CHKERRQ(ierr);
>>>>>   ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
>>>>>
>>>>>   ierr = VecDuplicate(b,&x);CHKERRQ(ierr);
>>>>>
>>>>>   /* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
>>>>>        Create the linear solver and set various options
>>>>>      - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
>>>>>   */
>>>>>   PetscPrintf(PETSC_COMM_WORLD, "Creating KSP\n");
>>>>>   ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
>>>>>
>>>>>   PetscPrintf(PETSC_COMM_WORLD, "KSP Operators\n");
>>>>>   ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);
>>>>>
>>>>>   ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
>>>>>   ierr = PCSetType(pc, PCLU);CHKERRQ(ierr);
>>>>>   ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
>>>>>
>>>>>   /* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
>>>>>        Solve the linear system
>>>>>      - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - */
>>>>>   ierr = KSPSetUp(ksp);CHKERRQ(ierr);
>>>>>   ierr = KSPSetUpOnBlocks(ksp);CHKERRQ(ierr);
>>>>>   ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
>>>>>   PetscPrintf(PETSC_COMM_WORLD, "Solved");
>>>>>
>>>>>   /*
>>>>>      Free work space. All PETSc objects should be destroyed when they
>>>>>      are no longer needed.
> We had an optimization which did not print the residual with richardson, since it is used often inside of other solvers where you do not want this. Are you sure that the residual does not print with GMRES? Thanks, Matt > On Tue, Nov 27, 2018 at 10:53 AM Matthew Knepley > wrote: > >> On Tue, Nov 27, 2018 at 4:29 AM Sal Am wrote: >> >>> This can happen if you use an 'mpiexec' which is from a different MPI >>>> than the one you compiled PETSc with. >>>> >>> >>> that is odd, I tried removing the --download-mpich from the config and >>> tried --with-mpi=1 (which should be default anyways) and retried it with >>> --with-mpich=1. >>> >> >> Your MPI is still broken. Some default MPIs, like those installed with >> Apple, are broken. If you are on Apple >> I would recommend using --download-mpich, but make sure you use >> $PETSC_DIR/$PETSC_ARCH/bin/mpiexec. >> >> If you reconfigure, you either have to specify a different --PETSC_ARCH, >> or delete the $PETSC_DIR/$PETSC_ARCH completely. Otherwise, you can get bad >> interaction with the libraries already sitting there. >> >> Thanks, >> >> Matt >> >> >>> Current reconfig file: >>> '--download-mpich', >>> '--download-mumps', >>> '--download-scalapack', >>> '--download-superlu_dist', >>> '--with-cc=gcc', >>> '--with-clanguage=cxx', >>> '--with-cxx=g++', >>> '--with-debugging=no', >>> '--with-fc=gfortran', >>> '--with-mpi=1', >>> '--with-mpich=1', >>> '--with-scalar-type=complex', >>> 'PETSC_ARCH=linux-opt' >>> Still does not work though, I tried executing ex11 in >>> ksp/ksp/example/tutorial/ex11 which should solve a linear system in >>> parallel by running mpiexec -n 2 but it prints out >>> Mat Object: 1 MPI processes >>> ... >>> .. >>> twice. >>> What am I missing? >>> >>> Why would you want a minimum iterations? Why not set a tolerance and a >>>> max? What would you achieve with a minimum? >>>> >>> >>> I thought PETSc might not be iterating far enough, but after having had >>> a look at KSPSetTolerances it makes more sense. >>> >>> Instead of 'preonly', where you do not want a residual, use 'richardson' >>>> or 'gmres' with a max of 1 iterate (-ksp_max_it 1) >>>> >>> >>> So I tried that by executing: mpiexec -n 4 ./test -ksp_type richardson >>> -pc_type lu -pc_factor_mat_solver_type mumps -ksp_max_it 1 >>> on the command line. However, nothing prints out on the terminal (aside >>> from PetscPrintf if I have them enabled). >>> >>> Thank you. >>> >>> On Fri, Nov 16, 2018 at 12:00 PM Matthew Knepley >>> wrote: >>> >>>> On Fri, Nov 16, 2018 at 4:23 AM Sal Am via petsc-users < >>>> petsc-users at mcs.anl.gov> wrote: >>>> >>>>> Hi, >>>>> >>>>> I have a few questions: >>>>> >>>>> 1. The following issue/misunderstanding: >>>>> My code reads in two files one PETSc vector and one PETSc matrix (b >>>>> and A from Ax=b, size ~65000x65000). >>>>> and then calls KSP solver to solve it by running the following in the >>>>> terminal: >>>>> >>>>> mpiexec - n 2 ./SolveSys -ksp_type preonly -pc_type lu >>>>> -pc_factor_mat_solver mumps >>>>> >>>>> Now mumps is supposed to work in parallel and complex, but the code is >>>>> not solved in parallel it seems. It just prints the result twice. Adding >>>>> -log_view gives me >>>>> >>>>> "./SolveSys on a linux-opt named F8434 with 1 processor..." printed >>>>> twice. >>>>> >>>> >>>> This can happen if you use an 'mpiexec' which is from a different MPI >>>> than the one you compiled PETSc with. >>>> >>>> >>>>> 2. Using iterative solvers, I am having difficulty getting >>>>> convergence. 
I found that there is a way to set the maximum number of >>>>> iterations, but is there a minimum I can increase? >>>>> >>>> >>>> Why would you want a minimum iterations? Why not set a tolerance and a >>>> max? What would you achieve with a minimum? >>>> >>>> >>>>> 3. The residual is not computed when using direct external solvers. >>>>> What is proper PETSc way of doing this? >>>>> >>>> >>>> Instead of 'preonly', where you do not want a residual, use >>>> 'richardson' or 'gmres' with a max of 1 iterate (-ksp_max_it 1) >>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>>>> -----------------------------------------------------The >>>>> code--------------------------------------- >>>>> #include >>>>> #include >>>>> int main(int argc,char **args) >>>>> { >>>>> Vec x,b; /* approx solution, RHS */ >>>>> Mat A; /* linear system matrix */ >>>>> KSP ksp; /* linear solver context */ >>>>> PetscReal norm; /* norm of solution error */ >>>>> PC pc; >>>>> PetscMPIInt rank, size; >>>>> PetscViewer viewer; >>>>> PetscInt its, i; >>>>> PetscErrorCode ierr; >>>>> PetscScalar *xa; >>>>> PetscBool flg = PETSC_FALSE; >>>>> >>>>> ierr = PetscInitialize(&argc,&args,(char*)0,help);if (ierr) return >>>>> ierr; >>>>> MPI_Comm_rank(PETSC_COMM_WORLD,&rank); >>>>> MPI_Comm_size(PETSC_COMM_WORLD,&size); >>>>> >>>>> #if !defined(PETSC_USE_COMPLEX) >>>>> SETERRQ(PETSC_COMM_WORLD,1,"This example requires complex >>>>> numbers"); >>>>> #endif >>>>> /* >>>>> - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >>>>> Compute the matrix and right-hand-side vector that define >>>>> the linear system, Ax = b. >>>>> - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >>>>> - >>>>> */ >>>>> ierr = PetscPrintf(PETSC_COMM_WORLD,"reading vector in binary from >>>>> Vector_b.dat ...\n");CHKERRQ(ierr); >>>>> ierr = >>>>> PetscViewerBinaryOpen(PETSC_COMM_WORLD,"../../python/petscpy/Vector_b.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr); >>>>> ierr = VecCreate(PETSC_COMM_WORLD, &b);CHKERRQ(ierr); >>>>> ierr = VecLoad(b,viewer); CHKERRQ(ierr); >>>>> >>>>> ierr = PetscPrintf(PETSC_COMM_WORLD,"reading matrix in binary from >>>>> Matrix_A.dat ...\n");CHKERRQ(ierr); >>>>> ierr = >>>>> PetscViewerBinaryOpen(PETSC_COMM_WORLD,"../../python/petscpy/Matrix_A.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr); >>>>> ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr); >>>>> ierr = MatLoad(A,viewer);CHKERRQ(ierr); >>>>> ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr); >>>>> >>>>> ierr = VecDuplicate(b,&x);CHKERRQ(ierr); >>>>> >>>>> /* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >>>>> - >>>>> Create the linear solver and set various options >>>>> - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >>>>> - >>>>> */ >>>>> PetscPrintf(PETSC_COMM_WORLD, "Creating KSP\n"); >>>>> ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr); >>>>> >>>>> >>>>> PetscPrintf(PETSC_COMM_WORLD, "KSP Operators\n"); >>>>> ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr); >>>>> >>>>> ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr); >>>>> ierr = PCSetType(pc, PCLU);CHKERRQ(ierr); >>>>> ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr); >>>>> >>>>> /* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >>>>> - >>>>> Solve the linear system >>>>> - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >>>>> - */ >>>>> ierr = KSPSetUp(ksp);CHKERRQ(ierr); >>>>> ierr = KSPSetUpOnBlocks(ksp);CHKERRQ(ierr); >>>>> ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr); >>>>> PetscPrintf(PETSC_COMM_WORLD, "Solved"); >>>>> >>>>> /* >>>>> Free 
work space. All PETSc objects should be destroyed when they
>>>>> are no longer needed.
>>>>> */
>>>>> ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
>>>>> ierr = VecDestroy(&x);CHKERRQ(ierr);
>>>>> ierr = VecDestroy(&b);CHKERRQ(ierr);
>>>>> ierr = MatDestroy(&A);CHKERRQ(ierr);
>>>>> ierr = PetscFinalize();
>>>>> return ierr;
>>>>> }
>>>>>
>>>>> Kind regards,
>>>>> Sal
>>>>
>>>> --
>>>> What most experimenters take for granted before they begin their
>>>> experiments is infinitely more interesting than any results to which their
>>>> experiments lead.
>>>> -- Norbert Wiener
>>>>
>>>> https://www.cse.buffalo.edu/~knepley/
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/

--
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From mailinglists at xgm.de Wed Nov 28 11:06:07 2018
From: mailinglists at xgm.de (Florian Lindner)
Date: Wed, 28 Nov 2018 18:06:07 +0100
Subject: [petsc-users] IS Invert Not Permutation
In-Reply-To: References: <63cd6b99-5b2a-1b61-a006-d9b96bde9830@xgm.de>
Message-ID: <67b84dd1-1e2a-a06b-c5d7-d4d5697ef1bc@xgm.de>

Hey,

thanks for your quick reply!

As far as I understand how MatSetLocalToGlobalMapping works, there is no way
to create such a mapping that can be used for MatSetValuesLocal?

What I need is basically

MatSetValuesLocal row/col argument = 3 / 4 / 6 maps to local row/col 0 / 1 / 2

It seems to me that if I set an index set with MatSetLocalToGlobalMapping
like {3, 4, 6} it does the opposite, i.e. maps local {0, 1, 2} to global
{3, 4, 6}.

Therefore I probably need to create my own translation of {3, 4, 6} to
{0, 1, 2}.

A first sketch looks like this:

mapping[rank] = {3, 4, 6}

for (auto i : mapping[rank]) {
  for (auto j : mapping[rank]) {
    cout << "Setting " << i << ", " << j << endl;
    PetscScalar v[] = {i*10.0 + j};
    PetscInt rows[] = {i};
    PetscInt cols[] = {j};
    AOApplicationToPetsc(ao, 1, rows);
    AOApplicationToPetsc(ao, 1, cols);
    cout << "Really setting " << rows[0] << ", " << cols[0] << endl << endl;
    ierr = MatSetValues(matrix, 1, rows, 1, cols, v, INSERT_VALUES); CHKERRQ(ierr);
  }
}

which seems to work so far.

Any comments from your side are appreciated!

Of course, I should set as many values as possible using MatSetValues...

Best,
Florian
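[Note: a minimal sketch of where the 'ao' used above could come from, following
the AOCreateMapping route referenced in the reply quoted below; the array
contents are illustrative, not taken from Florian's actual code, and this is
assumed to sit inside a standard PETSc main():

    AO       ao;
    PetscInt app[3]   = {3, 4, 6};  /* application (natural) indices owned by this rank */
    PetscInt petsc[3] = {0, 1, 2};  /* the PETSc indices they should translate to */

    ierr = AOCreateMapping(PETSC_COMM_WORLD, 3, app, petsc, &ao);CHKERRQ(ierr);
    /* afterwards, AOApplicationToPetsc(ao, 1, rows) turns 3/4/6 into 0/1/2 in place */
    ierr = AODestroy(&ao);CHKERRQ(ierr);  /* once no longer needed */
]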
On 27.11.18 at 16:33, Matthew Knepley wrote:
> On Tue, Nov 27, 2018 at 10:17 AM Florian Lindner via petsc-users wrote:
>
> > Hello,
> >
> > I have a range of local input data indices that I want to use for row
> > indexing, say {3, 4, 6}.
> >
> > For that, I create a matrix with a local number of rows of 3 and map
> > the indices {3, 4, 5} to these rows.
>
> It seems like you are trying to create a mapping and its inverse. We do
> that in
>
> https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/AO/AOCreateMapping.html
>
> Note that this is not really scalable. Usually there is a way to
> restructure the code to avoid this.
>
>   Thanks,
>
>     Matt
>
> > I create an index set:
> >
> > ISCreateGeneral(comm, myIndizes.size(), myIndizes.data(), PETSC_COPY_VALUES, &ISlocal);
> > ISSetPermutation(ISlocal);
> > ISInvertPermutation(ISlocal, myIndizes.size(), &ISlocalInv);
> > ISAllGather(ISlocalInv, &ISglobal); // Gather the IS from all processors
> > ISLocalToGlobalMappingCreateIS(ISglobal, &ISmapping); // Make it a mapping
> >
> > MatSetLocalToGlobalMapping(matrix, ISmapping, ISmapping); // Set mapping for rows and cols
> >
> > The InvertPermutation is required because, well, it seems that this is
> > the direction the mapping works in PETSc.
> >
> > Now I can use MatSetValuesLocal to set rows {3, 4, 5} and actually set
> > the local rows {1, 2, 3}.
> >
> > The problem is that the range of data vertices is not always contiguous,
> > i.e. it could be {3, 4, 6}. This does not seem to be a permutation to
> > PETSc, therefore ISSetPermutation fails and subsequently ISInvertPermutation.
> >
> > How can I still create such a mapping, so that:
> >
> > 3 -> 1
> > 4 -> 2
> > 6 -> 6
> >
> > row 5 is unassigned and will never be addressed.
> >
> > Thanks!
> > Florian
>
> --
> What most experimenters take for granted before they begin their experiments
> is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/

From knepley at gmail.com Wed Nov 28 11:12:14 2018
From: knepley at gmail.com (Matthew Knepley)
Date: Wed, 28 Nov 2018 12:12:14 -0500
Subject: [petsc-users] IS Invert Not Permutation
In-Reply-To: <67b84dd1-1e2a-a06b-c5d7-d4d5697ef1bc@xgm.de>
References: <63cd6b99-5b2a-1b61-a006-d9b96bde9830@xgm.de> <67b84dd1-1e2a-a06b-c5d7-d4d5697ef1bc@xgm.de>
Message-ID:

On Wed, Nov 28, 2018 at 12:06 PM Florian Lindner wrote:

> Hey,
>
> thanks for your quick reply!
>
> As far as I understand how MatSetLocalToGlobalMapping works, there is no
> way to create such a mapping that can be used for MatSetValuesLocal?
>
> What I need is basically
>
> MatSetValuesLocal row/col argument = 3 / 4 / 6 maps to local row/col 0 / 1 / 2

You are inverting the meaning of MatSetValuesLocal(). It takes in local
indices (0, 1, 2) and sets the values into a global matrix using global
indices (3, 4, 6), which it gets from a compact table. If you have global
indices (3, 4, 6), use MatSetValues() directly. Why are you trying to
convert global indices to local indices, which requires a search?

  Thanks,

     Matt
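[Note: to make the direction of the table concrete, a minimal sketch of
MatSetLocalToGlobalMapping / MatSetValuesLocal; a single rank owning a 7x7
matrix is assumed for brevity, and all sizes are illustrative, not from the
thread:

    Mat                    A;
    ISLocalToGlobalMapping map;
    PetscInt               globals[3] = {3, 4, 6}; /* global indices for local 0, 1, 2 */
    PetscInt               lrow = 0, lcol = 0;
    PetscScalar            v = 1.0;

    ierr = MatCreateAIJ(PETSC_COMM_WORLD, 7, 7, PETSC_DETERMINE, PETSC_DETERMINE,
                        1, NULL, 0, NULL, &A);CHKERRQ(ierr);
    ierr = ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, 3, globals,
                                        PETSC_COPY_VALUES, &map);CHKERRQ(ierr);
    ierr = MatSetLocalToGlobalMapping(A, map, map);CHKERRQ(ierr);
    /* local (0, 0) goes through the table and lands at global (3, 3) */
    ierr = MatSetValuesLocal(A, 1, &lrow, 1, &lcol, &v, INSERT_VALUES);CHKERRQ(ierr);
    ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
]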
> It seems to me that if I set an index set with MatSetLocalToGlobalMapping
> like {3, 4, 6} it does the opposite, i.e. maps local {0, 1, 2} to global
> {3, 4, 6}.
>
> Therefore I probably need to create my own translation of {3, 4, 6} to
> {0, 1, 2}.
>
> A first sketch looks like this:
>
> mapping[rank] = {3, 4, 6}
>
> for (auto i : mapping[rank]) {
>   for (auto j : mapping[rank]) {
>     cout << "Setting " << i << ", " << j << endl;
>     PetscScalar v[] = {i*10.0 + j};
>     PetscInt rows[] = {i};
>     PetscInt cols[] = {j};
>     AOApplicationToPetsc(ao, 1, rows);
>     AOApplicationToPetsc(ao, 1, cols);
>     cout << "Really setting " << rows[0] << ", " << cols[0] << endl << endl;
>     ierr = MatSetValues(matrix, 1, rows, 1, cols, v, INSERT_VALUES); CHKERRQ(ierr);
>   }
> }
>
> which seems to work so far.
>
> Any comments from your side are appreciated!
>
> Of course, I should set as many values as possible using MatSetValues...
>
> Best,
> Florian
>
> On 27.11.18 at 16:33, Matthew Knepley wrote:
> > On Tue, Nov 27, 2018 at 10:17 AM Florian Lindner via petsc-users
> > <petsc-users at mcs.anl.gov> wrote:
> >
> > > Hello,
> > >
> > > I have a range of local input data indices that I want to use for
> > > row indexing, say {3, 4, 6}.
> > >
> > > For that, I create a matrix with a local number of rows of 3 and map
> > > the indices {3, 4, 5} to these rows.
> >
> > It seems like you are trying to create a mapping and its inverse. We do
> > that in
> >
> > https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/AO/AOCreateMapping.html
> >
> > Note that this is not really scalable. Usually there is a way to
> > restructure the code to avoid this.
> >
> >   Thanks,
> >
> >     Matt
> >
> > > I create an index set:
> > >
> > > ISCreateGeneral(comm, myIndizes.size(), myIndizes.data(), PETSC_COPY_VALUES, &ISlocal);
> > > ISSetPermutation(ISlocal);
> > > ISInvertPermutation(ISlocal, myIndizes.size(), &ISlocalInv);
> > > ISAllGather(ISlocalInv, &ISglobal); // Gather the IS from all processors
> > > ISLocalToGlobalMappingCreateIS(ISglobal, &ISmapping); // Make it a mapping
> > >
> > > MatSetLocalToGlobalMapping(matrix, ISmapping, ISmapping); // Set mapping for rows and cols
> > >
> > > The InvertPermutation is required because, well, it seems that this is
> > > the direction the mapping works in PETSc.
> > >
> > > Now I can use MatSetValuesLocal to set rows {3, 4, 5} and actually set
> > > the local rows {1, 2, 3}.
> > >
> > > The problem is that the range of data vertices is not always contiguous,
> > > i.e. it could be {3, 4, 6}. This does not seem to be a permutation to
> > > PETSc, therefore ISSetPermutation fails and subsequently ISInvertPermutation.
> > >
> > > How can I still create such a mapping, so that:
> > >
> > > 3 -> 1
> > > 4 -> 2
> > > 6 -> 6
> > >
> > > row 5 is unassigned and will never be addressed.
> > >
> > > Thanks!
> > > Florian
> >
> > --
> > What most experimenters take for granted before they begin their
> > experiments is infinitely more interesting than any results to which their
> > experiments lead.
> > -- Norbert Wiener
> >
> > https://www.cse.buffalo.edu/~knepley/ <http://www.cse.buffalo.edu/~knepley/>

--
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From danyang.su at gmail.com Wed Nov 28 13:51:45 2018
From: danyang.su at gmail.com (Danyang Su)
Date: Wed, 28 Nov 2018 11:51:45 -0800
Subject: [petsc-users] DMPlex global to natural ordering
Message-ID:

Dear All,

My simulation needs to pass initial condition from external file to the
code. The initial condition in the external file is given in the natural
ordering, which is consistent with the original input mesh file in VTK
format. In the previous development, I use label to save all the local
vertex numbering to global natural numbering. The code works, but it is not
efficient in DMPlexDistribute when the mesh size is increased with a huge
number of different labels.

I am looking for a more efficient way to pass the natural ordering data and
let each processor read the local owned data (either with or without ghost
nodes). I am trying to use DMPlexGetGlobalToNaturalSF and generate local
node index to global natural node index. Due to my lack of knowledge about
graph theory, I am a bit lost. Is there any example available to do this?
Thanks,

Danyang

From bourdin at lsu.edu Wed Nov 28 13:54:52 2018
From: bourdin at lsu.edu (Blaise A Bourdin)
Date: Wed, 28 Nov 2018 19:54:52 +0000
Subject: [petsc-users] DMPlex global to natural ordering
In-Reply-To: References: Message-ID: <6C862C1F-77E3-406E-8BB4-2DB40880D458@lsu.edu>

Have a look at src/dm/impls/plex/examples/tests/ex26.c. It does exactly this
(among other stuff) using the Exodus format.

Blaise

> On Nov 28, 2018, at 1:51 PM, Danyang Su via petsc-users wrote:
>
> Dear All,
>
> My simulation needs to pass initial condition from external file to the
> code. The initial condition in the external file is given in the natural
> ordering, which is consistent with the original input mesh file in VTK
> format. In the previous development, I use label to save all the local
> vertex numbering to global natural numbering. The code works, but it is not
> efficient in DMPlexDistribute when the mesh size is increased with a huge
> number of different labels.
>
> I am looking for a more efficient way to pass the natural ordering data and
> let each processor read the local owned data (either with or without ghost
> nodes). I am trying to use DMPlexGetGlobalToNaturalSF and generate local
> node index to global natural node index. Due to my lack of knowledge about
> graph theory, I am a bit lost. Is there any example available to do this?
>
> Thanks,
>
> Danyang

--
Department of Mathematics and Center for Computation & Technology
Louisiana State University, Baton Rouge, LA 70803, USA
Tel. +1 (225) 578 1612, Fax +1 (225) 578 4276 http://www.math.lsu.edu/~bourdin

From danyang.su at gmail.com Wed Nov 28 14:12:52 2018
From: danyang.su at gmail.com (Danyang Su)
Date: Wed, 28 Nov 2018 12:12:52 -0800
Subject: [petsc-users] DMPlex global to natural ordering
In-Reply-To: <6C862C1F-77E3-406E-8BB4-2DB40880D458@lsu.edu>
References: <6C862C1F-77E3-406E-8BB4-2DB40880D458@lsu.edu>
Message-ID: <244f7e7f-7a12-b001-df54-0c97798c064e@gmail.com>

Hi Blaise,

Thanks for the quick reply. That's a useful example.

Danyang

On 2018-11-28 11:54 a.m., Blaise A Bourdin wrote:
> Have a look at src/dm/impls/plex/examples/tests/ex26.c. It does exactly this
> (among other stuff) using the Exodus format.
>
> Blaise
>
>> On Nov 28, 2018, at 1:51 PM, Danyang Su via petsc-users wrote:
>>
>> Dear All,
>>
>> My simulation needs to pass initial condition from external file to the
>> code. The initial condition in the external file is given in the natural
>> ordering, which is consistent with the original input mesh file in VTK
>> format. In the previous development, I use label to save all the local
>> vertex numbering to global natural numbering. The code works, but it is not
>> efficient in DMPlexDistribute when the mesh size is increased with a huge
>> number of different labels.
>>
>> I am looking for a more efficient way to pass the natural ordering data and
>> let each processor read the local owned data (either with or without ghost
>> nodes). I am trying to use DMPlexGetGlobalToNaturalSF and generate local
>> node index to global natural node index. Due to my lack of knowledge about
>> graph theory, I am a bit lost. Is there any example available to do this?
>> Thanks,
>>
>> Danyang

From markus.lohmayer at fau.de Wed Nov 28 14:50:52 2018
From: markus.lohmayer at fau.de (Markus Lohmayer)
Date: Wed, 28 Nov 2018 21:50:52 +0100
Subject: [petsc-users] A question regarding a potential use case for DMNetwork
Message-ID:

Dear PETSc users and developers,

in particular those experienced with the relatively new DMNetwork object,
I would like to get some advice on whether it makes sense for my application
to be built using this PETSc feature or if I am equally well served if I use
plain Vec and Mat objects.

In the latter case, you might nevertheless have some good advice for a novice
PETSc user or you might know about something that helps me to come up with a
well-architected solution.

So the application context is closely linked to LTI state-space models (and
in particular two-port / n-port network models and their interconnections):

x,t = A x + B u
y   = C x + D u

More specifically, input 'u' and output 'y' are vectors (of the same length
for all components). Different components will have different dimensions of
state 'x' (and hence also 'A', 'B', 'C').

These component models then have to be interconnected according to a given
topology: Some pair of outputs of model 1 feeds into the corresponding pair
of inputs of model 2 (and also the other way round by symmetry), etc.

After most (or even all) of the original inputs 'u_i' / outputs 'y_i' have
been eliminated (based on the given interconnection structure amongst
components), it will be necessary to use an iterative eigenvalue solver to
obtain eigenvectors for some interesting part of the spectrum.

The models will probably not be "very large" in the foreseeable future, but
this still doesn't make e.g. MATLAB's control toolbox an option.

I have seen the presentation by Hong Zhang (1) at this year's user meeting
and I have looked at the paper "Scalable Multiphysics Network Simulation
Using PETSc DMNetwork". The use cases and concept of network presented
therein were slightly different from this one of interconnected multi-ports
in state-space formulation.

Thank you very much for your advice and reading this post.

Best,
Markus

From fdkong.jd at gmail.com Wed Nov 28 15:20:00 2018
From: fdkong.jd at gmail.com (Fande Kong)
Date: Wed, 28 Nov 2018 14:20:00 -0700
Subject: [petsc-users] Do we support to use NORM_X instead of NORM_2 to check the convergence of KSP and SNES?
Message-ID:

Hi Developers,

I just checked into the SNES and KSP code. We always hard code the Vec Norm
as NORM_2 when computing the linear and nonlinear residuals.

Does this mean we have to use norm_2 to check the convergence for both SNES
and KSP?

Fande,
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From hzhang at mcs.anl.gov Wed Nov 28 15:50:09 2018
From: hzhang at mcs.anl.gov (Zhang, Hong)
Date: Wed, 28 Nov 2018 21:50:09 +0000
Subject: [petsc-users] A question regarding a potential use case for DMNetwork
In-Reply-To: References: Message-ID:

Markus:
Can you provide a concrete example of your application? Is your model built
on a network? If so, give us a simple example of such a network and the math
systems over it. This would help us understand your request.
Hong

Dear PETSc users and developers,

in particular those experienced with the relatively new DMNetwork object,
I would like to get some advice on whether it makes sense for my application
to be built using this PETSc feature or if I am equally well served if I use
plain Vec and Mat objects.
In the latter case, you might nevertheless have some good advice for a novice
PETSc user or you might know about something that helps me to come up with a
well-architected solution.

So the application context is closely linked to LTI state-space models (and
in particular two-port / n-port network models and their interconnections):

x,t = A x + B u
y   = C x + D u

More specifically, input 'u' and output 'y' are vectors (of the same length
for all components). Different components will have different dimensions of
state 'x' (and hence also 'A', 'B', 'C').

These component models then have to be interconnected according to a given
topology: Some pair of outputs of model 1 feeds into the corresponding pair
of inputs of model 2 (and also the other way round by symmetry), etc.

After most (or even all) of the original inputs 'u_i' / outputs 'y_i' have
been eliminated (based on the given interconnection structure amongst
components), it will be necessary to use an iterative eigenvalue solver to
obtain eigenvectors for some interesting part of the spectrum.

The models will probably not be "very large" in the foreseeable future, but
this still doesn't make e.g. MATLAB's control toolbox an option.

I have seen the presentation by Hong Zhang (1) at this year's user meeting
and I have looked at the paper "Scalable Multiphysics Network Simulation
Using PETSc DMNetwork". The use cases and concept of network presented
therein were slightly different from this one of interconnected multi-ports
in state-space formulation.

Thank you very much for your advice and reading this post.

Best,
Markus
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From bsmith at mcs.anl.gov Wed Nov 28 16:03:26 2018
From: bsmith at mcs.anl.gov (Smith, Barry F.)
Date: Wed, 28 Nov 2018 22:03:26 +0000
Subject: [petsc-users] Do we support to use NORM_X instead of NORM_2 to check the convergence of KSP and SNES?
In-Reply-To: References: Message-ID: <9A9B128F-AA95-40C7-BB28-31F687E9E65A@anl.gov>

> On Nov 28, 2018, at 3:20 PM, Fande Kong via petsc-users wrote:
>
> Hi Developers,
>
> I just checked into the SNES and KSP code. We always hard code the Vec Norm
> as NORM_2 when computing the linear and nonlinear residuals.
>
> Does this mean we have to use norm_2 to check the convergence for both SNES
> and KSP?

No, you can in theory use some other norm in a custom convergence test. BUT
you get the NORM_2 essentially "for free", while using some other norm
requires you to compute that norm. For SNES it is no big deal, just the cost
of a VecNorm(NORM_INFINITY), for example. But for GMRES it is very expensive:
one must compute the current solution, then compute the residual, then
compute the residual's norm, because GMRES uses a recursive formula for the
NORM_2 and does not actually compute the solution or the residual at each
iteration, only at the end.

   Barry

>
> Fande,
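[Note: a minimal C sketch of the custom convergence test Barry describes,
using the infinity norm; the function name and hard-wired tolerance are
illustrative, and the KSPBuildResidual call is exactly the expensive step for
GMRES that he points out:

    #include <petscksp.h>

    /* declare convergence once the infinity norm of the true residual is small */
    static PetscErrorCode MyInfNormTest(KSP ksp, PetscInt it, PetscReal rnorm2,
                                        KSPConvergedReason *reason, void *ctx)
    {
      Vec            r;
      PetscReal      rinf;
      PetscErrorCode ierr;

      PetscFunctionBeginUser;
      ierr = KSPBuildResidual(ksp, NULL, NULL, &r);CHKERRQ(ierr); /* builds the solution and residual */
      ierr = VecNorm(r, NORM_INFINITY, &rinf);CHKERRQ(ierr);
      ierr = VecDestroy(&r);CHKERRQ(ierr);
      *reason = (rinf < 1.e-8) ? KSP_CONVERGED_ATOL : KSP_CONVERGED_ITERATING;
      PetscFunctionReturn(0);
    }

installed before KSPSolve() with

    ierr = KSPSetConvergenceTest(ksp, MyInfNormTest, NULL, NULL);CHKERRQ(ierr);
]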
> > In the latter case, you might nevertheless have some good advice for a > novice PETSc user > or you might know about something that helps me to come up with a well > architected solution. > > So the application context is closely linked to LTI state-space models > (and in particular two-port / n-port network models and their > interconnections): > x,t = A x + B u > y = C x + D u > > More specifically, input ?u' and output ?y' are vectors (of same length > for all components). > Different components will have different dimension of state ?x' (and hence > also ?A', ?B', ?C'). > > These component models then have to be interconnected according to a given > topology: > If the interconnection topology is 1D, then yes DMNetwork is designed to do this. At the simplest level, you could just use a 1D DMPlex, which is inside DMNetwork, and hand code the Section (data layout) and residual/Jacobian assembly. DMNetwork is a higher level around this that lets you put "components" down on vertices and edges that translate to data layout and residual evaluation. I confess to not understanding the residual part all the way. If you start trying to modify an example, we can help you I think. Thanks, Matt > Some pair of outputs of model 1 feeds into the corresponding pair of > inputs of model 2 (and also the other way round by symmetry), etc. > > After most (or even all) of the original inputs 'u_i' / outputs 'y_i' have > been eliminated (based on the given interconnection structure amongst > components), > it will be necessary to use an iterative eigenvalue solver to obtain > eigenvectors for some interesting part of the spectrum. > > The models will probably not be "very large" in the foreseeable future but > this still doesn?t make e.g. MATLAB?s control toolbox an option. > > I have seen the presentation by Hong Zhang (1) at this year?s user meeting > and I have looked at the paper ?Scalable Multiphysics Network Simulation > Using PETSc DMNetwork?. > The use cases and concept of network presented therein were slightly > different from this one of interconnected multi-ports in state-space > formulation. > > Thanks you very much for your advice and reading this post. > > Best, > Markus -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From danyang.su at gmail.com Wed Nov 28 19:56:56 2018 From: danyang.su at gmail.com (Danyang Su) Date: Wed, 28 Nov 2018 17:56:56 -0800 Subject: [petsc-users] Error: DM global to natural SF was not created when DMSetUseNatural has already been called Message-ID: Dear All, I got the following error when using DMPlexGlobalToNatural function using 1 processor. [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [0]PETSC ERROR: Object is in wrong state [0]PETSC ERROR: DM global to natural SF was not created. You must call DMSetUseNatural() before DMPlexDistribute(). [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. [0]PETSC ERROR: Petsc Release Version 3.10.2, Oct, 09, 2018 The same code does not return error when using more than 2 processors, however, the vec_natural is always zero after calling DMPlexGlobalToNaturalEnd. DMSetUseNatural() has already been used before calling DMPlexDistribute. The code section looks like below ???? 
From danyang.su at gmail.com Wed Nov 28 19:56:56 2018
From: danyang.su at gmail.com (Danyang Su)
Date: Wed, 28 Nov 2018 17:56:56 -0800
Subject: [petsc-users] Error: DM global to natural SF was not created when DMSetUseNatural has already been called
Message-ID:

Dear All,

I got the following error when using the DMPlexGlobalToNatural functions on
1 processor.

[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: DM global to natural SF was not created. You must call DMSetUseNatural() before DMPlexDistribute().
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.10.2, Oct, 09, 2018

The same code does not return an error when using more than 2 processors;
however, vec_natural is always zero after calling DMPlexGlobalToNaturalEnd.
DMSetUseNatural() has already been called before DMPlexDistribute(). The
code section looks like below:

      if (rank == 0) then
        call DMPlexCreateFromCellList(Petsc_Comm_World,ndim,0,0,     &
                                      num_nodes_per_cell,            &
                                      Petsc_False,dmplex_cells,ndim, &  !use Petsc_True to create intermediate mesh entities (faces, edges)
                                      dmplex_verts,dmda_flow%da,ierr)
        CHKERRQ(ierr)
      end if

      !c Set the flag for creating a mapping to the natural order on distribution
      call DMSetUseNatural(dmda_flow%da,PETSC_TRUE,ierr)
      CHKERRQ(ierr)

      !c distribute mesh over processes
      call DMPlexDistribute(dmda_flow%da,stencil_width,              &
                            PETSC_NULL_SF,distributedMesh,ierr)
      CHKERRQ(ierr)

      !c destroy original global mesh after distribution
      if (distributedMesh /= PETSC_NULL_DM) then
        call DMDestroy(dmda_flow%da,ierr)
        CHKERRQ(ierr)
        !c set the global mesh as distributed mesh
        dmda_flow%da = distributedMesh
      end if

      ...

      call DMPlexCreateSection(dmda_flow%da,dmda_flow%dim,           &
                               numFields,pNumComp,pNumDof,           &
                               numBC,pBcField,                       &
                               pBcCompIS,pBcPointIS,                 &
                               PETSC_NULL_IS,                        &
                               section,ierr)
      CHKERRQ(ierr)

      call PetscSectionSetFieldName(section,0,'flow',ierr)
      CHKERRQ(ierr)

      call DMSetSection(dmda_flow%da,section,ierr)
      CHKERRQ(ierr)

      call PetscSectionDestroy(section,ierr)
      CHKERRQ(ierr)

      call DMSetUp(dmda_flow%da,ierr)
      CHKERRQ(ierr)

      ...

      !c global - natural order
      call DMCreateLocalVector(dmda_flow%da,vec_loc,ierr)
      CHKERRQ(ierr)

      call DMCreateGlobalVector(dmda_flow%da,vec_global,ierr)
      CHKERRQ(ierr)

      call DMCreateGlobalVector(dmda_flow%da,vec_natural,ierr)
      CHKERRQ(ierr)

      !c zero entries
      call VecZeroEntries(vec_loc,ierr)
      CHKERRQ(ierr)

      !Get a pointer to vector data when you need access to the array
      call VecGetArrayF90(vec_loc,vecpointer,ierr)
      CHKERRQ(ierr)

      do inode = 1, num_nodes
        vecpointer(inode) = node_idx_lg2pg(inode)  !vector value using PETSc global order, negative ghost index has been reversed
      end do

      !Restore the vector when you no longer need access to the array
      call VecRestoreArrayF90(vec_loc,vecpointer,ierr)
      CHKERRQ(ierr)

      !Insert values into global vector
      call DMLocalToGlobalBegin(dmda_flow%da,vec_loc,INSERT_VALUES,  &
                                vec_global,ierr)
      CHKERRQ(ierr)

      call DMLocalToGlobalEnd(dmda_flow%da,vec_loc,INSERT_VALUES,    &
                              vec_global,ierr)
      CHKERRQ(ierr)

      !c global to natural ordering
      call DMPlexGlobalToNaturalBegin(dmda_flow%da,vec_global,       &
                                      vec_natural,ierr)
      CHKERRQ(ierr)

      call DMPlexGlobalToNaturalEnd(dmda_flow%da,vec_global,         &
                                    vec_natural,ierr)
      CHKERRQ(ierr)

Is there anything missing in the code such that DMPlexGlobalToNatural... does
not work properly?

Thanks,

Danyang
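[Note: a bare C sketch of the global-to-natural sequence discussed in this
and the earlier DMPlex thread (cf. the ex26.c test referenced above); it
assumes dm already carries a PetscSection, and the single-rank comment is an
inference from the error message in this message, not something confirmed in
the thread:

    #include <petscdmplex.h>

    PetscErrorCode GlobalToNaturalSketch(DM *dm)
    {
      DM             dmDist = NULL;
      Vec            gvec, nvec;
      PetscErrorCode ierr;

      PetscFunctionBeginUser;
      ierr = DMSetUseNatural(*dm, PETSC_TRUE);CHKERRQ(ierr);    /* must precede DMPlexDistribute() */
      ierr = DMPlexDistribute(*dm, 0, NULL, &dmDist);CHKERRQ(ierr);
      if (dmDist) {      /* on a single rank dmDist can come back NULL, so no global-to-natural
                            SF is created, which would produce exactly the error above */
        ierr = DMDestroy(dm);CHKERRQ(ierr);
        *dm  = dmDist;
      }
      ierr = DMCreateGlobalVector(*dm, &gvec);CHKERRQ(ierr);
      ierr = VecDuplicate(gvec, &nvec);CHKERRQ(ierr);
      ierr = DMPlexGlobalToNaturalBegin(*dm, gvec, nvec);CHKERRQ(ierr);
      ierr = DMPlexGlobalToNaturalEnd(*dm, gvec, nvec);CHKERRQ(ierr);
      ierr = VecDestroy(&nvec);CHKERRQ(ierr);
      ierr = VecDestroy(&gvec);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }
]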
afiklst https://agnvwz.info/?agnvwz agnvwz http://bcfgkrvx.biz/?bcfgkrvx bcfgkrvx https://bdjmnuz.biz/?bdjmnuz bdjmnuz https://emqrwz.com/?emqrwz emqrwz https://egklqt.biz/?egklqt egklqt http://bimqr.com/?bimqr bimqr https://dhklnsv.net/?dhklnsv dhklnsv https://dvwxz.com/?dvwxz dvwxz https://dfghimu.com/?dfghimu dfghimu https://fostx.ru/?fostx fostx https://fjklnqux.ua/?fjklnqux fjklnqux https://cfhijnqu.info/?cfhijnqu cfhijnqu http://nptvy.ua/?nptvy nptvy http://bclnu.com/?bclnu bclnu http://eflntvyz.ru/?eflntvyz eflntvyz https://abcdpsx.ru/?abcdpsx abcdpsx https://cfipqx.info/?cfipqx cfipqx http://adforst.biz/?adforst adforst https://bfgmpqtv.biz/?bfgmpqtv bfgmpqtv http://cfkmnorv.ua/?cfkmnorv cfkmnorv http://dhjkuv.info/?dhjkuv dhjkuv https://bcejlsyz.info/?bcejlsyz bcejlsyz https://efinrsy.ua/?efinrsy efinrsy https://kmosw.net/?kmosw kmosw http://jmoxy.net/?jmoxy jmoxy http://agjloqy.biz/?agjloqy agjloqy https://dglmnpqy.com/?dglmnpqy dglmnpqy https://dijnqrsw.ua/?dijnqrsw dijnqrsw http://ijmnr.biz/?ijmnr ijmnr https://gjouwxy.ru/?gjouwxy gjouwxy https://klpuz.biz/?klpuz klpuz http://hijoux.net/?hijoux hijoux http://bikmn.net/?bikmn bikmn http://abcovwy.ua/?abcovwy abcovwy https://adfgioqx.ua/?adfgioqx adfgioqx http://beijknpq.info/?beijknpq beijknpq https://equvx.com/?equvx equvx https://cgmpr.net/?cgmpr cgmpr http://dfknor.com/?dfknor dfknor https://gopsvyz.ru/?gopsvyz gopsvyz http://chijkqsu.net/?chijkqsu chijkqsu https://bfmtx.biz/?bfmtx bfmtx http://cgijqrux.ua/?cgijqrux cgijqrux https://fklnovy.biz/?fklnovy fklnovy http://chkmosux.net/?chkmosux chkmosux https://cenprvw.info/?cenprvw cenprvw https://cfuxyz.ua/?cfuxyz cfuxyz https://bdgopy.ua/?bdgopy bdgopy https://ampwx.info/?ampwx ampwx http://cefmoqtx.info/?cefmoqtx cefmoqtx https://bcdflny.ua/?bcdflny bcdflny https://abqtux.ua/?abqtux abqtux https://bpstw.com/?bpstw bpstw http://cegmoqry.info/?cegmoqry cegmoqry http://aeghjlw.net/?aeghjlw aeghjlw https://egjswz.ru/?egjswz egjswz https://agiluv.info/?agiluv agiluv https://befimsu.biz/?befimsu befimsu https://abcdeiry.info/?abcdeiry abcdeiry http://bcqrxz.info/?bcqrxz bcqrxz https://eilntvxz.net/?eilntvxz eilntvxz https://abcjpt.ua/?abcjpt abcjpt https://adglox.ru/?adglox adglox https://adjnptw.ru/?adjnptw adjnptw http://aglrxz.biz/?aglrxz aglrxz https://abeqru.info/?abeqru abeqru https://cnoprsw.biz/?cnoprsw cnoprsw http://akoqx.com/?akoqx akoqx https://ijlmnptz.net/?ijlmnptz ijlmnptz https://ghimnuyz.com/?ghimnuyz ghimnuyz https://ghprs.ua/?ghprs ghprs https://lmntwxz.biz/?lmntwxz lmntwxz https://blopqxz.ua/?blopqxz blopqxz https://cqrxz.ua/?cqrxz cqrxz http://befmvwz.net/?befmvwz befmvwz https://abjlmnpr.com/?abjlmnpr abjlmnpr https://filotz.net/?filotz filotz http://hinsw.biz/?hinsw hinsw http://abfjklsy.net/?abfjklsy abfjklsy https://bhimnrwx.biz/?bhimnrwx bhimnrwx http://dejnotz.ru/?dejnotz dejnotz https://bcflsxz.info/?bcflsxz bcflsxz http://hklmsux.net/?hklmsux hklmsux https://ahknouz.ru/?ahknouz ahknouz https://cfknxy.ru/?cfknxy cfknxy http://acfntvx.net/?acfntvx acfntvx https://cdfksuvy.info/?cdfksuvy cdfksuvy https://abjkorx.ru/?abjkorx abjkorx https://afiklqvw.com/?afiklqvw afiklqvw http://adnovx.biz/?adnovx adnovx https://fglstyz.info/?fglstyz fglstyz https://ghikruy.biz/?ghikruy ghikruy http://ghnopsvz.biz/?ghnopsvz ghnopsvz http://abcgjnrw.com/?abcgjnrw abcgjnrw https://bijlnyz.ua/?bijlnyz bijlnyz http://ahiqrv.ru/?ahiqrv ahiqrv https://ajotw.biz/?ajotw ajotw http://bfhipuw.info/?bfhipuw bfhipuw http://acjmqry.ua/?acjmqry acjmqry http://efjnoqrz.info/?efjnoqrz 
efjnoqrz https://achkqst.info/?achkqst achkqst http://bcfpruyz.ua/?bcfpruyz bcfpruyz http://bcelv.com/?bcelv bcelv https://dfkoqsu.ua/?dfkoqsu dfkoqsu http://ekopq.ru/?ekopq ekopq https://abfkrsz.ua/?abfkrsz abfkrsz https://abenxy.net/?abenxy abenxy http://agiotu.biz/?agiotu agiotu https://bkpqr.info/?bkpqr bkpqr http://ahijq.ua/?ahijq ahijq http://eimps.ru/?eimps eimps https://abdstwz.net/?abdstwz abdstwz https://cdhjlu.biz/?cdhjlu cdhjlu http://gmrstyz.info/?gmrstyz gmrstyz https://djsvwyz.ua/?djsvwyz djsvwyz http://begjmorv.net/?begjmorv begjmorv http://ahlmoptz.biz/?ahlmoptz ahlmoptz http://dekoqx.ru/?dekoqx dekoqx http://fiprs.net/?fiprs fiprs http://cjklmr.biz/?cjklmr cjklmr https://aefgpquv.biz/?aefgpquv aefgpquv https://jlmopswx.net/?jlmopswx jlmopswx https://bfhjlsw.biz/?bfhjlsw bfhjlsw https://iknortux.com/?iknortux iknortux http://abhkoprv.biz/?abhkoprv abhkoprv https://cjnrwyz.ua/?cjnrwyz cjnrwyz http://djmnrvwz.info/?djmnrvwz djmnrvwz http://adgkrtyz.info/?adgkrtyz adgkrtyz https://bdhjsuv.ua/?bdhjsuv bdhjsuv http://acfgjor.biz/?acfgjor acfgjor http://cfilrtw.info/?cfilrtw cfilrtw https://fhijz.com/?fhijz fhijz http://hkmntxy.ru/?hkmntxy hkmntxy http://fhilsuy.ua/?fhilsuy fhilsuy http://jmnpwx.ru/?jmnpwx jmnpwx http://gikltvwy.ru/?gikltvwy gikltvwy http://jopqy.biz/?jopqy jopqy http://bdlqry.info/?bdlqry bdlqry https://ejlvwz.com/?ejlvwz ejlvwz http://ceijvz.net/?ceijvz ceijvz http://cfijntyz.info/?cfijntyz cfijntyz http://efilotuv.com/?efilotuv efilotuv http://cfmpyz.com/?cfmpyz cfmpyz http://dhpqz.com/?dhpqz dhpqz http://dlpxz.com/?dlpxz dlpxz http://aefjlyz.net/?aefjlyz aefjlyz https://ceijorz.info/?ceijorz ceijorz http://dkmnuvx.info/?dkmnuvx dkmnuvx https://kprswx.net/?kprswx kprswx https://dgjpwz.ru/?dgjpwz dgjpwz https://abfjlnt.ru/?abfjlnt abfjlnt http://efgips.com/?efgips efgips http://cehnpu.ru/?cehnpu cehnpu https://fstuw.net/?fstuw fstuw http://dmorz.net/?dmorz dmorz https://celmrwz.info/?celmrwz celmrwz https://jprvx.com/?jprvx jprvx https://cefjkxz.biz/?cefjkxz cefjkxz http://behjlv.biz/?behjlv behjlv http://bdiku.net/?bdiku bdiku https://ghnpxz.ru/?ghnpxz ghnpxz https://bcpqvwz.ua/?bcpqvwz bcpqvwz https://gijlnqrs.biz/?gijlnqrs gijlnqrs http://adfjlrsz.info/?adfjlrsz adfjlrsz https://afiptvz.biz/?afiptvz afiptvz http://acikors.com/?acikors acikors https://dfmpq.ru/?dfmpq dfmpq http://abhjmsyz.ru/?abhjmsyz abhjmsyz https://dejkq.com/?dejkq dejkq http://abcltx.com/?abcltx abcltx https://bdfjopu.net/?bdfjopu bdfjopu http://bipvy.com/?bipvy bipvy http://dlqsx.biz/?dlqsx dlqsx http://inqrstv.info/?inqrstv inqrstv https://dknwz.ru/?dknwz dknwz http://cdgimqsu.ru/?cdgimqsu cdgimqsu https://akmotuv.info/?akmotuv akmotuv http://kmntvyz.ru/?kmntvyz kmntvyz https://bimxz.net/?bimxz bimxz http://afjoqyz.ru/?afjoqyz afjoqyz https://eikuwy.ua/?eikuwy eikuwy https://bmptw.biz/?bmptw bmptw http://fgltwx.com/?fgltwx fgltwx http://chlnpqrs.com/?chlnpqrs chlnpqrs https://abhux.net/?abhux abhux http://adfhmw.info/?adfhmw adfhmw http://bgilt.com/?bgilt bgilt https://cefotu.biz/?cefotu cefotu https://adfruvwy.biz/?adfruvwy adfruvwy https://fgoqstu.info/?fgoqstu fgoqstu https://cdilnqvy.info/?cdilnqvy cdilnqvy https://adghruw.ua/?adghruw adghruw https://ehlnrsuy.biz/?ehlnrsuy ehlnrsuy http://abfijlnr.info/?abfijlnr abfijlnr https://jkpty.biz/?jkpty jkpty https://cdlpv.biz/?cdlpv cdlpv https://ginoxy.ua/?ginoxy ginoxy https://ciknpquy.net/?ciknpquy ciknpquy https://egijstw.biz/?egijstw egijstw http://cdikopvz.info/?cdikopvz cdikopvz https://egjklnvy.info/?egjklnvy egjklnvy 
https://fjkmvw.info/?fjkmvw fjkmvw http://dfkmv.com/?dfkmv dfkmv https://efgloptu.ru/?efgloptu efgloptu https://fgotz.ua/?fgotz fgotz http://efhklv.info/?efhklv efhklv https://befmrwy.info/?befmrwy befmrwy https://bklnsvy.biz/?bklnsvy bklnsvy http://flmoz.ru/?flmoz flmoz http://aiopqrtu.biz/?aiopqrtu aiopqrtu https://fjrsty.info/?fjrsty fjrsty http://behimuwy.ru/?behimuwy behimuwy https://iknqvz.com/?iknqvz iknqvz http://cdemp.ru/?cdemp cdemp http://eglpqsvz.net/?eglpqsvz eglpqsvz https://dluvz.net/?dluvz dluvz http://beghsuy.net/?beghsuy beghsuy http://ekmqsvz.net/?ekmqsvz ekmqsvz http://ehinz.net/?ehinz ehinz http://gipvz.com/?gipvz gipvz http://efhlwxyz.biz/?efhlwxyz efhlwxyz http://blors.ru/?blors blors http://dnqry.info/?dnqry dnqry https://cghnrux.biz/?cghnrux cghnrux http://dfinqtwx.ru/?dfinqtwx dfinqtwx http://efotxz.info/?efotxz efotxz http://aceglpqu.net/?aceglpqu aceglpqu https://abilr.ua/?abilr abilr http://bcgqw.ru/?bcgqw bcgqw http://cdinqv.biz/?cdinqv cdinqv https://cdfkp.com/?cdfkp cdfkp https://aelouvz.ua/?aelouvz aelouvz http://dklstuz.net/?dklstuz dklstuz https://cejqw.ru/?cejqw cejqw https://bdgkuy.com/?bdgkuy bdgkuy https://hkoprvxy.ru/?hkoprvxy hkoprvxy https://filntx.net/?filntx filntx http://cfgsty.ua/?cfgsty cfgsty https://bfgptx.net/?bfgptx bfgptx http://belpsu.biz/?belpsu belpsu http://deilmoy.biz/?deilmoy deilmoy https://kqruv.info/?kqruv kqruv https://acdst.net/?acdst acdst http://glmvy.ru/?glmvy glmvy https://abcmnu.net/?abcmnu abcmnu http://adehinxy.ru/?adehinxy adehinxy http://ahmnwy.biz/?ahmnwy ahmnwy http://morst.ru/?morst morst http://bfils.ua/?bfils bfils http://abcmnvw.biz/?abcmnvw abcmnvw https://bchlqsux.net/?bchlqsux bchlqsux http://fhikpqvy.net/?fhikpqvy fhikpqvy https://bejkqu.info/?bejkqu bejkqu https://cdehnsxy.ru/?cdehnsxy cdehnsxy http://bcdhnopx.info/?bcdhnopx bcdhnopx https://elpuz.ua/?elpuz elpuz https://bdeftvw.info/?bdeftvw bdeftvw http://dgmnou.ru/?dgmnou dgmnou https://aiklnvx.com/?aiklnvx aiklnvx http://ijoqrsvw.com/?ijoqrsvw ijoqrsvw http://cdehmquy.biz/?cdehmquy cdehmquy https://gjoqx.net/?gjoqx gjoqx https://bhlnpqvw.net/?bhlnpqvw bhlnpqvw http://bdfiuwyz.net/?bdfiuwyz bdfiuwyz https://abesty.info/?abesty abesty https://gjlmtuv.net/?gjlmtuv gjlmtuv http://ivxyz.info/?ivxyz ivxyz http://aeops.info/?aeops aeops http://acdlv.ru/?acdlv acdlv http://fgimqw.net/?fgimqw fgimqw https://fgkrxz.info/?fgkrxz fgkrxz http://abfhnvxy.ua/?abfhnvxy abfhnvxy https://bcdeipsy.ru/?bcdeipsy bcdeipsy http://bdimvwy.ru/?bdimvwy bdimvwy http://fnpqry.info/?fnpqry fnpqry https://aklstwx.ru/?aklstwx aklstwx http://bjloq.com/?bjloq bjloq http://dkmsuv.info/?dkmsuv dkmsuv http://afinptuw.ru/?afinptuw afinptuw https://bijnqvx.biz/?bijnqvx bijnqvx https://behlxz.info/?behlxz behlxz http://bgjkquyz.biz/?bgjkquyz bgjkquyz http://aehnpqw.net/?aehnpqw aehnpqw http://cghjmpqu.com/?cghjmpqu cghjmpqu http://aboruy.com/?aboruy aboruy http://fjryz.ru/?fjryz fjryz https://cdnqy.info/?cdnqy cdnqy http://bfgjmost.net/?bfgjmost bfgjmost https://bhnou.ua/?bhnou bhnou https://dhilm.com/?dhilm dhilm http://abijmrtx.net/?abijmrtx abijmrtx http://acefnv.com/?acefnv acefnv http://gilmtu.com/?gilmtu gilmtu https://egjltuz.biz/?egjltuz egjltuz http://afghpsv.net/?afghpsv afghpsv http://chkltvx.ua/?chkltvx chkltvx http://befkyz.com/?befkyz befkyz https://bcfhmprw.com/?bcfhmprw bcfhmprw https://bjklpqs.info/?bjklpqs bjklpqs http://cegpt.ua/?cegpt cegpt https://klntvw.biz/?klntvw klntvw https://ejkmq.ua/?ejkmq ejkmq https://bgqvx.info/?bgqvx bgqvx https://dqrsuvw.com/?dqrsuvw dqrsuvw 
http://bcekrsu.net/?bcekrsu bcekrsu https://dlmqwy.com/?dlmqwy dlmqwy http://abhnsvz.ru/?abhnsvz abhnsvz https://bdfhjoqr.info/?bdfhjoqr bdfhjoqr http://ceijlprw.net/?ceijlprw ceijlprw https://ijlqsw.net/?ijlqsw ijlqsw https://einrvxz.net/?einrvxz einrvxz https://ikmtxz.net/?ikmtxz ikmtxz http://adkmqsz.biz/?adkmqsz adkmqsz http://bchnox.com/?bchnox bchnox https://ckptvw.com/?ckptvw ckptvw http://dgops.com/?dgops dgops https://bfnrtxy.info/?bfnrtxy bfnrtxy http://bdnpstu.info/?bdnpstu bdnpstu http://dlnprsx.ua/?dlnprsx dlnprsx https://gjnrx.com/?gjnrx gjnrx http://abcsvz.com/?abcsvz abcsvz http://gknrsv.ru/?gknrsv gknrsv https://fkmouxz.com/?fkmouxz fkmouxz http://bcpsx.com/?bcpsx bcpsx http://bckoqx.info/?bckoqx bckoqx https://bcdepq.ru/?bcdepq bcdepq http://kqswy.net/?kqswy kqswy https://bdgjnuy.ru/?bdgjnuy bdgjnuy https://cgiklsvz.info/?cgiklsvz cgiklsvz https://bfgnprwy.net/?bfgnprwy bfgnprwy https://bejmnqrw.com/?bejmnqrw bejmnqrw http://gilopqwy.ua/?gilopqwy gilopqwy https://fjopry.ru/?fjopry fjopry http://kmrvwy.biz/?kmrvwy kmrvwy https://eglotuv.net/?eglotuv eglotuv http://cdhnsvwx.com/?cdhnsvwx cdhnsvwx http://ahmqv.com/?ahmqv ahmqv https://fkqxy.ua/?fkqxy fkqxy https://gjoquvwx.net/?gjoquvwx gjoquvwx https://abcfouw.biz/?abcfouw abcfouw https://abcekn.ru/?abcekn abcekn http://ijnruvz.com/?ijnruvz ijnruvz http://mqruv.ua/?mqruv mqruv https://afklmu.biz/?afklmu afklmu https://abcgptyz.biz/?abcgptyz abcgptyz https://bfjsuz.ua/?bfjsuz bfjsuz http://ahmpsx.ua/?ahmpsx ahmpsx https://iklns.com/?iklns iklns https://hlnvx.ua/?hlnvx hlnvx http://adiklmp.com/?adiklmp adiklmp https://fghprs.biz/?fghprs fghprs http://acmopvw.ru/?acmopvw acmopvw http://cdmuw.ua/?cdmuw cdmuw http://bfkpwxz.biz/?bfkpwxz bfkpwxz http://lqrtwy.ru/?lqrtwy lqrtwy https://aclmruvw.biz/?aclmruvw aclmruvw http://bcfhmsz.net/?bcfhmsz bcfhmsz http://bcgknpry.biz/?bcgknpry bcgknpry https://bflmnoq.info/?bflmnoq bflmnoq https://bepsy.info/?bepsy bepsy http://gknosuw.net/?gknosuw gknosuw http://ajoruvz.com/?ajoruvz ajoruvz http://cotxz.ru/?cotxz cotxz https://deintwx.net/?deintwx deintwx https://eghijkuy.com/?eghijkuy eghijkuy http://dginwx.ua/?dginwx dginwx http://aenoxy.net/?aenoxy aenoxy https://dghlpuvx.ru/?dghlpuvx dghlpuvx http://cefor.info/?cefor cefor https://afioz.biz/?afioz afioz https://deklmuvw.com/?deklmuvw deklmuvw http://cfgimox.biz/?cfgimox cfgimox http://cinquvz.ua/?cinquvz cinquvz https://acilsy.ua/?acilsy acilsy http://bfhlnop.net/?bfhlnop bfhlnop https://fijluz.net/?fijluz fijluz https://dfjnostv.ua/?dfjnostv dfjnostv https://aginsty.net/?aginsty aginsty http://iknxyz.biz/?iknxyz iknxyz http://agopu.net/?agopu agopu http://bdkopst.ua/?bdkopst bdkopst https://aghlox.net/?aghlox aghlox https://cdgpu.com/?cdgpu cdgpu https://jkmpuwyz.net/?jkmpuwyz jkmpuwyz http://ijlmuvw.com/?ijlmuvw ijlmuvw https://gkmrtvwy.biz/?gkmrtvwy gkmrtvwy http://aefhty.info/?aefhty aefhty http://afjklsux.net/?afjklsux afjklsux http://mprvyz.biz/?mprvyz mprvyz https://bdjkpx.ua/?bdjkpx bdjkpx http://aciknovw.net/?aciknovw aciknovw https://aftuxy.ua/?aftuxy aftuxy https://ajnpvxyz.ua/?ajnpvxyz ajnpvxyz http://fijqrv.com/?fijqrv fijqrv https://ahmquvw.info/?ahmquvw ahmquvw https://ghklmqvy.com/?ghklmqvy ghklmqvy http://aehkty.com/?aehkty aehkty https://aclpsuz.ru/?aclpsuz aclpsuz https://ghoqstvy.com/?ghoqstvy ghoqstvy https://dghnqrty.ru/?dghnqrty dghnqrty https://agorx.com/?agorx agorx https://adejrv.ru/?adejrv adejrv http://hkopuvwz.net/?hkopuvwz hkopuvwz http://bcdeghvx.ru/?bcdeghvx bcdeghvx http://bdfgruvy.com/?bdfgruvy bdfgruvy 
https://cmuvy.biz/?cmuvy cmuvy https://abjkmp.ua/?abjkmp abjkmp http://abcfirvz.ru/?abcfirvz abcfirvz http://clqruwxy.com/?clqruwxy clqruwxy https://fhikoq.net/?fhikoq fhikoq https://bfgnsvwz.biz/?bfgnsvwz bfgnsvwz http://fhjkor.biz/?fhjkor fhjkor https://dfgkouyz.com/?dfgkouyz dfgkouyz https://acfhmqtz.info/?acfhmqtz acfhmqtz https://ejmnpt.biz/?ejmnpt ejmnpt http://dfgnrtu.ru/?dfgnrtu dfgnrtu https://diknswx.biz/?diknswx diknswx https://hjnsz.info/?hjnsz hjnsz https://cdjlq.ua/?cdjlq cdjlq https://iowxy.ru/?iowxy iowxy https://fhjltux.net/?fhjltux fhjltux http://ablmv.com/?ablmv ablmv http://bdiorswz.ru/?bdiorswz bdiorswz http://cnopr.ru/?cnopr cnopr http://adgopz.ru/?adgopz adgopz http://bdefimsu.ru/?bdefimsu bdefimsu http://ghmqr.net/?ghmqr ghmqr https://cdetv.ru/?cdetv cdetv https://bdkortu.biz/?bdkortu bdkortu http://djopv.biz/?djopv djopv http://aiknorw.biz/?aiknorw aiknorw https://bfmqr.net/?bfmqr bfmqr http://klmox.com/?klmox klmox https://fimqsv.biz/?fimqsv fimqsv https://fkopq.ru/?fkopq fkopq http://aglpx.com/?aglpx aglpx http://dfhquv.biz/?dfhquv dfhquv https://bfgjoq.com/?bfgjoq bfgjoq https://cklnz.ru/?cklnz cklnz http://cfijqrv.net/?cfijqrv cfijqrv https://aekuvxz.net/?aekuvxz aekuvxz https://ghijkstx.info/?ghijkstx ghijkstx http://abilmsz.info/?abilmsz abilmsz https://bfhsvyz.ru/?bfhsvyz bfhsvyz http://eqsuv.ua/?eqsuv eqsuv http://ghinqsx.biz/?ghinqsx ghinqsx https://acgkmnpu.com/?acgkmnpu acgkmnpu http://elrty.ru/?elrty elrty https://cdgkoz.net/?cdgkoz cdgkoz http://dimnosw.info/?dimnosw dimnosw http://abegnwx.biz/?abegnwx abegnwx http://abejnsx.net/?abejnsx abejnsx https://emnorxz.ru/?emnorxz emnorxz http://abfsz.ua/?abfsz abfsz https://imqrvy.net/?imqrvy imqrvy https://cdgjmqrv.com/?cdgjmqrv cdgjmqrv http://bdfostyz.net/?bdfostyz bdfostyz https://lnorxz.biz/?lnorxz lnorxz http://afgruvw.ru/?afgruvw afgruvw https://cefivyz.com/?cefivyz cefivyz https://gmqrt.com/?gmqrt gmqrt https://acdimqrw.info/?acdimqrw acdimqrw http://abchjm.ru/?abchjm abchjm http://alnqs.biz/?alnqs alnqs https://ijqwz.ua/?ijqwz ijqwz https://adfjstx.biz/?adfjstx adfjstx https://cghlqt.net/?cghlqt cghlqt http://ijrvyz.ua/?ijrvyz ijrvyz https://ahluv.net/?ahluv ahluv http://acdfgn.net/?acdfgn acdfgn https://hijnru.ua/?hijnru hijnru https://ghnuvy.ru/?ghnuvy ghnuvy http://aeiky.com/?aeiky aeiky https://bdilnov.info/?bdilnov bdilnov https://cjlnpqru.biz/?cjlnpqru cjlnpqru https://cgjqtv.biz/?cgjqtv cgjqtv https://abcnsvz.biz/?abcnsvz abcnsvz http://abegopsu.biz/?abegopsu abegopsu https://bcmstuvy.net/?bcmstuvy bcmstuvy https://bhjknsuz.net/?bhjknsuz bhjknsuz http://aceklno.info/?aceklno aceklno http://diotvyz.ru/?diotvyz diotvyz https://aefkmruw.info/?aefkmruw aefkmruw https://cdghlmnz.info/?cdghlmnz cdghlmnz https://bcgijmow.net/?bcgijmow bcgijmow http://moruwx.ua/?moruwx moruwx https://bhjnrt.ru/?bhjnrt bhjnrt https://eimoptu.ua/?eimoptu eimoptu https://achptv.info/?achptv achptv https://boptux.biz/?boptux boptux http://acmovz.ua/?acmovz acmovz https://abfmnpw.info/?abfmnpw abfmnpw https://afiknvz.biz/?afiknvz afiknvz http://ehnps.ua/?ehnps ehnps https://dnpuw.biz/?dnpuw dnpuw https://abflmns.biz/?abflmns abflmns https://fmuyz.biz/?fmuyz fmuyz http://bejkvz.info/?bejkvz bejkvz https://dfkmnyz.ua/?dfkmnyz dfkmnyz http://cfgnsvx.biz/?cfgnsvx cfgnsvx http://abdjsw.ru/?abdjsw abdjsw https://abelrvx.biz/?abelrvx abelrvx https://abjmoy.ru/?abjmoy abjmoy http://hmnqz.net/?hmnqz hmnqz https://dhkntv.ru/?dhkntv dhkntv http://bgqsyz.biz/?bgqsyz bgqsyz http://eghkmsux.net/?eghkmsux eghkmsux 
https://acehlrsv.info/?acehlrsv acehlrsv https://lnpruvy.com/?lnpruvy lnpruvy http://cefmtu.ua/?cefmtu cefmtu https://hjnqrt.net/?hjnqrt hjnqrt http://ceiktvw.ru/?ceiktvw ceiktvw http://cfjmu.info/?cfjmu cfjmu https://cfhmn.com/?cfhmn cfhmn http://cdfqr.net/?cdfqr cdfqr http://adelmvz.ru/?adelmvz adelmvz http://abflnsvx.com/?abflnsvx abflnsvx http://bjpqv.biz/?bjpqv bjpqv https://fjpqux.biz/?fjpqux fjpqux http://fpqtvyz.net/?fpqtvyz fpqtvyz http://cdflstwz.net/?cdflstwz cdflstwz https://egjostux.ua/?egjostux egjostux https://cjlnpy.ua/?cjlnpy cjlnpy https://bcfgmz.com/?bcfgmz bcfgmz http://dfnouz.ua/?dfnouz dfnouz http://cfikqv.net/?cfikqv cfikqv https://dehkqw.net/?dehkqw dehkqw http://fhkqs.com/?fhkqs fhkqs http://dijkl.com/?dijkl dijkl http://aeglu.com/?aeglu aeglu http://dgikmpy.info/?dgikmpy dgikmpy https://brsuz.info/?brsuz brsuz http://acfglopu.ru/?acfglopu acfglopu https://cdjqx.net/?cdjqx cdjqx https://aefimst.biz/?aefimst aefimst http://bfikmuw.net/?bfikmuw bfikmuw http://abhioqvw.ua/?abhioqvw abhioqvw http://abdilmo.net/?abdilmo abdilmo https://bhlovx.info/?bhlovx bhlovx https://fgvwxz.info/?fgvwxz fgvwxz http://afhjpqsv.biz/?afhjpqsv afhjpqsv http://deilrsv.com/?deilrsv deilrsv http://cijlmnt.ru/?cijlmnt cijlmnt https://bfkmtw.ua/?bfkmtw bfkmtw https://adkxy.ru/?adkxy adkxy https://blpqxy.com/?blpqxy blpqxy https://efgilor.net/?efgilor efgilor https://abcdky.info/?abcdky abcdky https://befgikqu.ru/?befgikqu befgikqu https://ipsuvwy.com/?ipsuvwy ipsuvwy https://fjlsu.biz/?fjlsu fjlsu https://kpquw.com/?kpquw kpquw https://bfjkoqvy.net/?bfjkoqvy bfjkoqvy http://dijopsz.biz/?dijopsz dijopsz http://acfhrsz.ua/?acfhrsz acfhrsz https://bfhrstvy.ua/?bfhrstvy bfhrstvy http://afglw.info/?afglw afglw https://bcjqrvw.net/?bcjqrvw bcjqrvw https://hiotz.ru/?hiotz hiotz https://abefmswx.biz/?abefmswx abefmswx http://acgkopyz.ua/?acgkopyz acgkopyz https://fijln.com/?fijln fijln http://befmquwz.info/?befmquwz befmquwz https://hijory.ua/?hijory hijory http://cdfhr.info/?cdfhr cdfhr https://ahiksx.ua/?ahiksx ahiksx https://aghmxy.ru/?aghmxy aghmxy https://hnopqt.net/?hnopqt hnopqt https://fiklnrsw.ru/?fiklnrsw fiklnrsw http://hknpv.ru/?hknpv hknpv https://abdfmotw.net/?abdfmotw abdfmotw http://bcfhpvx.biz/?bcfhpvx bcfhpvx http://beimrty.net/?beimrty beimrty http://dgnsvwy.ua/?dgnsvwy dgnsvwy https://aghjstvw.com/?aghjstvw aghjstvw http://gpqsuxy.ru/?gpqsuxy gpqsuxy https://finop.info/?finop finop http://bcdhsy.biz/?bcdhsy bcdhsy http://cfhmnvy.net/?cfhmnvy cfhmnvy https://hjopuw.net/?hjopuw hjopuw https://akmou.biz/?akmou akmou https://fgkntuw.biz/?fgkntuw fgkntuw https://adfjsw.com/?adfjsw adfjsw https://bdjmrsty.biz/?bdjmrsty bdjmrsty http://grtuz.ru/?grtuz grtuz https://cgiqw.info/?cgiqw cgiqw http://acenpvw.ru/?acenpvw acenpvw https://dmnrz.ru/?dmnrz dmnrz http://deglqtu.com/?deglqtu deglqtu http://ilpqrtuw.com/?ilpqrtuw ilpqrtuw https://gnoqvxz.com/?gnoqvxz gnoqvxz https://hijkmsu.ru/?hijkmsu hijkmsu http://cistxyz.biz/?cistxyz cistxyz https://adklmns.ru/?adklmns adklmns https://hmqrwy.ua/?hmqrwy hmqrwy https://crsuv.ru/?crsuv crsuv http://abejkmoz.ua/?abejkmoz abejkmoz http://abjpqu.ua/?abjpqu abjpqu https://ehjkopuz.info/?ehjkopuz ehjkopuz http://iovwz.ua/?iovwz iovwz https://emntvwx.net/?emntvwx emntvwx http://eghksx.info/?eghksx eghksx https://bmnps.com/?bmnps bmnps https://cinopuv.net/?cinopuv cinopuv http://efijnptz.biz/?efijnptz efijnptz http://cehilnp.net/?cehilnp cehilnp http://denprw.ru/?denprw denprw http://adglpwz.net/?adglpwz adglpwz http://dehimosu.biz/?dehimosu dehimosu 
https://abchprwy.com/?abchprwy abchprwy http://dlnory.info/?dlnory dlnory http://cehnwyz.com/?cehnwyz cehnwyz http://aghlopy.ua/?aghlopy aghlopy http://fhjksuvz.com/?fhjksuvz fhjksuvz http://bijmy.com/?bijmy bijmy https://adhjw.com/?adhjw adhjw https://emnprwyz.info/?emnprwyz emnprwyz http://dgmnw.ru/?dgmnw dgmnw https://cdgimruv.com/?cdgimruv cdgimruv http://bcdjqst.biz/?bcdjqst bcdjqst https://lmquv.ru/?lmquv lmquv http://degiptv.info/?degiptv degiptv http://hlnvy.ru/?hlnvy hlnvy http://gklmvx.info/?gklmvx gklmvx http://behlpstz.com/?behlpstz behlpstz https://efginz.com/?efginz efginz https://deilmnru.biz/?deilmnru deilmnru http://bdehoqux.ru/?bdehoqux bdehoqux http://doqvy.info/?doqvy doqvy http://jlorw.info/?jlorw jlorw https://cfmtvy.net/?cfmtvy cfmtvy http://agkmqvxy.biz/?agkmqvxy agkmqvxy https://ehnrux.ru/?ehnrux ehnrux https://ahlmpyz.biz/?ahlmpyz ahlmpyz https://dhilqxyz.com/?dhilqxyz dhilqxyz http://beosz.biz/?beosz beosz http://jkmtvx.ru/?jkmtvx jkmtvx http://aefiuw.ua/?aefiuw aefiuw http://bgimy.info/?bgimy bgimy https://acdfhimy.info/?acdfhimy acdfhimy http://bdfpv.info/?bdfpv bdfpv https://lnopux.info/?lnopux lnopux https://iknqstux.ua/?iknqstux iknqstux https://binxy.ru/?binxy binxy http://abflsv.net/?abflsv abflsv http://cghkmtvz.info/?cghkmtvz cghkmtvz http://flpqrxz.biz/?flpqrxz flpqrxz https://cfgnvz.ua/?cfgnvz cfgnvz https://jtuvx.com/?jtuvx jtuvx https://fgikl.ru/?fgikl fgikl http://ehikrs.ru/?ehikrs ehikrs http://beghqx.ru/?beghqx beghqx http://dhijlrsw.com/?dhijlrsw dhijlrsw http://elqst.ru/?elqst elqst https://abfklmns.biz/?abfklmns abfklmns https://cilnorw.ua/?cilnorw cilnorw https://anoqu.ua/?anoqu anoqu http://adikquz.com/?adikquz adikquz http://abcpwyz.ua/?abcpwyz abcpwyz https://fiqwxz.ru/?fiqwxz fiqwxz http://bcivw.ru/?bcivw bcivw https://acdfnrz.ua/?acdfnrz acdfnrz https://jmopqtw.info/?jmopqtw jmopqtw https://abdeghtv.biz/?abdeghtv abdeghtv http://aceoprw.net/?aceoprw aceoprw http://forvy.ua/?forvy forvy https://abgjknpv.com/?abgjknpv abgjknpv http://bcdgimr.info/?bcdgimr bcdgimr https://flntvw.biz/?flntvw flntvw http://bcdnqvx.net/?bcdnqvx bcdnqvx http://defjnqvw.com/?defjnqvw defjnqvw http://ekmouvwy.biz/?ekmouvwy ekmouvwy http://cijpqsxz.biz/?cijpqsxz cijpqsxz http://dopstyz.ua/?dopstyz dopstyz http://cinow.com/?cinow cinow http://cdejmrs.info/?cdejmrs cdejmrs http://fgknprwz.info/?fgknprwz fgknprwz https://binpquxy.com/?binpquxy binpquxy https://afhqrtu.biz/?afhqrtu afhqrtu http://cemqtv.ru/?cemqtv cemqtv http://bgnqsz.info/?bgnqsz bgnqsz http://cdfjprsv.ua/?cdfjprsv cdfjprsv https://jmqstz.biz/?jmqstz jmqstz http://adhuw.info/?adhuw adhuw http://deswz.ua/?deswz deswz https://ehkmnw.com/?ehkmnw ehkmnw http://kmsxy.ua/?kmsxy kmsxy http://benotvw.info/?benotvw benotvw https://dhinops.biz/?dhinops dhinops https://hklnpqtv.ru/?hklnpqtv hklnpqtv https://gjnqrtwx.info/?gjnqrtwx gjnqrtwx https://dortvw.biz/?dortvw dortvw http://cdghjp.com/?cdghjp cdghjp http://abchnoqx.com/?abchnoqx abchnoqx http://abnsz.biz/?abnsz abnsz https://acdnqwy.info/?acdnqwy acdnqwy http://befilpv.com/?befilpv befilpv http://fijrux.biz/?fijrux fijrux https://aknqrtuz.ru/?aknqrtuz aknqrtuz http://abfoqwz.ru/?abfoqwz abfoqwz https://defikv.info/?defikv defikv http://abcoqtz.com/?abcoqtz abcoqtz https://dfirxz.net/?dfirxz dfirxz http://bdfipqux.net/?bdfipqux bdfipqux http://bcgqx.net/?bcgqx bcgqx https://bdfhl.ua/?bdfhl bdfhl http://ahikmpqt.info/?ahikmpqt ahikmpqt https://aghqrt.com/?aghqrt aghqrt https://afhosxz.ru/?afhosxz afhosxz https://ehsvx.ru/?ehsvx ehsvx 
http://gknquvw.info/?gknquvw gknquvw https://gilmpqw.ru/?gilmpqw gilmpqw http://fhopy.biz/?fhopy fhopy https://bcmpux.net/?bcmpux bcmpux https://adfmrsuy.info/?adfmrsuy adfmrsuy https://fmtuvx.net/?fmtuvx fmtuvx https://defjmswy.net/?defjmswy defjmswy https://ahijmrsz.info/?ahijmrsz ahijmrsz https://cerwx.biz/?cerwx cerwx https://behqz.biz/?behqz behqz https://behioqyz.ru/?behioqyz behioqyz https://admrsvxz.com/?admrsvxz admrsvxz https://cefhipqw.com/?cefhipqw cefhipqw http://deopsx.ru/?deopsx deopsx http://acfgksz.info/?acfgksz acfgksz http://hjlmqst.info/?hjlmqst hjlmqst http://efgnrsux.net/?efgnrsux efgnrsux http://adflovz.info/?adflovz adflovz http://acopu.ua/?acopu acopu https://chilnqrs.ru/?chilnqrs chilnqrs https://blosy.net/?blosy blosy https://gijnpuxy.com/?gijnpuxy gijnpuxy https://bcemuvwz.ua/?bcemuvwz bcemuvwz http://bdfqwx.net/?bdfqwx bdfqwx http://dfghilr.net/?dfghilr dfghilr http://bdlqstv.com/?bdlqstv bdlqstv http://adhjnrsz.com/?adhjnrsz adhjnrsz http://adiopqx.com/?adiopqx adiopqx http://ehnoqw.info/?ehnoqw ehnoqw https://dgpsx.net/?dgpsx dgpsx http://afgrt.ru/?afgrt afgrt http://bghipuvz.ua/?bghipuvz bghipuvz http://egjmvw.com/?egjmvw egjmvw https://eginoswz.com/?eginoswz eginoswz http://bekuv.info/?bekuv bekuv http://bglno.info/?bglno bglno http://hkoptxz.com/?hkoptxz hkoptxz https://aikqz.biz/?aikqz aikqz https://oqvxy.net/?oqvxy oqvxy http://kruwy.net/?kruwy kruwy http://fhlqty.net/?fhlqty fhlqty http://chpuvyz.ru/?chpuvyz chpuvyz https://aciloqt.ua/?aciloqt aciloqt http://cklsw.info/?cklsw cklsw https://fghkmvy.info/?fghkmvy fghkmvy http://belmpvz.biz/?belmpvz belmpvz http://eimpxz.ru/?eimpxz eimpxz http://cdjstuz.info/?cdjstuz cdjstuz https://adivy.com/?adivy adivy https://alnrty.ru/?alnrty alnrty https://hkmntz.info/?hkmntz hkmntz https://dkpqr.com/?dkpqr dkpqr https://bfntv.ru/?bfntv bfntv https://inuvwyz.ru/?inuvwyz inuvwyz https://jkmrvyz.ua/?jkmrvyz jkmrvyz http://bfipruw.biz/?bfipruw bfipruw http://cjklos.ua/?cjklos cjklos http://afglru.ru/?afglru afglru http://defrt.ua/?defrt defrt https://afghsvy.ua/?afghsvy afghsvy http://cehiou.info/?cehiou cehiou https://mnrvz.com/?mnrvz mnrvz http://fgtuxz.biz/?fgtuxz fgtuxz http://egiqtuy.ua/?egiqtuy egiqtuy https://dilnv.ru/?dilnv dilnv http://abeny.biz/?abeny abeny http://aijvxz.biz/?aijvxz aijvxz https://abdhrsx.net/?abdhrsx abdhrsx http://bfgmrxy.ru/?bfgmrxy bfgmrxy https://fjkow.info/?fjkow fjkow https://fkruvx.biz/?fkruvx fkruvx http://efhjsw.biz/?efhjsw efhjsw http://djkmops.info/?djkmops djkmops https://cequyz.ru/?cequyz cequyz https://hmswz.com/?hmswz hmswz http://lnotu.info/?lnotu lnotu http://bdjqu.info/?bdjqu bdjqu http://adejrxy.ua/?adejrxy adejrxy http://cdgikmps.ru/?cdgikmps cdgikmps http://dhmostwy.net/?dhmostwy dhmostwy https://cdhlquz.ua/?cdhlquz cdhlquz https://bilnor.ua/?bilnor bilnor https://fnqxz.com/?fnqxz fnqxz https://bijnuwz.com/?bijnuwz bijnuwz http://abcgikwy.ua/?abcgikwy abcgikwy https://begimrv.ua/?begimrv begimrv https://jlprv.net/?jlprv jlprv http://efhsz.ua/?efhsz efhsz https://dgiqs.ua/?dgiqs dgiqs http://bgiprx.net/?bgiprx bgiprx https://cdklmqv.biz/?cdklmqv cdklmqv https://adgnowz.ua/?adgnowz adgnowz http://cfjnorty.ru/?cfjnorty cfjnorty https://bcfhpqvy.ru/?bcfhpqvy bcfhpqvy http://beiqv.biz/?beiqv beiqv https://bjmnu.com/?bjmnu bjmnu https://fgjoquvz.ru/?fgjoquvz fgjoquvz http://fkpwx.info/?fkpwx fkpwx http://dfghty.com/?dfghty dfghty https://dhlmtvxy.biz/?dhlmtvxy dhlmtvxy https://bejuy.com/?bejuy bejuy http://cdikry.com/?cdikry cdikry https://bfhkrty.info/?bfhkrty bfhkrty 
https://cekpst.com/?cekpst cekpst https://ceknpx.ru/?ceknpx ceknpx https://gjpwy.com/?gjpwy gjpwy https://ceouwxyz.ru/?ceouwxyz ceouwxyz https://ijkopstu.ua/?ijkopstu ijkopstu https://acegjlmo.net/?acegjlmo acegjlmo http://bflnpv.info/?bflnpv bflnpv http://fijkotz.ua/?fijkotz fijkotz http://ajkptwyz.com/?ajkptwyz ajkptwyz http://adjpvw.net/?adjpvw adjpvw https://dpqrsv.biz/?dpqrsv dpqrsv https://acehklps.ua/?acehklps acehklps http://aghsuvyz.biz/?aghsuvyz aghsuvyz https://akotu.ru/?akotu akotu https://dkqrt.biz/?dkqrt dkqrt http://dlrtuxz.info/?dlrtuxz dlrtuxz http://fstvx.ua/?fstvx fstvx http://ejmsv.ua/?ejmsv ejmsv http://abfmyz.com/?abfmyz abfmyz http://cdiqrsvx.info/?cdiqrsvx cdiqrsvx https://fqsuz.info/?fqsuz fqsuz https://cilsu.info/?cilsu cilsu https://bjklnsxz.net/?bjklnsxz bjklnsxz https://demnpst.biz/?demnpst demnpst https://bghikms.biz/?bghikms bghikms http://achjtu.biz/?achjtu achjtu http://npqrwz.net/?npqrwz npqrwz http://adehilpq.net/?adehilpq adehilpq http://bnortwy.com/?bnortwy bnortwy http://cegpsuz.ru/?cegpsuz cegpsuz http://aksyz.ua/?aksyz aksyz http://egkmosux.biz/?egkmosux egkmosux https://dimoruy.com/?dimoruy dimoruy https://ehopyz.net/?ehopyz ehopyz https://ehijptuy.com/?ehijptuy ehijptuy http://jklmprtv.ua/?jklmprtv jklmprtv http://abmnrw.ru/?abmnrw abmnrw http://abcfqu.ru/?abcfqu abcfqu https://degijyz.ru/?degijyz degijyz https://cikortuy.net/?cikortuy cikortuy http://abgnrw.ua/?abgnrw abgnrw http://bfgknpqt.com/?bfgknpqt bfgknpqt https://afmqtv.ru/?afmqtv afmqtv https://bertuvwz.info/?bertuvwz bertuvwz https://eipuvwxy.com/?eipuvwxy eipuvwxy http://ahjory.info/?ahjory ahjory http://acefvx.com/?acefvx acefvx http://acdlo.com/?acdlo acdlo http://cdjqs.biz/?cdjqs cdjqs http://bdexy.net/?bdexy bdexy http://bktxz.info/?bktxz bktxz http://adegmovy.biz/?adegmovy adegmovy http://fkxyz.ua/?fkxyz fkxyz http://adehinxy.ru/?adehinxy adehinxy http://ahmnwy.biz/?ahmnwy ahmnwy http://morst.ru/?morst morst http://bfils.ua/?bfils bfils http://abcmnvw.biz/?abcmnvw abcmnvw https://bchlqsux.net/?bchlqsux bchlqsux http://fhikpqvy.net/?fhikpqvy fhikpqvy https://bejkqu.info/?bejkqu bejkqu https://cdehnsxy.ru/?cdehnsxy cdehnsxy http://bcdhnopx.info/?bcdhnopx bcdhnopx https://elpuz.ua/?elpuz elpuz https://bdeftvw.info/?bdeftvw bdeftvw http://dgmnou.ru/?dgmnou dgmnou https://aiklnvx.com/?aiklnvx aiklnvx http://ijoqrsvw.com/?ijoqrsvw ijoqrsvw http://cdehmquy.biz/?cdehmquy cdehmquy https://gjoqx.net/?gjoqx gjoqx https://bhlnpqvw.net/?bhlnpqvw bhlnpqvw http://bdfiuwyz.net/?bdfiuwyz bdfiuwyz https://abesty.info/?abesty abesty https://gjlmtuv.net/?gjlmtuv gjlmtuv http://ivxyz.info/?ivxyz ivxyz http://aeops.info/?aeops aeops http://acdlv.ru/?acdlv acdlv http://fgimqw.net/?fgimqw fgimqw https://fgkrxz.info/?fgkrxz fgkrxz http://abfhnvxy.ua/?abfhnvxy abfhnvxy https://bcdeipsy.ru/?bcdeipsy bcdeipsy http://bdimvwy.ru/?bdimvwy bdimvwy http://fnpqry.info/?fnpqry fnpqry https://aklstwx.ru/?aklstwx aklstwx http://bjloq.com/?bjloq bjloq http://dkmsuv.info/?dkmsuv dkmsuv http://afinptuw.ru/?afinptuw afinptuw https://bijnqvx.biz/?bijnqvx bijnqvx https://behlxz.info/?behlxz behlxz http://bgjkquyz.biz/?bgjkquyz bgjkquyz http://aehnpqw.net/?aehnpqw aehnpqw http://cghjmpqu.com/?cghjmpqu cghjmpqu http://aboruy.com/?aboruy aboruy http://fjryz.ru/?fjryz fjryz https://cdnqy.info/?cdnqy cdnqy http://bfgjmost.net/?bfgjmost bfgjmost https://bhnou.ua/?bhnou bhnou https://dhilm.com/?dhilm dhilm http://abijmrtx.net/?abijmrtx abijmrtx http://acefnv.com/?acefnv acefnv http://gilmtu.com/?gilmtu gilmtu 
https://egjltuz.biz/?egjltuz egjltuz http://afghpsv.net/?afghpsv afghpsv http://chkltvx.ua/?chkltvx chkltvx http://befkyz.com/?befkyz befkyz https://bcfhmprw.com/?bcfhmprw bcfhmprw https://bjklpqs.info/?bjklpqs bjklpqs http://cegpt.ua/?cegpt cegpt https://klntvw.biz/?klntvw klntvw https://ejkmq.ua/?ejkmq ejkmq https://bgqvx.info/?bgqvx bgqvx https://dqrsuvw.com/?dqrsuvw dqrsuvw http://bcekrsu.net/?bcekrsu bcekrsu https://dlmqwy.com/?dlmqwy dlmqwy http://abhnsvz.ru/?abhnsvz abhnsvz https://bdfhjoqr.info/?bdfhjoqr bdfhjoqr http://ceijlprw.net/?ceijlprw ceijlprw https://ijlqsw.net/?ijlqsw ijlqsw https://einrvxz.net/?einrvxz einrvxz https://ikmtxz.net/?ikmtxz ikmtxz http://adkmqsz.biz/?adkmqsz adkmqsz http://bchnox.com/?bchnox bchnox https://ckptvw.com/?ckptvw ckptvw http://dgops.com/?dgops dgops https://bfnrtxy.info/?bfnrtxy bfnrtxy http://bdnpstu.info/?bdnpstu bdnpstu http://dlnprsx.ua/?dlnprsx dlnprsx https://gjnrx.com/?gjnrx gjnrx http://abcsvz.com/?abcsvz abcsvz http://gknrsv.ru/?gknrsv gknrsv https://fkmouxz.com/?fkmouxz fkmouxz http://bcpsx.com/?bcpsx bcpsx http://bckoqx.info/?bckoqx bckoqx https://bcdepq.ru/?bcdepq bcdepq http://kqswy.net/?kqswy kqswy https://bdgjnuy.ru/?bdgjnuy bdgjnuy https://cgiklsvz.info/?cgiklsvz cgiklsvz https://bfgnprwy.net/?bfgnprwy bfgnprwy https://bejmnqrw.com/?bejmnqrw bejmnqrw http://gilopqwy.ua/?gilopqwy gilopqwy https://fjopry.ru/?fjopry fjopry http://kmrvwy.biz/?kmrvwy kmrvwy https://eglotuv.net/?eglotuv eglotuv http://cdhnsvwx.com/?cdhnsvwx cdhnsvwx http://ahmqv.com/?ahmqv ahmqv https://fkqxy.ua/?fkqxy fkqxy https://gjoquvwx.net/?gjoquvwx gjoquvwx https://abcfouw.biz/?abcfouw abcfouw https://abcekn.ru/?abcekn abcekn http://ijnruvz.com/?ijnruvz ijnruvz http://mqruv.ua/?mqruv mqruv https://afklmu.biz/?afklmu afklmu https://abcgptyz.biz/?abcgptyz abcgptyz https://bfjsuz.ua/?bfjsuz bfjsuz http://ahmpsx.ua/?ahmpsx ahmpsx https://iklns.com/?iklns iklns https://hlnvx.ua/?hlnvx hlnvx http://adiklmp.com/?adiklmp adiklmp https://fghprs.biz/?fghprs fghprs http://acmopvw.ru/?acmopvw acmopvw http://cdmuw.ua/?cdmuw cdmuw http://bfkpwxz.biz/?bfkpwxz bfkpwxz http://lqrtwy.ru/?lqrtwy lqrtwy https://aclmruvw.biz/?aclmruvw aclmruvw http://bcfhmsz.net/?bcfhmsz bcfhmsz http://bcgknpry.biz/?bcgknpry bcgknpry https://bflmnoq.info/?bflmnoq bflmnoq https://bepsy.info/?bepsy bepsy http://gknosuw.net/?gknosuw gknosuw http://ajoruvz.com/?ajoruvz ajoruvz http://cotxz.ru/?cotxz cotxz https://deintwx.net/?deintwx deintwx https://eghijkuy.com/?eghijkuy eghijkuy http://dginwx.ua/?dginwx dginwx http://aenoxy.net/?aenoxy aenoxy https://dghlpuvx.ru/?dghlpuvx dghlpuvx http://cefor.info/?cefor cefor https://afioz.biz/?afioz afioz https://deklmuvw.com/?deklmuvw deklmuvw http://cfgimox.biz/?cfgimox cfgimox http://cinquvz.ua/?cinquvz cinquvz https://acilsy.ua/?acilsy acilsy http://bfhlnop.net/?bfhlnop bfhlnop https://fijluz.net/?fijluz fijluz https://dfjnostv.ua/?dfjnostv dfjnostv https://aginsty.net/?aginsty aginsty http://iknxyz.biz/?iknxyz iknxyz http://agopu.net/?agopu agopu http://bdkopst.ua/?bdkopst bdkopst https://aghlox.net/?aghlox aghlox https://cdgpu.com/?cdgpu cdgpu https://jkmpuwyz.net/?jkmpuwyz jkmpuwyz http://ijlmuvw.com/?ijlmuvw ijlmuvw https://gkmrtvwy.biz/?gkmrtvwy gkmrtvwy http://aefhty.info/?aefhty aefhty http://afjklsux.net/?afjklsux afjklsux http://mprvyz.biz/?mprvyz mprvyz https://bdjkpx.ua/?bdjkpx bdjkpx http://aciknovw.net/?aciknovw aciknovw https://aftuxy.ua/?aftuxy aftuxy https://ajnpvxyz.ua/?ajnpvxyz ajnpvxyz http://fijqrv.com/?fijqrv fijqrv 
https://ahmquvw.info/?ahmquvw ahmquvw https://ghklmqvy.com/?ghklmqvy ghklmqvy http://aehkty.com/?aehkty aehkty https://aclpsuz.ru/?aclpsuz aclpsuz https://ghoqstvy.com/?ghoqstvy ghoqstvy https://dghnqrty.ru/?dghnqrty dghnqrty https://agorx.com/?agorx agorx https://adejrv.ru/?adejrv adejrv http://hkopuvwz.net/?hkopuvwz hkopuvwz http://bcdeghvx.ru/?bcdeghvx bcdeghvx http://bdfgruvy.com/?bdfgruvy bdfgruvy https://cmuvy.biz/?cmuvy cmuvy https://abjkmp.ua/?abjkmp abjkmp http://abcfirvz.ru/?abcfirvz abcfirvz http://clqruwxy.com/?clqruwxy clqruwxy https://fhikoq.net/?fhikoq fhikoq https://bfgnsvwz.biz/?bfgnsvwz bfgnsvwz http://fhjkor.biz/?fhjkor fhjkor https://dfgkouyz.com/?dfgkouyz dfgkouyz https://acfhmqtz.info/?acfhmqtz acfhmqtz https://ejmnpt.biz/?ejmnpt ejmnpt http://dfgnrtu.ru/?dfgnrtu dfgnrtu https://diknswx.biz/?diknswx diknswx https://hjnsz.info/?hjnsz hjnsz https://cdjlq.ua/?cdjlq cdjlq https://iowxy.ru/?iowxy iowxy https://fhjltux.net/?fhjltux fhjltux http://ablmv.com/?ablmv ablmv http://bdiorswz.ru/?bdiorswz bdiorswz http://cnopr.ru/?cnopr cnopr http://adgopz.ru/?adgopz adgopz http://bdefimsu.ru/?bdefimsu bdefimsu http://ghmqr.net/?ghmqr ghmqr https://cdetv.ru/?cdetv cdetv https://bdkortu.biz/?bdkortu bdkortu http://djopv.biz/?djopv djopv http://aiknorw.biz/?aiknorw aiknorw https://bfmqr.net/?bfmqr bfmqr http://klmox.com/?klmox klmox https://fimqsv.biz/?fimqsv fimqsv https://fkopq.ru/?fkopq fkopq http://aglpx.com/?aglpx aglpx http://dfhquv.biz/?dfhquv dfhquv https://bfgjoq.com/?bfgjoq bfgjoq https://cklnz.ru/?cklnz cklnz http://cfijqrv.net/?cfijqrv cfijqrv https://aekuvxz.net/?aekuvxz aekuvxz https://ghijkstx.info/?ghijkstx ghijkstx http://abilmsz.info/?abilmsz abilmsz https://bfhsvyz.ru/?bfhsvyz bfhsvyz http://eqsuv.ua/?eqsuv eqsuv http://ghinqsx.biz/?ghinqsx ghinqsx https://acgkmnpu.com/?acgkmnpu acgkmnpu http://elrty.ru/?elrty elrty https://cdgkoz.net/?cdgkoz cdgkoz http://dimnosw.info/?dimnosw dimnosw http://abegnwx.biz/?abegnwx abegnwx http://abejnsx.net/?abejnsx abejnsx https://emnorxz.ru/?emnorxz emnorxz http://abfsz.ua/?abfsz abfsz https://imqrvy.net/?imqrvy imqrvy https://cdgjmqrv.com/?cdgjmqrv cdgjmqrv http://bdfostyz.net/?bdfostyz bdfostyz https://lnorxz.biz/?lnorxz lnorxz http://afgruvw.ru/?afgruvw afgruvw https://cefivyz.com/?cefivyz cefivyz https://gmqrt.com/?gmqrt gmqrt https://acdimqrw.info/?acdimqrw acdimqrw http://abchjm.ru/?abchjm abchjm http://alnqs.biz/?alnqs alnqs https://ijqwz.ua/?ijqwz ijqwz https://adfjstx.biz/?adfjstx adfjstx https://cghlqt.net/?cghlqt cghlqt http://ijrvyz.ua/?ijrvyz ijrvyz https://ahluv.net/?ahluv ahluv http://acdfgn.net/?acdfgn acdfgn https://hijnru.ua/?hijnru hijnru https://ghnuvy.ru/?ghnuvy ghnuvy http://aeiky.com/?aeiky aeiky https://bdilnov.info/?bdilnov bdilnov https://cjlnpqru.biz/?cjlnpqru cjlnpqru https://cgjqtv.biz/?cgjqtv cgjqtv https://abcnsvz.biz/?abcnsvz abcnsvz http://abegopsu.biz/?abegopsu abegopsu https://bcmstuvy.net/?bcmstuvy bcmstuvy https://bhjknsuz.net/?bhjknsuz bhjknsuz http://aceklno.info/?aceklno aceklno http://diotvyz.ru/?diotvyz diotvyz https://aefkmruw.info/?aefkmruw aefkmruw https://cdghlmnz.info/?cdghlmnz cdghlmnz https://bcgijmow.net/?bcgijmow bcgijmow http://moruwx.ua/?moruwx moruwx https://bhjnrt.ru/?bhjnrt bhjnrt https://eimoptu.ua/?eimoptu eimoptu https://achptv.info/?achptv achptv https://boptux.biz/?boptux boptux http://acmovz.ua/?acmovz acmovz https://abfmnpw.info/?abfmnpw abfmnpw https://afiknvz.biz/?afiknvz afiknvz http://ehnps.ua/?ehnps ehnps https://dnpuw.biz/?dnpuw dnpuw https://abflmns.biz/?abflmns 
abflmns https://fmuyz.biz/?fmuyz fmuyz http://bejkvz.info/?bejkvz bejkvz https://dfkmnyz.ua/?dfkmnyz dfkmnyz http://cfgnsvx.biz/?cfgnsvx cfgnsvx http://abdjsw.ru/?abdjsw abdjsw https://abelrvx.biz/?abelrvx abelrvx https://abjmoy.ru/?abjmoy abjmoy http://hmnqz.net/?hmnqz hmnqz https://dhkntv.ru/?dhkntv dhkntv http://bgqsyz.biz/?bgqsyz bgqsyz http://eghkmsux.net/?eghkmsux eghkmsux https://acehlrsv.info/?acehlrsv acehlrsv https://lnpruvy.com/?lnpruvy lnpruvy http://cefmtu.ua/?cefmtu cefmtu https://hjnqrt.net/?hjnqrt hjnqrt http://ceiktvw.ru/?ceiktvw ceiktvw http://cfjmu.info/?cfjmu cfjmu https://cfhmn.com/?cfhmn cfhmn http://cdfqr.net/?cdfqr cdfqr http://adelmvz.ru/?adelmvz adelmvz http://abflnsvx.com/?abflnsvx abflnsvx http://bjpqv.biz/?bjpqv bjpqv https://fjpqux.biz/?fjpqux fjpqux http://fpqtvyz.net/?fpqtvyz fpqtvyz http://cdflstwz.net/?cdflstwz cdflstwz https://egjostux.ua/?egjostux egjostux https://cjlnpy.ua/?cjlnpy cjlnpy https://bcfgmz.com/?bcfgmz bcfgmz http://dfnouz.ua/?dfnouz dfnouz http://cfikqv.net/?cfikqv cfikqv https://dehkqw.net/?dehkqw dehkqw http://fhkqs.com/?fhkqs fhkqs http://dijkl.com/?dijkl dijkl http://aeglu.com/?aeglu aeglu http://dgikmpy.info/?dgikmpy dgikmpy https://brsuz.info/?brsuz brsuz http://acfglopu.ru/?acfglopu acfglopu https://cdjqx.net/?cdjqx cdjqx https://aefimst.biz/?aefimst aefimst http://bfikmuw.net/?bfikmuw bfikmuw http://abhioqvw.ua/?abhioqvw abhioqvw http://abdilmo.net/?abdilmo abdilmo https://bhlovx.info/?bhlovx bhlovx https://fgvwxz.info/?fgvwxz fgvwxz http://afhjpqsv.biz/?afhjpqsv afhjpqsv http://deilrsv.com/?deilrsv deilrsv http://cijlmnt.ru/?cijlmnt cijlmnt https://bfkmtw.ua/?bfkmtw bfkmtw https://adkxy.ru/?adkxy adkxy https://blpqxy.com/?blpqxy blpqxy https://efgilor.net/?efgilor efgilor https://abcdky.info/?abcdky abcdky https://befgikqu.ru/?befgikqu befgikqu https://ipsuvwy.com/?ipsuvwy ipsuvwy https://fjlsu.biz/?fjlsu fjlsu https://kpquw.com/?kpquw kpquw https://bfjkoqvy.net/?bfjkoqvy bfjkoqvy http://dijopsz.biz/?dijopsz dijopsz http://acfhrsz.ua/?acfhrsz acfhrsz https://bfhrstvy.ua/?bfhrstvy bfhrstvy http://afglw.info/?afglw afglw https://bcjqrvw.net/?bcjqrvw bcjqrvw https://hiotz.ru/?hiotz hiotz https://abefmswx.biz/?abefmswx abefmswx http://acgkopyz.ua/?acgkopyz acgkopyz https://fijln.com/?fijln fijln http://befmquwz.info/?befmquwz befmquwz https://hijory.ua/?hijory hijory http://cdfhr.info/?cdfhr cdfhr https://ahiksx.ua/?ahiksx ahiksx https://aghmxy.ru/?aghmxy aghmxy https://hnopqt.net/?hnopqt hnopqt https://fiklnrsw.ru/?fiklnrsw fiklnrsw http://hknpv.ru/?hknpv hknpv https://abdfmotw.net/?abdfmotw abdfmotw http://bcfhpvx.biz/?bcfhpvx bcfhpvx http://beimrty.net/?beimrty beimrty http://dgnsvwy.ua/?dgnsvwy dgnsvwy https://aghjstvw.com/?aghjstvw aghjstvw http://gpqsuxy.ru/?gpqsuxy gpqsuxy https://finop.info/?finop finop http://bcdhsy.biz/?bcdhsy bcdhsy http://cfhmnvy.net/?cfhmnvy cfhmnvy https://hjopuw.net/?hjopuw hjopuw https://akmou.biz/?akmou akmou https://fgkntuw.biz/?fgkntuw fgkntuw https://adfjsw.com/?adfjsw adfjsw https://bdjmrsty.biz/?bdjmrsty bdjmrsty http://grtuz.ru/?grtuz grtuz https://cgiqw.info/?cgiqw cgiqw http://acenpvw.ru/?acenpvw acenpvw https://dmnrz.ru/?dmnrz dmnrz http://deglqtu.com/?deglqtu deglqtu http://ilpqrtuw.com/?ilpqrtuw ilpqrtuw https://gnoqvxz.com/?gnoqvxz gnoqvxz https://hijkmsu.ru/?hijkmsu hijkmsu http://cistxyz.biz/?cistxyz cistxyz https://adklmns.ru/?adklmns adklmns https://hmqrwy.ua/?hmqrwy hmqrwy https://crsuv.ru/?crsuv crsuv http://abejkmoz.ua/?abejkmoz abejkmoz http://abjpqu.ua/?abjpqu abjpqu 
https://ehjkopuz.info/?ehjkopuz ehjkopuz http://iovwz.ua/?iovwz iovwz https://emntvwx.net/?emntvwx emntvwx http://eghksx.info/?eghksx eghksx https://bmnps.com/?bmnps bmnps https://cinopuv.net/?cinopuv cinopuv http://efijnptz.biz/?efijnptz efijnptz http://cehilnp.net/?cehilnp cehilnp http://denprw.ru/?denprw denprw http://adglpwz.net/?adglpwz adglpwz http://dehimosu.biz/?dehimosu dehimosu https://abchprwy.com/?abchprwy abchprwy http://dlnory.info/?dlnory dlnory http://cehnwyz.com/?cehnwyz cehnwyz http://aghlopy.ua/?aghlopy aghlopy http://fhjksuvz.com/?fhjksuvz fhjksuvz http://bijmy.com/?bijmy bijmy https://adhjw.com/?adhjw adhjw https://emnprwyz.info/?emnprwyz emnprwyz http://dgmnw.ru/?dgmnw dgmnw https://cdgimruv.com/?cdgimruv cdgimruv http://bcdjqst.biz/?bcdjqst bcdjqst https://lmquv.ru/?lmquv lmquv http://degiptv.info/?degiptv degiptv http://hlnvy.ru/?hlnvy hlnvy http://gklmvx.info/?gklmvx gklmvx http://behlpstz.com/?behlpstz behlpstz https://efginz.com/?efginz efginz https://deilmnru.biz/?deilmnru deilmnru http://bdehoqux.ru/?bdehoqux bdehoqux http://doqvy.info/?doqvy doqvy http://jlorw.info/?jlorw jlorw https://cfmtvy.net/?cfmtvy cfmtvy http://agkmqvxy.biz/?agkmqvxy agkmqvxy https://ehnrux.ru/?ehnrux ehnrux https://ahlmpyz.biz/?ahlmpyz ahlmpyz https://dhilqxyz.com/?dhilqxyz dhilqxyz http://beosz.biz/?beosz beosz http://jkmtvx.ru/?jkmtvx jkmtvx http://aefiuw.ua/?aefiuw aefiuw http://bgimy.info/?bgimy bgimy https://acdfhimy.info/?acdfhimy acdfhimy http://bdfpv.info/?bdfpv bdfpv https://lnopux.info/?lnopux lnopux https://iknqstux.ua/?iknqstux iknqstux https://binxy.ru/?binxy binxy http://abflsv.net/?abflsv abflsv http://cghkmtvz.info/?cghkmtvz cghkmtvz http://flpqrxz.biz/?flpqrxz flpqrxz https://cfgnvz.ua/?cfgnvz cfgnvz https://jtuvx.com/?jtuvx jtuvx https://fgikl.ru/?fgikl fgikl http://ehikrs.ru/?ehikrs ehikrs http://beghqx.ru/?beghqx beghqx http://dhijlrsw.com/?dhijlrsw dhijlrsw http://elqst.ru/?elqst elqst https://abfklmns.biz/?abfklmns abfklmns https://cilnorw.ua/?cilnorw cilnorw https://anoqu.ua/?anoqu anoqu http://adikquz.com/?adikquz adikquz http://abcpwyz.ua/?abcpwyz abcpwyz https://fiqwxz.ru/?fiqwxz fiqwxz http://bcivw.ru/?bcivw bcivw https://acdfnrz.ua/?acdfnrz acdfnrz https://jmopqtw.info/?jmopqtw jmopqtw https://abdeghtv.biz/?abdeghtv abdeghtv http://aceoprw.net/?aceoprw aceoprw http://forvy.ua/?forvy forvy https://abgjknpv.com/?abgjknpv abgjknpv http://bcdgimr.info/?bcdgimr bcdgimr https://flntvw.biz/?flntvw flntvw http://bcdnqvx.net/?bcdnqvx bcdnqvx http://defjnqvw.com/?defjnqvw defjnqvw http://ekmouvwy.biz/?ekmouvwy ekmouvwy http://cijpqsxz.biz/?cijpqsxz cijpqsxz http://dopstyz.ua/?dopstyz dopstyz http://cinow.com/?cinow cinow http://cdejmrs.info/?cdejmrs cdejmrs http://fgknprwz.info/?fgknprwz fgknprwz https://binpquxy.com/?binpquxy binpquxy https://afhqrtu.biz/?afhqrtu afhqrtu http://cemqtv.ru/?cemqtv cemqtv http://bgnqsz.info/?bgnqsz bgnqsz http://cdfjprsv.ua/?cdfjprsv cdfjprsv https://jmqstz.biz/?jmqstz jmqstz http://adhuw.info/?adhuw adhuw http://deswz.ua/?deswz deswz https://ehkmnw.com/?ehkmnw ehkmnw http://kmsxy.ua/?kmsxy kmsxy http://benotvw.info/?benotvw benotvw https://dhinops.biz/?dhinops dhinops https://hklnpqtv.ru/?hklnpqtv hklnpqtv https://gjnqrtwx.info/?gjnqrtwx gjnqrtwx https://dortvw.biz/?dortvw dortvw http://cdghjp.com/?cdghjp cdghjp http://abchnoqx.com/?abchnoqx abchnoqx http://abnsz.biz/?abnsz abnsz https://acdnqwy.info/?acdnqwy acdnqwy http://befilpv.com/?befilpv befilpv http://fijrux.biz/?fijrux fijrux https://aknqrtuz.ru/?aknqrtuz aknqrtuz 
Hey handsome. I've just now watched your pictures. You're sexy. I am so very bored today and I want to offer you talking. My profile is here
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From leoni.massimiliano1 at gmail.com  Wed Nov 28 21:31:50 2018
From: leoni.massimiliano1 at gmail.com (Massimiliano Leoni)
Date: Thu, 29 Nov 2018 16:31:50 +1300
Subject: [petsc-users] I miss you.
In-Reply-To: <9Fjct2o-RpyVfYXQLI2r_q@ismtpd0742h6ebv4.miroirdelame.com>
References: <9Fjct2o-RpyVfYXQLI2r_q@ismtpd0742h6ebv4.miroirdelame.com>
Message-ID: <5071037.yFPNj4gx91@dulx0066>

That's a feature the Trilinos guys don't have!

On Thursday, 29 November 2018 16:17:12 (NZDT), chantalle--- via petsc-users wrote:
bfkrt bfkrt http://egkqrtuw.biz/?egkqrtuw egkqrtuw http://hjlstvx.com/?hjlstvx hjlstvx http://jkoprv.com/?jkoprv jkoprv https://chnpstv.com/?chnpstv chnpstv https:// eknptuw.ru/?eknptuw eknptuw https://jkmrst.net/?jkmrst jkmrst http://bfmnqrvz.net/? bfmnqrvz bfmnqrvz http://bcdhnrwy.ua/?bcdhnrwy bcdhnrwy https://bcdfgkty.ru/? bcdfgkty bcdfgkty http://cmpqv.info/?cmpqv cmpqv https://dhilsz.ua/?dhilsz dhilsz https:// acekpsy.ru/?acekpsy acekpsy http://gmnpswx.biz/?gmnpswx gmnpswx https://cdinpr.ua/? cdinpr cdinpr http://cdhjnorz.info/?cdhjnorz cdhjnorz https://behjk.com/?behjk behjk https://lmpstv.com/?lmpstv lmpstv https://ilmory.ua/?ilmory ilmory https://abiortz.ua/? abiortz abiortz http://bceotz.biz/?bceotz bceotz https://ghjlqt.info/?ghjlqt ghjlqt https:// hmosyz.ua/?hmosyz hmosyz https://behmoqru.net/?behmoqru behmoqru https:// chinpx.info/?chinpx chinpx https://fkntuvyz.info/?fkntuvyz fkntuvyz http://cfghimtw.ua/? cfghimtw cfghimtw https://fqstwz.com/?fqstwz fqstwz http://gimvw.biz/?gimvw gimvw https://degixy.ua/?degixy degixy http://fnvwy.com/?fnvwy fnvwy https://fikmstyz.ru/? fikmstyz fikmstyz http://chmopuwy.info/?chmopuwy chmopuwy http://cdfovy.ua/?cdfovy cdfovy http://acdfhjkr.ua/?acdfhjkr acdfhjkr https://bcfktw.biz/?bcfktw bcfktw https:// fkmoqux.biz/?fkmoqux fkmoqux http://adghjox.ru/?adghjox adghjox https://dlnortv.info/? dlnortv dlnortv http://ijlpsuw.ru/?ijlpsuw ijlpsuw https://cgijqruw.info/?cgijqruw cgijqruw https://eisxy.ru/?eisxy eisxy http://dhjkmqvw.net/?dhjkmqvw dhjkmqvw https:// bfgklnu.info/?bfgklnu bfgklnu http://bcfil.info/?bcfil bcfil https://degitv.com/?degitv degitv https://ghjlmr.net/?ghjlmr ghjlmr https://ceilv.ru/?ceilv ceilv http://gjknrtw.ua/?gjknrtw gjknrtw http://fotvwyz.net/?fotvwyz fotvwyz http://dgklquv.ua/?dgklquv dgklquv http:// cghijnv.info/?cghijnv cghijnv https://aelmoy.com/?aelmoy aelmoy http://begilmnx.com/? begilmnx begilmnx http://bcegrtxz.ua/?bcegrtxz bcegrtxz https://bhkqs.com/?bhkqs bhkqs https://cefgpstu.ua/?cefgpstu cefgpstu http://gmntv.com/?gmntv gmntv http:// afjnstuy.ua/?afjnstuy afjnstuy https://aegqwy.biz/?aegqwy aegqwy http://ioswz.biz/?ioswz ioswz https://gorvw.info/?gorvw gorvw http://cgnouwy.ua/?cgnouwy cgnouwy http:// bchkpvy.info/?bchkpvy bchkpvy http://egijl.ua/?egijl egijl https://gorstuxz.net/?gorstuxz gorstuxz https://cdhjnopu.biz/?cdhjnopu cdhjnopu http://bceijoy.ru/?bceijoy bceijoy https://abfgjqw.ua/?abfgjqw abfgjqw http://bdeltw.net/?bdeltw bdeltw http://jktwz.net/? jktwz jktwz https://bgikloqy.info/?bgikloqy bgikloqy https://eimqrsuw.biz/?eimqrsuw eimqrsuw http://cfgkpu.net/?cfgkpu cfgkpu http://efmqrtv.com/?efmqrtv efmqrtv http:// cegikmru.info/?cegikmru cegikmru http://hiprsv.ua/?hiprsv hiprsv https://efiorw.com/? efiorw efiorw https://ehkpy.ru/?ehkpy ehkpy http://acdjps.biz/?acdjps acdjps http:// lmnopvxy.net/?lmnopvxy lmnopvxy http://abeptuy.net/?abeptuy abeptuy http:// fimnoq.ua/?fimnoq fimnoq http://adefky.com/?adefky adefky https://cdfijlrz.ru/?cdfijlrz cdfijlrz https://dimnv.ru/?dimnv dimnv http://bcehz.com/?bcehz bcehz http:// ahkpruvw.ru/?ahkpruvw ahkpruvw https://afnrw.com/?afnrw afnrw https://bclnpqtv.biz/? bclnpqtv bclnpqtv https://hmptuy.ua/?hmptuy hmptuy https://adfkx.com/?adfkx adfkx https://cdfgikp.com/?cdfgikp cdfgikp https://hjovwxz.net/?hjovwxz hjovwxz http:// enoprsxz.ru/?enoprsxz enoprsxz https://dmryz.info/?dmryz dmryz http://ijopuwx.info/? 
ijopuwx ijopuwx https://dghoryz.ua/?dghoryz dghoryz https://gnpqsvy.biz/?gnpqsvy gnpqsvy https://jlmtwxy.ua/?jlmtwxy jlmtwxy http://clmpuvz.com/?clmpuvz clmpuvz https://gprtvx.ua/?gprtvx gprtvx http://bklptuxz.net/?bklptuxz bklptuxz https:// bfknrvz.com/?bfknrvz bfknrvz http://dgkqrv.ua/?dgkqrv dgkqrv https://adkprstv.net/? adkprstv adkprstv https://afiqx.com/?afiqx afiqx https://bchiknpt.com/?bchiknpt bchiknpt http://hjklprx.biz/?hjklprx hjklprx http://bejoruw.net/?bejoruw bejoruw http:// cegmnvz.biz/?cegmnvz cegmnvz http://gmnotyz.ua/?gmnotyz gmnotyz https:// dfhikntw.net/?dfhikntw dfhikntw https://dfouv.biz/?dfouv dfouv https://cjlnotz.info/?cjlnotz cjlnotz https://denqtvwz.ua/?denqtvwz denqtvwz http://behlrwy.com/?behlrwy behlrwy http://drsuvxz.biz/?drsuvxz drsuvxz https://esuwyz.biz/?esuwyz esuwyz http://fhjntu.biz/? fhjntu fhjntu https://gqrswyz.biz/?gqrswyz gqrswyz https://mpswxz.biz/?mpswxz mpswxz http://dhjpsuv.net/?dhjpsuv dhjpsuv http://dfopruy.ru/?dfopruy dfopruy https:// ehkloqsz.info/?ehkloqsz ehkloqsz https://adgjqru.biz/?adgjqru adgjqru https:// ejmtvwz.biz/?ejmtvwz ejmtvwz http://fgoquw.net/?fgoquw fgoquw https://fhikqry.com/? fhikqry fhikqry https://cemqy.com/?cemqy cemqy http://otuxz.com/?otuxz otuxz http:// fijlmnoq.info/?fijlmnoq fijlmnoq http://cgiouw.ru/?cgiouw cgiouw https://cdgijlnw.biz/? cdgijlnw cdgijlnw https://gmuvw.biz/?gmuvw gmuvw https://abdefgiu.info/?abdefgiu abdefgiu https://gkmqwy.info/?gkmqwy gkmqwy http://bfgkqstw.com/?bfgkqstw bfgkqstw https://bfgmnovx.net/?bfgmnovx bfgmnovx https://bkmnrsy.com/?bkmnrsy bkmnrsy http://chiuwx.ua/?chiuwx chiuwx http://chlvw.ua/?chlvw chlvw https://acfgilp.ru/? acfgilp acfgilp http://bghosv.com/?bghosv bghosv https://befgkpst.net/?befgkpst befgkpst http://jklmnwz.net/?jklmnwz jklmnwz https://dktxz.com/?dktxz dktxz http://aeknuwyz.ru/? aeknuwyz aeknuwyz https://aqtuv.ua/?aqtuv aqtuv http://bfjorw.info/?bfjorw bfjorw http:// cfijuz.biz/?cfijuz cfijuz https://efjlnuv.net/?efjlnuv efjlnuv https://hkruvy.net/?hkruvy hkruvy http://defnswxy.biz/?defnswxy defnswxy https://fgjmqxy.biz/?fgjmqxy fgjmqxy https:// deflmtu.ru/?deflmtu deflmtu https://cgmnr.info/?cgmnr cgmnr https://dgilnsvy.ua/? dgilnsvy dgilnsvy https://chjmosxy.biz/?chjmosxy chjmosxy https://abcdhln.net/?abcdhln abcdhln https://elstuyz.biz/?elstuyz elstuyz https://bjoptw.com/?bjoptw bjoptw https:// bcdgquv.ru/?bcdgquv bcdgquv http://afiklst.com/?afiklst afiklst https://agnvwz.info/? agnvwz agnvwz http://bcfgkrvx.biz/?bcfgkrvx bcfgkrvx https://bdjmnuz.biz/?bdjmnuz bdjmnuz https://emqrwz.com/?emqrwz emqrwz https://egklqt.biz/?egklqt egklqt http:// bimqr.com/?bimqr bimqr https://dhklnsv.net/?dhklnsv dhklnsv https://dvwxz.com/?dvwxz dvwxz https://dfghimu.com/?dfghimu dfghimu https://fostx.ru/?fostx fostx https:// fjklnqux.ua/?fjklnqux fjklnqux https://cfhijnqu.info/?cfhijnqu cfhijnqu http://nptvy.ua/? nptvy nptvy http://bclnu.com/?bclnu bclnu http://eflntvyz.ru/?eflntvyz eflntvyz https:// abcdpsx.ru/?abcdpsx abcdpsx https://cfipqx.info/?cfipqx cfipqx http://adforst.biz/?adforst adforst https://bfgmpqtv.biz/?bfgmpqtv bfgmpqtv http://cfkmnorv.ua/?cfkmnorv cfkmnorv http://dhjkuv.info/?dhjkuv dhjkuv https://bcejlsyz.info/?bcejlsyz bcejlsyz https:// efinrsy.ua/?efinrsy efinrsy https://kmosw.net/?kmosw kmosw http://jmoxy.net/?jmoxy jmoxy http://agjloqy.biz/?agjloqy agjloqy https://dglmnpqy.com/?dglmnpqy dglmnpqy https://dijnqrsw.ua/?dijnqrsw dijnqrsw http://ijmnr.biz/?ijmnr ijmnr https://gjouwxy.ru/? 
gjouwxy gjouwxy https://klpuz.biz/?klpuz klpuz http://hijoux.net/?hijoux hijoux http:// bikmn.net/?bikmn bikmn http://abcovwy.ua/?abcovwy abcovwy https://adfgioqx.ua/? adfgioqx adfgioqx http://beijknpq.info/?beijknpq beijknpq https://equvx.com/?equvx equvx https://cgmpr.net/?cgmpr cgmpr http://dfknor.com/?dfknor dfknor https:// gopsvyz.ru/?gopsvyz gopsvyz http://chijkqsu.net/?chijkqsu chijkqsu https://bfmtx.biz/? bfmtx bfmtx http://cgijqrux.ua/?cgijqrux cgijqrux https://fklnovy.biz/?fklnovy fklnovy http://chkmosux.net/?chkmosux chkmosux https://cenprvw.info/?cenprvw cenprvw https://cfuxyz.ua/?cfuxyz cfuxyz https://bdgopy.ua/?bdgopy bdgopy https://ampwx.info/? ampwx ampwx http://cefmoqtx.info/?cefmoqtx cefmoqtx https://bcdflny.ua/?bcdflny bcdflny https://abqtux.ua/?abqtux abqtux https://bpstw.com/?bpstw bpstw http:// cegmoqry.info/?cegmoqry cegmoqry http://aeghjlw.net/?aeghjlw aeghjlw https:// egjswz.ru/?egjswz egjswz https://agiluv.info/?agiluv agiluv https://befimsu.biz/?befimsu befimsu https://abcdeiry.info/?abcdeiry abcdeiry http://bcqrxz.info/?bcqrxz bcqrxz https:// eilntvxz.net/?eilntvxz eilntvxz https://abcjpt.ua/?abcjpt abcjpt https://adglox.ru/?adglox adglox https://adjnptw.ru/?adjnptw adjnptw http://aglrxz.biz/?aglrxz aglrxz https:// abeqru.info/?abeqru abeqru https://cnoprsw.biz/?cnoprsw cnoprsw http://akoqx.com/? akoqx akoqx https://ijlmnptz.net/?ijlmnptz ijlmnptz https://ghimnuyz.com/?ghimnuyz ghimnuyz https://ghprs.ua/?ghprs ghprs https://lmntwxz.biz/?lmntwxz lmntwxz https:// blopqxz.ua/?blopqxz blopqxz https://cqrxz.ua/?cqrxz cqrxz http://befmvwz.net/?befmvwz befmvwz https://abjlmnpr.com/?abjlmnpr abjlmnpr https://filotz.net/?filotz filotz http:// hinsw.biz/?hinsw hinsw http://abfjklsy.net/?abfjklsy abfjklsy https://bhimnrwx.biz/? bhimnrwx bhimnrwx http://dejnotz.ru/?dejnotz dejnotz https://bcflsxz.info/?bcflsxz bcflsxz http://hklmsux.net/?hklmsux hklmsux https://ahknouz.ru/?ahknouz ahknouz https:// cfknxy.ru/?cfknxy cfknxy http://acfntvx.net/?acfntvx acfntvx https://cdfksuvy.info/?cdfksuvy cdfksuvy https://abjkorx.ru/?abjkorx abjkorx https://afiklqvw.com/?afiklqvw afiklqvw http://adnovx.biz/?adnovx adnovx https://fglstyz.info/?fglstyz fglstyz https://ghikruy.biz/? ghikruy ghikruy http://ghnopsvz.biz/?ghnopsvz ghnopsvz http://abcgjnrw.com/?abcgjnrw abcgjnrw https://bijlnyz.ua/?bijlnyz bijlnyz http://ahiqrv.ru/?ahiqrv ahiqrv https:// ajotw.biz/?ajotw ajotw http://bfhipuw.info/?bfhipuw bfhipuw http://acjmqry.ua/?acjmqry acjmqry http://efjnoqrz.info/?efjnoqrz efjnoqrz https://achkqst.info/?achkqst achkqst http://bcfpruyz.ua/?bcfpruyz bcfpruyz http://bcelv.com/?bcelv bcelv https://dfkoqsu.ua/? dfkoqsu dfkoqsu http://ekopq.ru/?ekopq ekopq https://abfkrsz.ua/?abfkrsz abfkrsz https:// abenxy.net/?abenxy abenxy http://agiotu.biz/?agiotu agiotu https://bkpqr.info/?bkpqr bkpqr http://ahijq.ua/?ahijq ahijq http://eimps.ru/?eimps eimps https://abdstwz.net/? 
abdstwz abdstwz https://cdhjlu.biz/?cdhjlu cdhjlu http://gmrstyz.info/?gmrstyz gmrstyz https://djsvwyz.ua/?djsvwyz djsvwyz http://begjmorv.net/?begjmorv begjmorv http:// ahlmoptz.biz/?ahlmoptz ahlmoptz http://dekoqx.ru/?dekoqx dekoqx http://fiprs.net/?fiprs fiprs http://cjklmr.biz/?cjklmr cjklmr https://aefgpquv.biz/?aefgpquv aefgpquv https:// jlmopswx.net/?jlmopswx jlmopswx https://bfhjlsw.biz/?bfhjlsw bfhjlsw https:// iknortux.com/?iknortux iknortux http://abhkoprv.biz/?abhkoprv abhkoprv https:// cjnrwyz.ua/?cjnrwyz cjnrwyz http://djmnrvwz.info/?djmnrvwz djmnrvwz http:// adgkrtyz.info/?adgkrtyz adgkrtyz https://bdhjsuv.ua/?bdhjsuv bdhjsuv http://acfgjor.biz/? acfgjor acfgjor http://cfilrtw.info/?cfilrtw cfilrtw https://fhijz.com/?fhijz fhijz http:// hkmntxy.ru/?hkmntxy hkmntxy http://fhilsuy.ua/?fhilsuy fhilsuy http://jmnpwx.ru/?jmnpwx jmnpwx http://gikltvwy.ru/?gikltvwy gikltvwy http://jopqy.biz/?jopqy jopqy http:// bdlqry.info/?bdlqry bdlqry https://ejlvwz.com/?ejlvwz ejlvwz http://ceijvz.net/?ceijvz ceijvz http://cfijntyz.info/?cfijntyz cfijntyz http://efilotuv.com/?efilotuv efilotuv http:// cfmpyz.com/?cfmpyz cfmpyz http://dhpqz.com/?dhpqz dhpqz http://dlpxz.com/?dlpxz dlpxz http://aefjlyz.net/?aefjlyz aefjlyz https://ceijorz.info/?ceijorz ceijorz http:// dkmnuvx.info/?dkmnuvx dkmnuvx https://kprswx.net/?kprswx kprswx https://dgjpwz.ru/? dgjpwz dgjpwz https://abfjlnt.ru/?abfjlnt abfjlnt http://efgips.com/?efgips efgips http:// cehnpu.ru/?cehnpu cehnpu https://fstuw.net/?fstuw fstuw http://dmorz.net/?dmorz dmorz https://celmrwz.info/?celmrwz celmrwz https://jprvx.com/?jprvx jprvx https://cefjkxz.biz/? cefjkxz cefjkxz http://behjlv.biz/?behjlv behjlv http://bdiku.net/?bdiku bdiku https:// ghnpxz.ru/?ghnpxz ghnpxz https://bcpqvwz.ua/?bcpqvwz bcpqvwz https://gijlnqrs.biz/? gijlnqrs gijlnqrs http://adfjlrsz.info/?adfjlrsz adfjlrsz https://afiptvz.biz/?afiptvz afiptvz http://acikors.com/?acikors acikors https://dfmpq.ru/?dfmpq dfmpq http://abhjmsyz.ru/? abhjmsyz abhjmsyz https://dejkq.com/?dejkq dejkq http://abcltx.com/?abcltx abcltx https://bdfjopu.net/?bdfjopu bdfjopu http://bipvy.com/?bipvy bipvy http://dlqsx.biz/?dlqsx dlqsx http://inqrstv.info/?inqrstv inqrstv https://dknwz.ru/?dknwz dknwz http:// cdgimqsu.ru/?cdgimqsu cdgimqsu https://akmotuv.info/?akmotuv akmotuv http:// kmntvyz.ru/?kmntvyz kmntvyz https://bimxz.net/?bimxz bimxz http://afjoqyz.ru/?afjoqyz afjoqyz https://eikuwy.ua/?eikuwy eikuwy https://bmptw.biz/?bmptw bmptw http:// fgltwx.com/?fgltwx fgltwx http://chlnpqrs.com/?chlnpqrs chlnpqrs https://abhux.net/? abhux abhux http://adfhmw.info/?adfhmw adfhmw http://bgilt.com/?bgilt bgilt https:// cefotu.biz/?cefotu cefotu https://adfruvwy.biz/?adfruvwy adfruvwy https://fgoqstu.info/? fgoqstu fgoqstu https://cdilnqvy.info/?cdilnqvy cdilnqvy https://adghruw.ua/?adghruw adghruw https://ehlnrsuy.biz/?ehlnrsuy ehlnrsuy http://abfijlnr.info/?abfijlnr abfijlnr https://jkpty.biz/?jkpty jkpty https://cdlpv.biz/?cdlpv cdlpv https://ginoxy.ua/?ginoxy ginoxy https://ciknpquy.net/?ciknpquy ciknpquy https://egijstw.biz/?egijstw egijstw http:// cdikopvz.info/?cdikopvz cdikopvz https://egjklnvy.info/?egjklnvy egjklnvy https:// fjkmvw.info/?fjkmvw fjkmvw http://dfkmv.com/?dfkmv dfkmv https://efgloptu.ru/?efgloptu efgloptu https://fgotz.ua/?fgotz fgotz http://efhklv.info/?efhklv efhklv https:// befmrwy.info/?befmrwy befmrwy https://bklnsvy.biz/?bklnsvy bklnsvy http://flmoz.ru/? 
flmoz flmoz http://aiopqrtu.biz/?aiopqrtu aiopqrtu https://fjrsty.info/?fjrsty fjrsty http:// behimuwy.ru/?behimuwy behimuwy https://iknqvz.com/?iknqvz iknqvz http://cdemp.ru/? cdemp cdemp http://eglpqsvz.net/?eglpqsvz eglpqsvz https://dluvz.net/?dluvz dluvz http:// beghsuy.net/?beghsuy beghsuy http://ekmqsvz.net/?ekmqsvz ekmqsvz http://ehinz.net/? ehinz ehinz http://gipvz.com/?gipvz gipvz http://efhlwxyz.biz/?efhlwxyz efhlwxyz http:// blors.ru/?blors blors http://dnqry.info/?dnqry dnqry https://cghnrux.biz/?cghnrux cghnrux http://dfinqtwx.ru/?dfinqtwx dfinqtwx http://efotxz.info/?efotxz efotxz http:// aceglpqu.net/?aceglpqu aceglpqu https://abilr.ua/?abilr abilr http://bcgqw.ru/?bcgqw bcgqw http://cdinqv.biz/?cdinqv cdinqv https://cdfkp.com/?cdfkp cdfkp https:// aelouvz.ua/?aelouvz aelouvz http://dklstuz.net/?dklstuz dklstuz https://cejqw.ru/?cejqw cejqw https://bdgkuy.com/?bdgkuy bdgkuy https://hkoprvxy.ru/?hkoprvxy hkoprvxy https://filntx.net/?filntx filntx http://cfgsty.ua/?cfgsty cfgsty https://bfgptx.net/?bfgptx bfgptx http://belpsu.biz/?belpsu belpsu http://deilmoy.biz/?deilmoy deilmoy https:// kqruv.info/?kqruv kqruv https://acdst.net/?acdst acdst http://glmvy.ru/?glmvy glmvy https://abcmnu.net/?abcmnu abcmnu http://adehinxy.ru/?adehinxy adehinxy http:// ahmnwy.biz/?ahmnwy ahmnwy http://morst.ru/?morst morst http://bfils.ua/?bfils bfils http://abcmnvw.biz/?abcmnvw abcmnvw https://bchlqsux.net/?bchlqsux bchlqsux http:// fhikpqvy.net/?fhikpqvy fhikpqvy https://bejkqu.info/?bejkqu bejkqu https://cdehnsxy.ru/? cdehnsxy cdehnsxy http://bcdhnopx.info/?bcdhnopx bcdhnopx https://elpuz.ua/?elpuz elpuz https://bdeftvw.info/?bdeftvw bdeftvw http://dgmnou.ru/?dgmnou dgmnou https:// aiklnvx.com/?aiklnvx aiklnvx http://ijoqrsvw.com/?ijoqrsvw ijoqrsvw http://cdehmquy.biz/? cdehmquy cdehmquy https://gjoqx.net/?gjoqx gjoqx https://bhlnpqvw.net/?bhlnpqvw bhlnpqvw http://bdfiuwyz.net/?bdfiuwyz bdfiuwyz https://abesty.info/?abesty abesty https://gjlmtuv.net/?gjlmtuv gjlmtuv http://ivxyz.info/?ivxyz ivxyz http://aeops.info/?aeops aeops http://acdlv.ru/?acdlv acdlv http://fgimqw.net/?fgimqw fgimqw https://fgkrxz.info/? fgkrxz fgkrxz http://abfhnvxy.ua/?abfhnvxy abfhnvxy https://bcdeipsy.ru/?bcdeipsy bcdeipsy http://bdimvwy.ru/?bdimvwy bdimvwy http://fnpqry.info/?fnpqry fnpqry https:// aklstwx.ru/?aklstwx aklstwx http://bjloq.com/?bjloq bjloq http://dkmsuv.info/?dkmsuv dkmsuv http://afinptuw.ru/?afinptuw afinptuw https://bijnqvx.biz/?bijnqvx bijnqvx https:// behlxz.info/?behlxz behlxz http://bgjkquyz.biz/?bgjkquyz bgjkquyz http://aehnpqw.net/? aehnpqw aehnpqw http://cghjmpqu.com/?cghjmpqu cghjmpqu http://aboruy.com/? aboruy aboruy http://fjryz.ru/?fjryz fjryz https://cdnqy.info/?cdnqy cdnqy http:// bfgjmost.net/?bfgjmost bfgjmost https://bhnou.ua/?bhnou bhnou https://dhilm.com/? dhilm dhilm http://abijmrtx.net/?abijmrtx abijmrtx http://acefnv.com/?acefnv acefnv http:// gilmtu.com/?gilmtu gilmtu https://egjltuz.biz/?egjltuz egjltuz http://afghpsv.net/?afghpsv afghpsv http://chkltvx.ua/?chkltvx chkltvx http://befkyz.com/?befkyz befkyz https:// bcfhmprw.com/?bcfhmprw bcfhmprw https://bjklpqs.info/?bjklpqs bjklpqs http:// cegpt.ua/?cegpt cegpt https://klntvw.biz/?klntvw klntvw https://ejkmq.ua/?ejkmq ejkmq https://bgqvx.info/?bgqvx bgqvx https://dqrsuvw.com/?dqrsuvw dqrsuvw http:// bcekrsu.net/?bcekrsu bcekrsu https://dlmqwy.com/?dlmqwy dlmqwy http://abhnsvz.ru/? 
abhnsvz abhnsvz https://bdfhjoqr.info/?bdfhjoqr bdfhjoqr http://ceijlprw.net/?ceijlprw ceijlprw https://ijlqsw.net/?ijlqsw ijlqsw https://einrvxz.net/?einrvxz einrvxz https:// ikmtxz.net/?ikmtxz ikmtxz http://adkmqsz.biz/?adkmqsz adkmqsz http://bchnox.com/? bchnox bchnox https://ckptvw.com/?ckptvw ckptvw http://dgops.com/?dgops dgops https://bfnrtxy.info/?bfnrtxy bfnrtxy http://bdnpstu.info/?bdnpstu bdnpstu http:// dlnprsx.ua/?dlnprsx dlnprsx https://gjnrx.com/?gjnrx gjnrx http://abcsvz.com/?abcsvz abcsvz http://gknrsv.ru/?gknrsv gknrsv https://fkmouxz.com/?fkmouxz fkmouxz http:// bcpsx.com/?bcpsx bcpsx http://bckoqx.info/?bckoqx bckoqx https://bcdepq.ru/?bcdepq bcdepq http://kqswy.net/?kqswy kqswy https://bdgjnuy.ru/?bdgjnuy bdgjnuy https:// cgiklsvz.info/?cgiklsvz cgiklsvz https://bfgnprwy.net/?bfgnprwy bfgnprwy https:// bejmnqrw.com/?bejmnqrw bejmnqrw http://gilopqwy.ua/?gilopqwy gilopqwy https:// fjopry.ru/?fjopry fjopry http://kmrvwy.biz/?kmrvwy kmrvwy https://eglotuv.net/?eglotuv eglotuv http://cdhnsvwx.com/?cdhnsvwx cdhnsvwx http://ahmqv.com/?ahmqv ahmqv https://fkqxy.ua/?fkqxy fkqxy https://gjoquvwx.net/?gjoquvwx gjoquvwx https:// abcfouw.biz/?abcfouw abcfouw https://abcekn.ru/?abcekn abcekn http://ijnruvz.com/? ijnruvz ijnruvz http://mqruv.ua/?mqruv mqruv https://afklmu.biz/?afklmu afklmu https:// abcgptyz.biz/?abcgptyz abcgptyz https://bfjsuz.ua/?bfjsuz bfjsuz http://ahmpsx.ua/? ahmpsx ahmpsx https://iklns.com/?iklns iklns https://hlnvx.ua/?hlnvx hlnvx http:// adiklmp.com/?adiklmp adiklmp https://fghprs.biz/?fghprs fghprs http://acmopvw.ru/? acmopvw acmopvw http://cdmuw.ua/?cdmuw cdmuw http://bfkpwxz.biz/?bfkpwxz bfkpwxz http://lqrtwy.ru/?lqrtwy lqrtwy https://aclmruvw.biz/?aclmruvw aclmruvw http:// bcfhmsz.net/?bcfhmsz bcfhmsz http://bcgknpry.biz/?bcgknpry bcgknpry https:// bflmnoq.info/?bflmnoq bflmnoq https://bepsy.info/?bepsy bepsy http://gknosuw.net/? gknosuw gknosuw http://ajoruvz.com/?ajoruvz ajoruvz http://cotxz.ru/?cotxz cotxz https:// deintwx.net/?deintwx deintwx https://eghijkuy.com/?eghijkuy eghijkuy http://dginwx.ua/? dginwx dginwx http://aenoxy.net/?aenoxy aenoxy https://dghlpuvx.ru/?dghlpuvx dghlpuvx http://cefor.info/?cefor cefor https://afioz.biz/?afioz afioz https://deklmuvw.com/? deklmuvw deklmuvw http://cfgimox.biz/?cfgimox cfgimox http://cinquvz.ua/?cinquvz cinquvz https://acilsy.ua/?acilsy acilsy http://bfhlnop.net/?bfhlnop bfhlnop https:// fijluz.net/?fijluz fijluz https://dfjnostv.ua/?dfjnostv dfjnostv https://aginsty.net/?aginsty aginsty http://iknxyz.biz/?iknxyz iknxyz http://agopu.net/?agopu agopu http:// bdkopst.ua/?bdkopst bdkopst https://aghlox.net/?aghlox aghlox https://cdgpu.com/? cdgpu cdgpu https://jkmpuwyz.net/?jkmpuwyz jkmpuwyz http://ijlmuvw.com/?ijlmuvw ijlmuvw https://gkmrtvwy.biz/?gkmrtvwy gkmrtvwy http://aefhty.info/?aefhty aefhty http:// afjklsux.net/?afjklsux afjklsux http://mprvyz.biz/?mprvyz mprvyz https://bdjkpx.ua/?bdjkpx bdjkpx http://aciknovw.net/?aciknovw aciknovw https://aftuxy.ua/?aftuxy aftuxy https:// ajnpvxyz.ua/?ajnpvxyz ajnpvxyz http://fijqrv.com/?fijqrv fijqrv https://ahmquvw.info/? ahmquvw ahmquvw https://ghklmqvy.com/?ghklmqvy ghklmqvy http://aehkty.com/? aehkty aehkty https://aclpsuz.ru/?aclpsuz aclpsuz https://ghoqstvy.com/?ghoqstvy ghoqstvy https://dghnqrty.ru/?dghnqrty dghnqrty https://agorx.com/?agorx agorx https:// adejrv.ru/?adejrv adejrv http://hkopuvwz.net/?hkopuvwz hkopuvwz http://bcdeghvx.ru/? 
bcdeghvx bcdeghvx http://bdfgruvy.com/?bdfgruvy bdfgruvy https://cmuvy.biz/?cmuvy cmuvy https://abjkmp.ua/?abjkmp abjkmp http://abcfirvz.ru/?abcfirvz abcfirvz http:// clqruwxy.com/?clqruwxy clqruwxy https://fhikoq.net/?fhikoq fhikoq https://bfgnsvwz.biz/? bfgnsvwz bfgnsvwz http://fhjkor.biz/?fhjkor fhjkor https://dfgkouyz.com/?dfgkouyz dfgkouyz https://acfhmqtz.info/?acfhmqtz acfhmqtz https://ejmnpt.biz/?ejmnpt ejmnpt http://dfgnrtu.ru/?dfgnrtu dfgnrtu https://diknswx.biz/?diknswx diknswx https:// hjnsz.info/?hjnsz hjnsz https://cdjlq.ua/?cdjlq cdjlq https://iowxy.ru/?iowxy iowxy https:// fhjltux.net/?fhjltux fhjltux http://ablmv.com/?ablmv ablmv http://bdiorswz.ru/?bdiorswz bdiorswz http://cnopr.ru/?cnopr cnopr http://adgopz.ru/?adgopz adgopz http:// bdefimsu.ru/?bdefimsu bdefimsu http://ghmqr.net/?ghmqr ghmqr https://cdetv.ru/?cdetv cdetv https://bdkortu.biz/?bdkortu bdkortu http://djopv.biz/?djopv djopv http:// aiknorw.biz/?aiknorw aiknorw https://bfmqr.net/?bfmqr bfmqr http://klmox.com/?klmox klmox https://fimqsv.biz/?fimqsv fimqsv https://fkopq.ru/?fkopq fkopq http://aglpx.com/? aglpx aglpx http://dfhquv.biz/?dfhquv dfhquv https://bfgjoq.com/?bfgjoq bfgjoq https:// cklnz.ru/?cklnz cklnz http://cfijqrv.net/?cfijqrv cfijqrv https://aekuvxz.net/?aekuvxz aekuvxz https://ghijkstx.info/?ghijkstx ghijkstx http://abilmsz.info/?abilmsz abilmsz https:// bfhsvyz.ru/?bfhsvyz bfhsvyz http://eqsuv.ua/?eqsuv eqsuv http://ghinqsx.biz/?ghinqsx ghinqsx https://acgkmnpu.com/?acgkmnpu acgkmnpu http://elrty.ru/?elrty elrty https:// cdgkoz.net/?cdgkoz cdgkoz http://dimnosw.info/?dimnosw dimnosw http://abegnwx.biz/? abegnwx abegnwx http://abejnsx.net/?abejnsx abejnsx https://emnorxz.ru/?emnorxz emnorxz http://abfsz.ua/?abfsz abfsz https://imqrvy.net/?imqrvy imqrvy https:// cdgjmqrv.com/?cdgjmqrv cdgjmqrv http://bdfostyz.net/?bdfostyz bdfostyz https:// lnorxz.biz/?lnorxz lnorxz http://afgruvw.ru/?afgruvw afgruvw https://cefivyz.com/?cefivyz cefivyz https://gmqrt.com/?gmqrt gmqrt https://acdimqrw.info/?acdimqrw acdimqrw http://abchjm.ru/?abchjm abchjm http://alnqs.biz/?alnqs alnqs https://ijqwz.ua/?ijqwz ijqwz https://adfjstx.biz/?adfjstx adfjstx https://cghlqt.net/?cghlqt cghlqt http://ijrvyz.ua/? ijrvyz ijrvyz https://ahluv.net/?ahluv ahluv http://acdfgn.net/?acdfgn acdfgn https:// hijnru.ua/?hijnru hijnru https://ghnuvy.ru/?ghnuvy ghnuvy http://aeiky.com/?aeiky aeiky https://bdilnov.info/?bdilnov bdilnov https://cjlnpqru.biz/?cjlnpqru cjlnpqru https:// cgjqtv.biz/?cgjqtv cgjqtv https://abcnsvz.biz/?abcnsvz abcnsvz http://abegopsu.biz/? abegopsu abegopsu https://bcmstuvy.net/?bcmstuvy bcmstuvy https://bhjknsuz.net/? bhjknsuz bhjknsuz http://aceklno.info/?aceklno aceklno http://diotvyz.ru/?diotvyz diotvyz https://aefkmruw.info/?aefkmruw aefkmruw https://cdghlmnz.info/?cdghlmnz cdghlmnz https://bcgijmow.net/?bcgijmow bcgijmow http://moruwx.ua/?moruwx moruwx https:// bhjnrt.ru/?bhjnrt bhjnrt https://eimoptu.ua/?eimoptu eimoptu https://achptv.info/?achptv achptv https://boptux.biz/?boptux boptux http://acmovz.ua/?acmovz acmovz https:// abfmnpw.info/?abfmnpw abfmnpw https://afiknvz.biz/?afiknvz afiknvz http://ehnps.ua/? ehnps ehnps https://dnpuw.biz/?dnpuw dnpuw https://abflmns.biz/?abflmns abflmns https://fmuyz.biz/?fmuyz fmuyz http://bejkvz.info/?bejkvz bejkvz https://dfkmnyz.ua/? dfkmnyz dfkmnyz http://cfgnsvx.biz/?cfgnsvx cfgnsvx http://abdjsw.ru/?abdjsw abdjsw https://abelrvx.biz/?abelrvx abelrvx https://abjmoy.ru/?abjmoy abjmoy http://hmnqz.net/? 
hmnqz hmnqz https://dhkntv.ru/?dhkntv dhkntv http://bgqsyz.biz/?bgqsyz bgqsyz http:// eghkmsux.net/?eghkmsux eghkmsux https://acehlrsv.info/?acehlrsv acehlrsv https:// lnpruvy.com/?lnpruvy lnpruvy http://cefmtu.ua/?cefmtu cefmtu https://hjnqrt.net/?hjnqrt hjnqrt http://ceiktvw.ru/?ceiktvw ceiktvw http://cfjmu.info/?cfjmu cfjmu https:// cfhmn.com/?cfhmn cfhmn http://cdfqr.net/?cdfqr cdfqr http://adelmvz.ru/?adelmvz adelmvz http://abflnsvx.com/?abflnsvx abflnsvx http://bjpqv.biz/?bjpqv bjpqv https:// fjpqux.biz/?fjpqux fjpqux http://fpqtvyz.net/?fpqtvyz fpqtvyz http://cdflstwz.net/?cdflstwz cdflstwz https://egjostux.ua/?egjostux egjostux https://cjlnpy.ua/?cjlnpy cjlnpy https:// bcfgmz.com/?bcfgmz bcfgmz http://dfnouz.ua/?dfnouz dfnouz http://cfikqv.net/?cfikqv cfikqv https://dehkqw.net/?dehkqw dehkqw http://fhkqs.com/?fhkqs fhkqs http:// dijkl.com/?dijkl dijkl http://aeglu.com/?aeglu aeglu http://dgikmpy.info/?dgikmpy dgikmpy https://brsuz.info/?brsuz brsuz http://acfglopu.ru/?acfglopu acfglopu https://cdjqx.net/? cdjqx cdjqx https://aefimst.biz/?aefimst aefimst http://bfikmuw.net/?bfikmuw bfikmuw http://abhioqvw.ua/?abhioqvw abhioqvw http://abdilmo.net/?abdilmo abdilmo https:// bhlovx.info/?bhlovx bhlovx https://fgvwxz.info/?fgvwxz fgvwxz http://afhjpqsv.biz/? afhjpqsv afhjpqsv http://deilrsv.com/?deilrsv deilrsv http://cijlmnt.ru/?cijlmnt cijlmnt https://bfkmtw.ua/?bfkmtw bfkmtw https://adkxy.ru/?adkxy adkxy https://blpqxy.com/? blpqxy blpqxy https://efgilor.net/?efgilor efgilor https://abcdky.info/?abcdky abcdky https://befgikqu.ru/?befgikqu befgikqu https://ipsuvwy.com/?ipsuvwy ipsuvwy https:// fjlsu.biz/?fjlsu fjlsu https://kpquw.com/?kpquw kpquw https://bfjkoqvy.net/?bfjkoqvy bfjkoqvy http://dijopsz.biz/?dijopsz dijopsz http://acfhrsz.ua/?acfhrsz acfhrsz https:// bfhrstvy.ua/?bfhrstvy bfhrstvy http://afglw.info/?afglw afglw https://bcjqrvw.net/?bcjqrvw bcjqrvw https://hiotz.ru/?hiotz hiotz https://abefmswx.biz/?abefmswx abefmswx http:// acgkopyz.ua/?acgkopyz acgkopyz https://fijln.com/?fijln fijln http://befmquwz.info/? befmquwz befmquwz https://hijory.ua/?hijory hijory http://cdfhr.info/?cdfhr cdfhr https:// ahiksx.ua/?ahiksx ahiksx https://aghmxy.ru/?aghmxy aghmxy https://hnopqt.net/?hnopqt hnopqt https://fiklnrsw.ru/?fiklnrsw fiklnrsw http://hknpv.ru/?hknpv hknpv https:// abdfmotw.net/?abdfmotw abdfmotw http://bcfhpvx.biz/?bcfhpvx bcfhpvx http:// beimrty.net/?beimrty beimrty http://dgnsvwy.ua/?dgnsvwy dgnsvwy https:// aghjstvw.com/?aghjstvw aghjstvw http://gpqsuxy.ru/?gpqsuxy gpqsuxy https://finop.info/? finop finop http://bcdhsy.biz/?bcdhsy bcdhsy http://cfhmnvy.net/?cfhmnvy cfhmnvy https://hjopuw.net/?hjopuw hjopuw https://akmou.biz/?akmou akmou https:// fgkntuw.biz/?fgkntuw fgkntuw https://adfjsw.com/?adfjsw adfjsw https://bdjmrsty.biz/? bdjmrsty bdjmrsty http://grtuz.ru/?grtuz grtuz https://cgiqw.info/?cgiqw cgiqw http:// acenpvw.ru/?acenpvw acenpvw https://dmnrz.ru/?dmnrz dmnrz http://deglqtu.com/? deglqtu deglqtu http://ilpqrtuw.com/?ilpqrtuw ilpqrtuw https://gnoqvxz.com/?gnoqvxz gnoqvxz https://hijkmsu.ru/?hijkmsu hijkmsu http://cistxyz.biz/?cistxyz cistxyz https:// adklmns.ru/?adklmns adklmns https://hmqrwy.ua/?hmqrwy hmqrwy https://crsuv.ru/? crsuv crsuv http://abejkmoz.ua/?abejkmoz abejkmoz http://abjpqu.ua/?abjpqu abjpqu https://ehjkopuz.info/?ehjkopuz ehjkopuz http://iovwz.ua/?iovwz iovwz https:// emntvwx.net/?emntvwx emntvwx http://eghksx.info/?eghksx eghksx https://bmnps.com/? 
bmnps bmnps https://cinopuv.net/?cinopuv cinopuv http://efijnptz.biz/?efijnptz efijnptz http://cehilnp.net/?cehilnp cehilnp http://denprw.ru/?denprw denprw http://adglpwz.net/? adglpwz adglpwz http://dehimosu.biz/?dehimosu dehimosu https://abchprwy.com/? abchprwy abchprwy http://dlnory.info/?dlnory dlnory http://cehnwyz.com/?cehnwyz cehnwyz http://aghlopy.ua/?aghlopy aghlopy http://fhjksuvz.com/?fhjksuvz fhjksuvz http:// bijmy.com/?bijmy bijmy https://adhjw.com/?adhjw adhjw https://emnprwyz.info/? emnprwyz emnprwyz http://dgmnw.ru/?dgmnw dgmnw https://cdgimruv.com/?cdgimruv cdgimruv http://bcdjqst.biz/?bcdjqst bcdjqst https://lmquv.ru/?lmquv lmquv http:// degiptv.info/?degiptv degiptv http://hlnvy.ru/?hlnvy hlnvy http://gklmvx.info/?gklmvx gklmvx http://behlpstz.com/?behlpstz behlpstz https://efginz.com/?efginz efginz https:// deilmnru.biz/?deilmnru deilmnru http://bdehoqux.ru/?bdehoqux bdehoqux http:// doqvy.info/?doqvy doqvy http://jlorw.info/?jlorw jlorw https://cfmtvy.net/?cfmtvy cfmtvy http://agkmqvxy.biz/?agkmqvxy agkmqvxy https://ehnrux.ru/?ehnrux ehnrux https:// ahlmpyz.biz/?ahlmpyz ahlmpyz https://dhilqxyz.com/?dhilqxyz dhilqxyz http://beosz.biz/? beosz beosz http://jkmtvx.ru/?jkmtvx jkmtvx http://aefiuw.ua/?aefiuw aefiuw http:// bgimy.info/?bgimy bgimy https://acdfhimy.info/?acdfhimy acdfhimy http://bdfpv.info/? bdfpv bdfpv https://lnopux.info/?lnopux lnopux https://iknqstux.ua/?iknqstux iknqstux https://binxy.ru/?binxy binxy http://abflsv.net/?abflsv abflsv http://cghkmtvz.info/? cghkmtvz cghkmtvz http://flpqrxz.biz/?flpqrxz flpqrxz https://cfgnvz.ua/?cfgnvz cfgnvz https://jtuvx.com/?jtuvx jtuvx https://fgikl.ru/?fgikl fgikl http://ehikrs.ru/?ehikrs ehikrs http://beghqx.ru/?beghqx beghqx http://dhijlrsw.com/?dhijlrsw dhijlrsw http://elqst.ru/? elqst elqst https://abfklmns.biz/?abfklmns abfklmns https://cilnorw.ua/?cilnorw cilnorw https://anoqu.ua/?anoqu anoqu http://adikquz.com/?adikquz adikquz http://abcpwyz.ua/? abcpwyz abcpwyz https://fiqwxz.ru/?fiqwxz fiqwxz http://bcivw.ru/?bcivw bcivw https:// acdfnrz.ua/?acdfnrz acdfnrz https://jmopqtw.info/?jmopqtw jmopqtw https:// abdeghtv.biz/?abdeghtv abdeghtv http://aceoprw.net/?aceoprw aceoprw http://forvy.ua/? forvy forvy https://abgjknpv.com/?abgjknpv abgjknpv http://bcdgimr.info/?bcdgimr bcdgimr https://flntvw.biz/?flntvw flntvw http://bcdnqvx.net/?bcdnqvx bcdnqvx http:// defjnqvw.com/?defjnqvw defjnqvw http://ekmouvwy.biz/?ekmouvwy ekmouvwy http:// cijpqsxz.biz/?cijpqsxz cijpqsxz http://dopstyz.ua/?dopstyz dopstyz http://cinow.com/?cinow cinow http://cdejmrs.info/?cdejmrs cdejmrs http://fgknprwz.info/?fgknprwz fgknprwz https://binpquxy.com/?binpquxy binpquxy https://afhqrtu.biz/?afhqrtu afhqrtu http:// cemqtv.ru/?cemqtv cemqtv http://bgnqsz.info/?bgnqsz bgnqsz http://cdfjprsv.ua/?cdfjprsv cdfjprsv https://jmqstz.biz/?jmqstz jmqstz http://adhuw.info/?adhuw adhuw http:// deswz.ua/?deswz deswz https://ehkmnw.com/?ehkmnw ehkmnw http://kmsxy.ua/?kmsxy kmsxy http://benotvw.info/?benotvw benotvw https://dhinops.biz/?dhinops dhinops https://hklnpqtv.ru/?hklnpqtv hklnpqtv https://gjnqrtwx.info/?gjnqrtwx gjnqrtwx https:// dortvw.biz/?dortvw dortvw http://cdghjp.com/?cdghjp cdghjp http://abchnoqx.com/? abchnoqx abchnoqx http://abnsz.biz/?abnsz abnsz https://acdnqwy.info/?acdnqwy acdnqwy http://befilpv.com/?befilpv befilpv http://fijrux.biz/?fijrux fijrux https:// aknqrtuz.ru/?aknqrtuz aknqrtuz http://abfoqwz.ru/?abfoqwz abfoqwz https://defikv.info/? 
defikv defikv http://abcoqtz.com/?abcoqtz abcoqtz https://dfirxz.net/?dfirxz dfirxz http:// bdfipqux.net/?bdfipqux bdfipqux http://bcgqx.net/?bcgqx bcgqx https://bdfhl.ua/?bdfhl bdfhl http://ahikmpqt.info/?ahikmpqt ahikmpqt https://aghqrt.com/?aghqrt aghqrt https://afhosxz.ru/?afhosxz afhosxz https://ehsvx.ru/?ehsvx ehsvx http://gknquvw.info/? gknquvw gknquvw https://gilmpqw.ru/?gilmpqw gilmpqw http://fhopy.biz/?fhopy fhopy https://bcmpux.net/?bcmpux bcmpux https://adfmrsuy.info/?adfmrsuy adfmrsuy https:// fmtuvx.net/?fmtuvx fmtuvx https://defjmswy.net/?defjmswy defjmswy https:// ahijmrsz.info/?ahijmrsz ahijmrsz https://cerwx.biz/?cerwx cerwx https://behqz.biz/?behqz behqz https://behioqyz.ru/?behioqyz behioqyz https://admrsvxz.com/?admrsvxz admrsvxz https://cefhipqw.com/?cefhipqw cefhipqw http://deopsx.ru/?deopsx deopsx http:// acfgksz.info/?acfgksz acfgksz http://hjlmqst.info/?hjlmqst hjlmqst http://efgnrsux.net/? efgnrsux efgnrsux http://adflovz.info/?adflovz adflovz http://acopu.ua/?acopu acopu https://chilnqrs.ru/?chilnqrs chilnqrs https://blosy.net/?blosy blosy https://gijnpuxy.com/? gijnpuxy gijnpuxy https://bcemuvwz.ua/?bcemuvwz bcemuvwz http://bdfqwx.net/?bdfqwx bdfqwx http://dfghilr.net/?dfghilr dfghilr http://bdlqstv.com/?bdlqstv bdlqstv http:// adhjnrsz.com/?adhjnrsz adhjnrsz http://adiopqx.com/?adiopqx adiopqx http:// ehnoqw.info/?ehnoqw ehnoqw https://dgpsx.net/?dgpsx dgpsx http://afgrt.ru/?afgrt afgrt http://bghipuvz.ua/?bghipuvz bghipuvz http://egjmvw.com/?egjmvw egjmvw https:// eginoswz.com/?eginoswz eginoswz http://bekuv.info/?bekuv bekuv http://bglno.info/? bglno bglno http://hkoptxz.com/?hkoptxz hkoptxz https://aikqz.biz/?aikqz aikqz https:// oqvxy.net/?oqvxy oqvxy http://kruwy.net/?kruwy kruwy http://fhlqty.net/?fhlqty fhlqty http://chpuvyz.ru/?chpuvyz chpuvyz https://aciloqt.ua/?aciloqt aciloqt http://cklsw.info/? cklsw cklsw https://fghkmvy.info/?fghkmvy fghkmvy http://belmpvz.biz/?belmpvz belmpvz http://eimpxz.ru/?eimpxz eimpxz http://cdjstuz.info/?cdjstuz cdjstuz https://adivy.com/? adivy adivy https://alnrty.ru/?alnrty alnrty https://hkmntz.info/?hkmntz hkmntz https:// dkpqr.com/?dkpqr dkpqr https://bfntv.ru/?bfntv bfntv https://inuvwyz.ru/?inuvwyz inuvwyz https://jkmrvyz.ua/?jkmrvyz jkmrvyz http://bfipruw.biz/?bfipruw bfipruw http://cjklos.ua/? cjklos cjklos http://afglru.ru/?afglru afglru http://defrt.ua/?defrt defrt https://afghsvy.ua/? afghsvy afghsvy http://cehiou.info/?cehiou cehiou https://mnrvz.com/?mnrvz mnrvz http:// fgtuxz.biz/?fgtuxz fgtuxz http://egiqtuy.ua/?egiqtuy egiqtuy https://dilnv.ru/?dilnv dilnv http://abeny.biz/?abeny abeny http://aijvxz.biz/?aijvxz aijvxz https://abdhrsx.net/?abdhrsx abdhrsx http://bfgmrxy.ru/?bfgmrxy bfgmrxy https://fjkow.info/?fjkow fjkow https:// fkruvx.biz/?fkruvx fkruvx http://efhjsw.biz/?efhjsw efhjsw http://djkmops.info/?djkmops djkmops https://cequyz.ru/?cequyz cequyz https://hmswz.com/?hmswz hmswz http:// lnotu.info/?lnotu lnotu http://bdjqu.info/?bdjqu bdjqu http://adejrxy.ua/?adejrxy adejrxy http://cdgikmps.ru/?cdgikmps cdgikmps http://dhmostwy.net/?dhmostwy dhmostwy https://cdhlquz.ua/?cdhlquz cdhlquz https://bilnor.ua/?bilnor bilnor https://fnqxz.com/? fnqxz fnqxz https://bijnuwz.com/?bijnuwz bijnuwz http://abcgikwy.ua/?abcgikwy abcgikwy https://begimrv.ua/?begimrv begimrv https://jlprv.net/?jlprv jlprv http://efhsz.ua/?efhsz efhsz https://dgiqs.ua/?dgiqs dgiqs http://bgiprx.net/?bgiprx bgiprx https://cdklmqv.biz/? 
cdklmqv cdklmqv https://adgnowz.ua/?adgnowz adgnowz http://cfjnorty.ru/?cfjnorty cfjnorty https://bcfhpqvy.ru/?bcfhpqvy bcfhpqvy http://beiqv.biz/?beiqv beiqv https:// bjmnu.com/?bjmnu bjmnu https://fgjoquvz.ru/?fgjoquvz fgjoquvz http://fkpwx.info/?fkpwx fkpwx http://dfghty.com/?dfghty dfghty https://dhlmtvxy.biz/?dhlmtvxy dhlmtvxy https:// bejuy.com/?bejuy bejuy http://cdikry.com/?cdikry cdikry https://bfhkrty.info/?bfhkrty bfhkrty https://cekpst.com/?cekpst cekpst https://ceknpx.ru/?ceknpx ceknpx https:// gjpwy.com/?gjpwy gjpwy https://ceouwxyz.ru/?ceouwxyz ceouwxyz https://ijkopstu.ua/? ijkopstu ijkopstu https://acegjlmo.net/?acegjlmo acegjlmo http://bflnpv.info/?bflnpv bflnpv http://fijkotz.ua/?fijkotz fijkotz http://ajkptwyz.com/?ajkptwyz ajkptwyz http://adjpvw.net/? adjpvw adjpvw https://dpqrsv.biz/?dpqrsv dpqrsv https://acehklps.ua/?acehklps acehklps http://aghsuvyz.biz/?aghsuvyz aghsuvyz https://akotu.ru/?akotu akotu https://dkqrt.biz/? dkqrt dkqrt http://dlrtuxz.info/?dlrtuxz dlrtuxz http://fstvx.ua/?fstvx fstvx http://ejmsv.ua/? ejmsv ejmsv http://abfmyz.com/?abfmyz abfmyz http://cdiqrsvx.info/?cdiqrsvx cdiqrsvx https://fqsuz.info/?fqsuz fqsuz https://cilsu.info/?cilsu cilsu https://bjklnsxz.net/?bjklnsxz bjklnsxz https://demnpst.biz/?demnpst demnpst https://bghikms.biz/?bghikms bghikms http://achjtu.biz/?achjtu achjtu http://npqrwz.net/?npqrwz npqrwz http://adehilpq.net/? adehilpq adehilpq http://bnortwy.com/?bnortwy bnortwy http://cegpsuz.ru/?cegpsuz cegpsuz http://aksyz.ua/?aksyz aksyz http://egkmosux.biz/?egkmosux egkmosux https:// dimoruy.com/?dimoruy dimoruy https://ehopyz.net/?ehopyz ehopyz https://ehijptuy.com/? ehijptuy ehijptuy http://jklmprtv.ua/?jklmprtv jklmprtv http://abmnrw.ru/?abmnrw abmnrw http://abcfqu.ru/?abcfqu abcfqu https://degijyz.ru/?degijyz degijyz https:// cikortuy.net/?cikortuy cikortuy http://abgnrw.ua/?abgnrw abgnrw http://bfgknpqt.com/? bfgknpqt bfgknpqt https://afmqtv.ru/?afmqtv afmqtv https://bertuvwz.info/?bertuvwz bertuvwz https://eipuvwxy.com/?eipuvwxy eipuvwxy http://ahjory.info/?ahjory ahjory http://acefvx.com/?acefvx acefvx http://acdlo.com/?acdlo acdlo http://cdjqs.biz/?cdjqs cdjqs http://bdexy.net/?bdexy bdexy http://bktxz.info/?bktxz bktxz http://adegmovy.biz/? adegmovy adegmovy http://fkxyz.ua/?fkxyz fkxyz http://adehinxy.ru/?adehinxy adehinxy http://ahmnwy.biz/?ahmnwy ahmnwy http://morst.ru/?morst morst http://bfils.ua/?bfils bfils http://abcmnvw.biz/?abcmnvw abcmnvw https://bchlqsux.net/?bchlqsux bchlqsux http://fhikpqvy.net/?fhikpqvy fhikpqvy https://bejkqu.info/?bejkqu bejkqu https:// cdehnsxy.ru/?cdehnsxy cdehnsxy http://bcdhnopx.info/?bcdhnopx bcdhnopx https:// elpuz.ua/?elpuz elpuz https://bdeftvw.info/?bdeftvw bdeftvw http://dgmnou.ru/?dgmnou dgmnou https://aiklnvx.com/?aiklnvx aiklnvx http://ijoqrsvw.com/?ijoqrsvw ijoqrsvw http:// cdehmquy.biz/?cdehmquy cdehmquy https://gjoqx.net/?gjoqx gjoqx https:// bhlnpqvw.net/?bhlnpqvw bhlnpqvw http://bdfiuwyz.net/?bdfiuwyz bdfiuwyz https:// abesty.info/?abesty abesty https://gjlmtuv.net/?gjlmtuv gjlmtuv http://ivxyz.info/?ivxyz ivxyz http://aeops.info/?aeops aeops http://acdlv.ru/?acdlv acdlv http://fgimqw.net/? 
fgimqw fgimqw https://fgkrxz.info/?fgkrxz fgkrxz http://abfhnvxy.ua/?abfhnvxy abfhnvxy https://bcdeipsy.ru/?bcdeipsy bcdeipsy http://bdimvwy.ru/?bdimvwy bdimvwy http:// fnpqry.info/?fnpqry fnpqry https://aklstwx.ru/?aklstwx aklstwx http://bjloq.com/?bjloq bjloq http://dkmsuv.info/?dkmsuv dkmsuv http://afinptuw.ru/?afinptuw afinptuw https:// bijnqvx.biz/?bijnqvx bijnqvx https://behlxz.info/?behlxz behlxz http://bgjkquyz.biz/? bgjkquyz bgjkquyz http://aehnpqw.net/?aehnpqw aehnpqw http://cghjmpqu.com/? cghjmpqu cghjmpqu http://aboruy.com/?aboruy aboruy http://fjryz.ru/?fjryz fjryz https:// cdnqy.info/?cdnqy cdnqy http://bfgjmost.net/?bfgjmost bfgjmost https://bhnou.ua/?bhnou bhnou https://dhilm.com/?dhilm dhilm http://abijmrtx.net/?abijmrtx abijmrtx http:// acefnv.com/?acefnv acefnv http://gilmtu.com/?gilmtu gilmtu https://egjltuz.biz/?egjltuz egjltuz http://afghpsv.net/?afghpsv afghpsv http://chkltvx.ua/?chkltvx chkltvx http:// befkyz.com/?befkyz befkyz https://bcfhmprw.com/?bcfhmprw bcfhmprw https:// bjklpqs.info/?bjklpqs bjklpqs http://cegpt.ua/?cegpt cegpt https://klntvw.biz/?klntvw klntvw https://ejkmq.ua/?ejkmq ejkmq https://bgqvx.info/?bgqvx bgqvx https:// dqrsuvw.com/?dqrsuvw dqrsuvw http://bcekrsu.net/?bcekrsu bcekrsu https:// dlmqwy.com/?dlmqwy dlmqwy http://abhnsvz.ru/?abhnsvz abhnsvz https://bdfhjoqr.info/? bdfhjoqr bdfhjoqr http://ceijlprw.net/?ceijlprw ceijlprw https://ijlqsw.net/?ijlqsw ijlqsw https://einrvxz.net/?einrvxz einrvxz https://ikmtxz.net/?ikmtxz ikmtxz http://adkmqsz.biz/? adkmqsz adkmqsz http://bchnox.com/?bchnox bchnox https://ckptvw.com/?ckptvw ckptvw http://dgops.com/?dgops dgops https://bfnrtxy.info/?bfnrtxy bfnrtxy http:// bdnpstu.info/?bdnpstu bdnpstu http://dlnprsx.ua/?dlnprsx dlnprsx https://gjnrx.com/? gjnrx gjnrx http://abcsvz.com/?abcsvz abcsvz http://gknrsv.ru/?gknrsv gknrsv https:// fkmouxz.com/?fkmouxz fkmouxz http://bcpsx.com/?bcpsx bcpsx http://bckoqx.info/? bckoqx bckoqx https://bcdepq.ru/?bcdepq bcdepq http://kqswy.net/?kqswy kqswy https:// bdgjnuy.ru/?bdgjnuy bdgjnuy https://cgiklsvz.info/?cgiklsvz cgiklsvz https://bfgnprwy.net/? bfgnprwy bfgnprwy https://bejmnqrw.com/?bejmnqrw bejmnqrw http://gilopqwy.ua/? gilopqwy gilopqwy https://fjopry.ru/?fjopry fjopry http://kmrvwy.biz/?kmrvwy kmrvwy https://eglotuv.net/?eglotuv eglotuv http://cdhnsvwx.com/?cdhnsvwx cdhnsvwx http:// ahmqv.com/?ahmqv ahmqv https://fkqxy.ua/?fkqxy fkqxy https://gjoquvwx.net/?gjoquvwx gjoquvwx https://abcfouw.biz/?abcfouw abcfouw https://abcekn.ru/?abcekn abcekn http:// ijnruvz.com/?ijnruvz ijnruvz http://mqruv.ua/?mqruv mqruv https://afklmu.biz/?afklmu afklmu https://abcgptyz.biz/?abcgptyz abcgptyz https://bfjsuz.ua/?bfjsuz bfjsuz http:// ahmpsx.ua/?ahmpsx ahmpsx https://iklns.com/?iklns iklns https://hlnvx.ua/?hlnvx hlnvx http://adiklmp.com/?adiklmp adiklmp https://fghprs.biz/?fghprs fghprs http:// acmopvw.ru/?acmopvw acmopvw http://cdmuw.ua/?cdmuw cdmuw http://bfkpwxz.biz/? bfkpwxz bfkpwxz http://lqrtwy.ru/?lqrtwy lqrtwy https://aclmruvw.biz/?aclmruvw aclmruvw http://bcfhmsz.net/?bcfhmsz bcfhmsz http://bcgknpry.biz/?bcgknpry bcgknpry https:// bflmnoq.info/?bflmnoq bflmnoq https://bepsy.info/?bepsy bepsy http://gknosuw.net/? gknosuw gknosuw http://ajoruvz.com/?ajoruvz ajoruvz http://cotxz.ru/?cotxz cotxz https:// deintwx.net/?deintwx deintwx https://eghijkuy.com/?eghijkuy eghijkuy http://dginwx.ua/? dginwx dginwx http://aenoxy.net/?aenoxy aenoxy https://dghlpuvx.ru/?dghlpuvx dghlpuvx http://cefor.info/?cefor cefor https://afioz.biz/?afioz afioz https://deklmuvw.com/? 
deklmuvw deklmuvw http://cfgimox.biz/?cfgimox cfgimox http://cinquvz.ua/?cinquvz cinquvz https://acilsy.ua/?acilsy acilsy http://bfhlnop.net/?bfhlnop bfhlnop https:// fijluz.net/?fijluz fijluz https://dfjnostv.ua/?dfjnostv dfjnostv https://aginsty.net/?aginsty aginsty http://iknxyz.biz/?iknxyz iknxyz http://agopu.net/?agopu agopu http:// bdkopst.ua/?bdkopst bdkopst https://aghlox.net/?aghlox aghlox https://cdgpu.com/? cdgpu cdgpu https://jkmpuwyz.net/?jkmpuwyz jkmpuwyz http://ijlmuvw.com/?ijlmuvw ijlmuvw https://gkmrtvwy.biz/?gkmrtvwy gkmrtvwy http://aefhty.info/?aefhty aefhty http:// afjklsux.net/?afjklsux afjklsux http://mprvyz.biz/?mprvyz mprvyz https://bdjkpx.ua/?bdjkpx bdjkpx http://aciknovw.net/?aciknovw aciknovw https://aftuxy.ua/?aftuxy aftuxy https:// ajnpvxyz.ua/?ajnpvxyz ajnpvxyz http://fijqrv.com/?fijqrv fijqrv https://ahmquvw.info/? ahmquvw ahmquvw https://ghklmqvy.com/?ghklmqvy ghklmqvy http://aehkty.com/? aehkty aehkty https://aclpsuz.ru/?aclpsuz aclpsuz https://ghoqstvy.com/?ghoqstvy ghoqstvy https://dghnqrty.ru/?dghnqrty dghnqrty https://agorx.com/?agorx agorx https:// adejrv.ru/?adejrv adejrv http://hkopuvwz.net/?hkopuvwz hkopuvwz http://bcdeghvx.ru/? bcdeghvx bcdeghvx http://bdfgruvy.com/?bdfgruvy bdfgruvy https://cmuvy.biz/?cmuvy cmuvy https://abjkmp.ua/?abjkmp abjkmp http://abcfirvz.ru/?abcfirvz abcfirvz http:// clqruwxy.com/?clqruwxy clqruwxy https://fhikoq.net/?fhikoq fhikoq https://bfgnsvwz.biz/? bfgnsvwz bfgnsvwz http://fhjkor.biz/?fhjkor fhjkor https://dfgkouyz.com/?dfgkouyz dfgkouyz https://acfhmqtz.info/?acfhmqtz acfhmqtz https://ejmnpt.biz/?ejmnpt ejmnpt http://dfgnrtu.ru/?dfgnrtu dfgnrtu https://diknswx.biz/?diknswx diknswx https:// hjnsz.info/?hjnsz hjnsz https://cdjlq.ua/?cdjlq cdjlq https://iowxy.ru/?iowxy iowxy https:// fhjltux.net/?fhjltux fhjltux http://ablmv.com/?ablmv ablmv http://bdiorswz.ru/?bdiorswz bdiorswz http://cnopr.ru/?cnopr cnopr http://adgopz.ru/?adgopz adgopz http:// bdefimsu.ru/?bdefimsu bdefimsu http://ghmqr.net/?ghmqr ghmqr https://cdetv.ru/?cdetv cdetv https://bdkortu.biz/?bdkortu bdkortu http://djopv.biz/?djopv djopv http:// aiknorw.biz/?aiknorw aiknorw https://bfmqr.net/?bfmqr bfmqr http://klmox.com/?klmox klmox https://fimqsv.biz/?fimqsv fimqsv https://fkopq.ru/?fkopq fkopq http://aglpx.com/? aglpx aglpx http://dfhquv.biz/?dfhquv dfhquv https://bfgjoq.com/?bfgjoq bfgjoq https:// cklnz.ru/?cklnz cklnz http://cfijqrv.net/?cfijqrv cfijqrv https://aekuvxz.net/?aekuvxz aekuvxz https://ghijkstx.info/?ghijkstx ghijkstx http://abilmsz.info/?abilmsz abilmsz https:// bfhsvyz.ru/?bfhsvyz bfhsvyz http://eqsuv.ua/?eqsuv eqsuv http://ghinqsx.biz/?ghinqsx ghinqsx https://acgkmnpu.com/?acgkmnpu acgkmnpu http://elrty.ru/?elrty elrty https:// cdgkoz.net/?cdgkoz cdgkoz http://dimnosw.info/?dimnosw dimnosw http://abegnwx.biz/? abegnwx abegnwx http://abejnsx.net/?abejnsx abejnsx https://emnorxz.ru/?emnorxz emnorxz http://abfsz.ua/?abfsz abfsz https://imqrvy.net/?imqrvy imqrvy https:// cdgjmqrv.com/?cdgjmqrv cdgjmqrv http://bdfostyz.net/?bdfostyz bdfostyz https:// lnorxz.biz/?lnorxz lnorxz http://afgruvw.ru/?afgruvw afgruvw https://cefivyz.com/?cefivyz cefivyz https://gmqrt.com/?gmqrt gmqrt https://acdimqrw.info/?acdimqrw acdimqrw http://abchjm.ru/?abchjm abchjm http://alnqs.biz/?alnqs alnqs https://ijqwz.ua/?ijqwz ijqwz https://adfjstx.biz/?adfjstx adfjstx https://cghlqt.net/?cghlqt cghlqt http://ijrvyz.ua/? 
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From knepley at gmail.com  Wed Nov 28 21:34:46 2018
From: knepley at gmail.com (Matthew Knepley)
Date: Wed, 28 Nov 2018 22:34:46 -0500
Subject: [petsc-users] Error: DM global to natural SF was not created
 when DMSetUseNatural has already been called
In-Reply-To: 
References: 
Message-ID: 

On Wed, Nov 28, 2018 at 8:58 PM Danyang Su via petsc-users <
petsc-users at mcs.anl.gov> wrote:

> Dear All,
>
> I got the following error when using DMPlexGlobalToNatural function
> using 1 processor.
>

We do not create that mapping on 1 proc because the orderings are the
same. Reordering happens when we redistribute.

  Thanks,

    Matt

> [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [0]PETSC ERROR: Object is in wrong state
> [0]PETSC ERROR: DM global to natural SF was not created.
> You must call DMSetUseNatural() before DMPlexDistribute().
>
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.10.2, Oct, 09, 2018
>
> The same code does not return error when using more than 2 processors,
> however, the vec_natural is always zero after calling
> DMPlexGlobalToNaturalEnd.
>
> DMSetUseNatural() has already been used before calling DMPlexDistribute.
> The code section looks like below
>
> if (rank == 0) then
>
>   call DMPlexCreateFromCellList(Petsc_Comm_World,ndim,0,0, &
>                                 num_nodes_per_cell, &
>                                 Petsc_False,dmplex_cells,ndim, & !use Petsc_True to create intermediate mesh entities (faces, edges),
>                                 dmplex_verts,dmda_flow%da,ierr)
>   CHKERRQ(ierr)
>
> end if
>
> !c Set the flag for creating a mapping to the natural order on distribution
> call DMSetUseNatural(dmda_flow%da,PETSC_TRUE,ierr)
> CHKERRQ(ierr)
>
> !c distribute mesh over processes
> call DMPlexDistribute(dmda_flow%da,stencil_width, &
>                       PETSC_NULL_SF, distributedMesh,ierr)
> CHKERRQ(ierr)
>
> !c destroy original global mesh after distribution
> if (distributedMesh /= PETSC_NULL_DM) then
>   call DMDestroy(dmda_flow%da,ierr)
>   CHKERRQ(ierr)
>   !c set the global mesh as distributed mesh
>   dmda_flow%da = distributedMesh
> end if
>
> ...
>
> call DMPlexCreateSection(dmda_flow%da,dmda_flow%dim, &
>                          numFields,pNumComp,pNumDof, &
>                          numBC,pBcField, &
>                          pBcCompIS,pBcPointIS, &
>                          PETSC_NULL_IS, &
>                          section,ierr)
> CHKERRQ(ierr)
>
> call PetscSectionSetFieldName(section,0,'flow',ierr)
> CHKERRQ(ierr)
>
> call DMSetSection(dmda_flow%da,section,ierr)
> CHKERRQ(ierr)
>
> call PetscSectionDestroy(section,ierr)
> CHKERRQ(ierr)
>
> call DMSetUp(dmda_flow%da,ierr)
> CHKERRQ(ierr)
>
> ...
> > > !c global - natural order > > call DMCreateLocalVector(dmda_flow%da,vec_loc,ierr) > CHKERRQ(ierr) > > call DMCreateGlobalVector(dmda_flow%da,vec_global,ierr) > CHKERRQ(ierr) > > call DMCreateGlobalVector(dmda_flow%da,vec_natural,ierr) > CHKERRQ(ierr) > > !c zero entries > call VecZeroEntries(vec_loc,ierr) > CHKERRQ(ierr) > > !Get a pointer to vector data when you need access to the array > call VecGetArrayF90(vec_loc,vecpointer,ierr) > CHKERRQ(ierr) > > do inode = 1, num_nodes > vecpointer(inode) = node_idx_lg2pg(inode) !vector value using > PETSc global order, negative ghost index has been reversed > end do > > !Restore the vector when you no longer need access to the array > call VecRestoreArrayF90(vec_loc,vecpointer,ierr) > CHKERRQ(ierr) > > !Insert values into global vector > call DMLocalToGlobalBegin(dmda_flow%da,vec_loc,INSERT_VALUES, & > vec_global,ierr) > CHKERRQ(ierr) > > call DMLocalToGlobalEnd(dmda_flow%da,vec_loc,INSERT_VALUES, & > vec_global,ierr) > CHKERRQ(ierr) > > > !c global to natural ordering > call DMPlexGlobalToNaturalBegin(dmda_flow%da,vec_global, & > vec_natural,ierr) > CHKERRQ(ierr) > > call DMPlexGlobalToNaturalEnd(dmda_flow%da,vec_global, & > vec_natural,ierr) > CHKERRQ(ierr) > > > Is there anything missing in the code that DMPlexGlobalToNatural... does > not work properly? > > Thanks, > > Danyang > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From danyang.su at gmail.com Wed Nov 28 22:50:02 2018 From: danyang.su at gmail.com (Danyang Su) Date: Wed, 28 Nov 2018 20:50:02 -0800 Subject: [petsc-users] Error: DM global to natural SF was not created when DMSetUseNatural has already been called In-Reply-To: References: Message-ID: <727b0222-dab6-b37b-295f-4af6248e859a@gmail.com> Hi Matthew, Thanks for pointing out. The problem is that vec_natural is always zero after calling DMPlexGlobalToNaturalEnd. I am afraid something is missing in my code. Thanks, Danyang On 18-11-28 07:34 PM, Matthew Knepley wrote: > On Wed, Nov 28, 2018 at 8:58 PM Danyang Su via petsc-users > > wrote: > > Dear All, > > I got the following error when using DMPlexGlobalToNatural function > using 1 processor. > > > We do not create that mapping on 1 proc because the orderings are the > same. Reordering happens > when we redistribute. > > Thanks, > > Matt > > [0]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > [0]PETSC ERROR: Object is in wrong state > [0]PETSC ERROR: DM global to natural SF was not created. > You must call DMSetUseNatural() before DMPlexDistribute(). > > [0]PETSC ERROR: See > http://www.mcs.anl.gov/petsc/documentation/faq.html > for trouble shooting. > [0]PETSC ERROR: Petsc Release Version 3.10.2, Oct, 09, 2018 > > The same code does not return error when using more than 2 > processors, > however, the vec_natural is always zero after calling > DMPlexGlobalToNaturalEnd. > > > DMSetUseNatural() has already been used before calling > DMPlexDistribute. 
> The code section looks like below > > if (rank == 0) then > > call DMPlexCreateFromCellList(Petsc_Comm_World,ndim,0,0, & > num_nodes_per_cell, & > Petsc_False,dmplex_cells,ndim, & !use > Petsc_True > to create intermediate mesh entities (faces, edges), > dmplex_verts,dmda_flow%da,ierr) > CHKERRQ(ierr) > > end if > > !c Set the flag for creating a mapping to the natural order on > distribution > call DMSetUseNatural(dmda_flow%da,PETSC_TRUE,ierr) > CHKERRQ(ierr) > > !c distribute mesh over processes > call DMPlexDistribute(dmda_flow%da,stencil_width, & > PETSC_NULL_SF, distributedMesh,ierr) > > CHKERRQ(ierr) > > !c destroy original global mesh after distribution > if (distributedMesh /= PETSC_NULL_DM) then > call DMDestroy(dmda_flow%da,ierr) > CHKERRQ(ierr) > !c set the global mesh as distributed mesh > dmda_flow%da = distributedMesh > end if > > ... > > call DMPlexCreateSection(dmda_flow%da,dmda_flow%dim, & > numFields,pNumComp,pNumDof, & > numBC,pBcField, & > pBcCompIS,pBcPointIS, & > PETSC_NULL_IS, & > section,ierr) > CHKERRQ(ierr) > > call PetscSectionSetFieldName(section,0,'flow',ierr) > CHKERRQ(ierr) > > > call DMSetSection(dmda_flow%da,section,ierr) > CHKERRQ(ierr) > > call PetscSectionDestroy(section,ierr) > CHKERRQ(ierr) > > call DMSetUp(dmda_flow%da,ierr) > CHKERRQ(ierr) > > ... > > > !c global - natural order > > call DMCreateLocalVector(dmda_flow%da,vec_loc,ierr) > CHKERRQ(ierr) > > call DMCreateGlobalVector(dmda_flow%da,vec_global,ierr) > CHKERRQ(ierr) > > call DMCreateGlobalVector(dmda_flow%da,vec_natural,ierr) > CHKERRQ(ierr) > > !c zero entries > call VecZeroEntries(vec_loc,ierr) > CHKERRQ(ierr) > > !Get a pointer to vector data when you need access to the array > call VecGetArrayF90(vec_loc,vecpointer,ierr) > CHKERRQ(ierr) > > do inode = 1, num_nodes > vecpointer(inode) = node_idx_lg2pg(inode) !vector value > using > PETSc global order, negative ghost index has been reversed > end do > > !Restore the vector when you no longer need access to the array > call VecRestoreArrayF90(vec_loc,vecpointer,ierr) > CHKERRQ(ierr) > > !Insert values into global vector > call DMLocalToGlobalBegin(dmda_flow%da,vec_loc,INSERT_VALUES, & > vec_global,ierr) > CHKERRQ(ierr) > > call DMLocalToGlobalEnd(dmda_flow%da,vec_loc,INSERT_VALUES, & > vec_global,ierr) > CHKERRQ(ierr) > > > !c global to natural ordering > call DMPlexGlobalToNaturalBegin(dmda_flow%da,vec_global, & > vec_natural,ierr) > CHKERRQ(ierr) > > call DMPlexGlobalToNaturalEnd(dmda_flow%da,vec_global, & > vec_natural,ierr) > CHKERRQ(ierr) > > > Is there anything missing in the code that > DMPlexGlobalToNatural... does > not work properly? > > Thanks, > > Danyang > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which > their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: 
https://abcnsvz.biz/?abcnsvz abcnsvz http://abegopsu.biz/?abegopsu abegopsu https://bcmstuvy.net/?bcmstuvy bcmstuvy https://bhjknsuz.net/?bhjknsuz bhjknsuz http://aceklno.info/?aceklno aceklno http://diotvyz.ru/?diotvyz diotvyz https://aefkmruw.info/?aefkmruw aefkmruw https://cdghlmnz.info/?cdghlmnz cdghlmnz https://bcgijmow.net/?bcgijmow bcgijmow http://moruwx.ua/?moruwx moruwx https://bhjnrt.ru/?bhjnrt bhjnrt https://eimoptu.ua/?eimoptu eimoptu https://achptv.info/?achptv achptv https://boptux.biz/?boptux boptux http://acmovz.ua/?acmovz acmovz https://abfmnpw.info/?abfmnpw abfmnpw https://afiknvz.biz/?afiknvz afiknvz http://ehnps.ua/?ehnps ehnps https://dnpuw.biz/?dnpuw dnpuw https://abflmns.biz/?abflmns abflmns https://fmuyz.biz/?fmuyz fmuyz http://bejkvz.info/?bejkvz bejkvz https://dfkmnyz.ua/?dfkmnyz dfkmnyz http://cfgnsvx.biz/?cfgnsvx cfgnsvx http://abdjsw.ru/?abdjsw abdjsw https://abelrvx.biz/?abelrvx abelrvx https://abjmoy.ru/?abjmoy abjmoy http://hmnqz.net/?hmnqz hmnqz https://dhkntv.ru/?dhkntv dhkntv http://bgqsyz.biz/?bgqsyz bgqsyz http://eghkmsux.net/?eghkmsux eghkmsux https://acehlrsv.info/?acehlrsv acehlrsv https://lnpruvy.com/?lnpruvy lnpruvy http://cefmtu.ua/?cefmtu cefmtu https://hjnqrt.net/?hjnqrt hjnqrt http://ceiktvw.ru/?ceiktvw ceiktvw http://cfjmu.info/?cfjmu cfjmu https://cfhmn.com/?cfhmn cfhmn http://cdfqr.net/?cdfqr cdfqr http://adelmvz.ru/?adelmvz adelmvz http://abflnsvx.com/?abflnsvx abflnsvx http://bjpqv.biz/?bjpqv bjpqv https://fjpqux.biz/?fjpqux fjpqux http://fpqtvyz.net/?fpqtvyz fpqtvyz http://cdflstwz.net/?cdflstwz cdflstwz https://egjostux.ua/?egjostux egjostux https://cjlnpy.ua/?cjlnpy cjlnpy https://bcfgmz.com/?bcfgmz bcfgmz http://dfnouz.ua/?dfnouz dfnouz http://cfikqv.net/?cfikqv cfikqv https://dehkqw.net/?dehkqw dehkqw http://fhkqs.com/?fhkqs fhkqs http://dijkl.com/?dijkl dijkl http://aeglu.com/?aeglu aeglu http://dgikmpy.info/?dgikmpy dgikmpy https://brsuz.info/?brsuz brsuz http://acfglopu.ru/?acfglopu acfglopu https://cdjqx.net/?cdjqx cdjqx https://aefimst.biz/?aefimst aefimst http://bfikmuw.net/?bfikmuw bfikmuw http://abhioqvw.ua/?abhioqvw abhioqvw http://abdilmo.net/?abdilmo abdilmo https://bhlovx.info/?bhlovx bhlovx https://fgvwxz.info/?fgvwxz fgvwxz http://afhjpqsv.biz/?afhjpqsv afhjpqsv http://deilrsv.com/?deilrsv deilrsv http://cijlmnt.ru/?cijlmnt cijlmnt https://bfkmtw.ua/?bfkmtw bfkmtw https://adkxy.ru/?adkxy adkxy https://blpqxy.com/?blpqxy blpqxy https://efgilor.net/?efgilor efgilor https://abcdky.info/?abcdky abcdky https://befgikqu.ru/?befgikqu befgikqu https://ipsuvwy.com/?ipsuvwy ipsuvwy https://fjlsu.biz/?fjlsu fjlsu https://kpquw.com/?kpquw kpquw https://bfjkoqvy.net/?bfjkoqvy bfjkoqvy http://dijopsz.biz/?dijopsz dijopsz http://acfhrsz.ua/?acfhrsz acfhrsz https://bfhrstvy.ua/?bfhrstvy bfhrstvy http://afglw.info/?afglw afglw https://bcjqrvw.net/?bcjqrvw bcjqrvw https://hiotz.ru/?hiotz hiotz https://abefmswx.biz/?abefmswx abefmswx http://acgkopyz.ua/?acgkopyz acgkopyz https://fijln.com/?fijln fijln http://befmquwz.info/?befmquwz befmquwz https://hijory.ua/?hijory hijory http://cdfhr.info/?cdfhr cdfhr https://ahiksx.ua/?ahiksx ahiksx https://aghmxy.ru/?aghmxy aghmxy https://hnopqt.net/?hnopqt hnopqt https://fiklnrsw.ru/?fiklnrsw fiklnrsw http://hknpv.ru/?hknpv hknpv https://abdfmotw.net/?abdfmotw abdfmotw http://bcfhpvx.biz/?bcfhpvx bcfhpvx http://beimrty.net/?beimrty beimrty http://dgnsvwy.ua/?dgnsvwy dgnsvwy https://aghjstvw.com/?aghjstvw aghjstvw http://gpqsuxy.ru/?gpqsuxy gpqsuxy https://finop.info/?finop finop 
http://bcdhsy.biz/?bcdhsy bcdhsy http://cfhmnvy.net/?cfhmnvy cfhmnvy https://hjopuw.net/?hjopuw hjopuw https://akmou.biz/?akmou akmou https://fgkntuw.biz/?fgkntuw fgkntuw https://adfjsw.com/?adfjsw adfjsw https://bdjmrsty.biz/?bdjmrsty bdjmrsty http://grtuz.ru/?grtuz grtuz https://cgiqw.info/?cgiqw cgiqw http://acenpvw.ru/?acenpvw acenpvw https://dmnrz.ru/?dmnrz dmnrz http://deglqtu.com/?deglqtu deglqtu http://ilpqrtuw.com/?ilpqrtuw ilpqrtuw https://gnoqvxz.com/?gnoqvxz gnoqvxz https://hijkmsu.ru/?hijkmsu hijkmsu http://cistxyz.biz/?cistxyz cistxyz https://adklmns.ru/?adklmns adklmns https://hmqrwy.ua/?hmqrwy hmqrwy https://crsuv.ru/?crsuv crsuv http://abejkmoz.ua/?abejkmoz abejkmoz http://abjpqu.ua/?abjpqu abjpqu https://ehjkopuz.info/?ehjkopuz ehjkopuz http://iovwz.ua/?iovwz iovwz https://emntvwx.net/?emntvwx emntvwx http://eghksx.info/?eghksx eghksx https://bmnps.com/?bmnps bmnps https://cinopuv.net/?cinopuv cinopuv http://efijnptz.biz/?efijnptz efijnptz http://cehilnp.net/?cehilnp cehilnp http://denprw.ru/?denprw denprw http://adglpwz.net/?adglpwz adglpwz http://dehimosu.biz/?dehimosu dehimosu https://abchprwy.com/?abchprwy abchprwy http://dlnory.info/?dlnory dlnory http://cehnwyz.com/?cehnwyz cehnwyz http://aghlopy.ua/?aghlopy aghlopy http://fhjksuvz.com/?fhjksuvz fhjksuvz http://bijmy.com/?bijmy bijmy https://adhjw.com/?adhjw adhjw https://emnprwyz.info/?emnprwyz emnprwyz http://dgmnw.ru/?dgmnw dgmnw https://cdgimruv.com/?cdgimruv cdgimruv http://bcdjqst.biz/?bcdjqst bcdjqst https://lmquv.ru/?lmquv lmquv http://degiptv.info/?degiptv degiptv http://hlnvy.ru/?hlnvy hlnvy http://gklmvx.info/?gklmvx gklmvx http://behlpstz.com/?behlpstz behlpstz https://efginz.com/?efginz efginz https://deilmnru.biz/?deilmnru deilmnru http://bdehoqux.ru/?bdehoqux bdehoqux http://doqvy.info/?doqvy doqvy http://jlorw.info/?jlorw jlorw https://cfmtvy.net/?cfmtvy cfmtvy http://agkmqvxy.biz/?agkmqvxy agkmqvxy https://ehnrux.ru/?ehnrux ehnrux https://ahlmpyz.biz/?ahlmpyz ahlmpyz https://dhilqxyz.com/?dhilqxyz dhilqxyz http://beosz.biz/?beosz beosz http://jkmtvx.ru/?jkmtvx jkmtvx http://aefiuw.ua/?aefiuw aefiuw http://bgimy.info/?bgimy bgimy https://acdfhimy.info/?acdfhimy acdfhimy http://bdfpv.info/?bdfpv bdfpv https://lnopux.info/?lnopux lnopux https://iknqstux.ua/?iknqstux iknqstux https://binxy.ru/?binxy binxy http://abflsv.net/?abflsv abflsv http://cghkmtvz.info/?cghkmtvz cghkmtvz http://flpqrxz.biz/?flpqrxz flpqrxz https://cfgnvz.ua/?cfgnvz cfgnvz https://jtuvx.com/?jtuvx jtuvx https://fgikl.ru/?fgikl fgikl http://ehikrs.ru/?ehikrs ehikrs http://beghqx.ru/?beghqx beghqx http://dhijlrsw.com/?dhijlrsw dhijlrsw http://elqst.ru/?elqst elqst https://abfklmns.biz/?abfklmns abfklmns https://cilnorw.ua/?cilnorw cilnorw https://anoqu.ua/?anoqu anoqu http://adikquz.com/?adikquz adikquz http://abcpwyz.ua/?abcpwyz abcpwyz https://fiqwxz.ru/?fiqwxz fiqwxz http://bcivw.ru/?bcivw bcivw https://acdfnrz.ua/?acdfnrz acdfnrz https://jmopqtw.info/?jmopqtw jmopqtw https://abdeghtv.biz/?abdeghtv abdeghtv http://aceoprw.net/?aceoprw aceoprw http://forvy.ua/?forvy forvy https://abgjknpv.com/?abgjknpv abgjknpv http://bcdgimr.info/?bcdgimr bcdgimr https://flntvw.biz/?flntvw flntvw http://bcdnqvx.net/?bcdnqvx bcdnqvx http://defjnqvw.com/?defjnqvw defjnqvw http://ekmouvwy.biz/?ekmouvwy ekmouvwy http://cijpqsxz.biz/?cijpqsxz cijpqsxz http://dopstyz.ua/?dopstyz dopstyz http://cinow.com/?cinow cinow http://cdejmrs.info/?cdejmrs cdejmrs http://fgknprwz.info/?fgknprwz fgknprwz https://binpquxy.com/?binpquxy binpquxy 
https://afhqrtu.biz/?afhqrtu afhqrtu http://cemqtv.ru/?cemqtv cemqtv http://bgnqsz.info/?bgnqsz bgnqsz http://cdfjprsv.ua/?cdfjprsv cdfjprsv https://jmqstz.biz/?jmqstz jmqstz http://adhuw.info/?adhuw adhuw http://deswz.ua/?deswz deswz https://ehkmnw.com/?ehkmnw ehkmnw http://kmsxy.ua/?kmsxy kmsxy http://benotvw.info/?benotvw benotvw https://dhinops.biz/?dhinops dhinops https://hklnpqtv.ru/?hklnpqtv hklnpqtv https://gjnqrtwx.info/?gjnqrtwx gjnqrtwx https://dortvw.biz/?dortvw dortvw http://cdghjp.com/?cdghjp cdghjp http://abchnoqx.com/?abchnoqx abchnoqx http://abnsz.biz/?abnsz abnsz https://acdnqwy.info/?acdnqwy acdnqwy http://befilpv.com/?befilpv befilpv http://fijrux.biz/?fijrux fijrux https://aknqrtuz.ru/?aknqrtuz aknqrtuz http://abfoqwz.ru/?abfoqwz abfoqwz https://defikv.info/?defikv defikv http://abcoqtz.com/?abcoqtz abcoqtz https://dfirxz.net/?dfirxz dfirxz http://bdfipqux.net/?bdfipqux bdfipqux http://bcgqx.net/?bcgqx bcgqx https://bdfhl.ua/?bdfhl bdfhl http://ahikmpqt.info/?ahikmpqt ahikmpqt https://aghqrt.com/?aghqrt aghqrt https://afhosxz.ru/?afhosxz afhosxz https://ehsvx.ru/?ehsvx ehsvx http://gknquvw.info/?gknquvw gknquvw https://gilmpqw.ru/?gilmpqw gilmpqw http://fhopy.biz/?fhopy fhopy https://bcmpux.net/?bcmpux bcmpux https://adfmrsuy.info/?adfmrsuy adfmrsuy https://fmtuvx.net/?fmtuvx fmtuvx https://defjmswy.net/?defjmswy defjmswy https://ahijmrsz.info/?ahijmrsz ahijmrsz https://cerwx.biz/?cerwx cerwx https://behqz.biz/?behqz behqz https://behioqyz.ru/?behioqyz behioqyz https://admrsvxz.com/?admrsvxz admrsvxz https://cefhipqw.com/?cefhipqw cefhipqw http://deopsx.ru/?deopsx deopsx http://acfgksz.info/?acfgksz acfgksz http://hjlmqst.info/?hjlmqst hjlmqst http://efgnrsux.net/?efgnrsux efgnrsux http://adflovz.info/?adflovz adflovz http://acopu.ua/?acopu acopu https://chilnqrs.ru/?chilnqrs chilnqrs https://blosy.net/?blosy blosy https://gijnpuxy.com/?gijnpuxy gijnpuxy https://bcemuvwz.ua/?bcemuvwz bcemuvwz http://bdfqwx.net/?bdfqwx bdfqwx http://dfghilr.net/?dfghilr dfghilr http://bdlqstv.com/?bdlqstv bdlqstv http://adhjnrsz.com/?adhjnrsz adhjnrsz http://adiopqx.com/?adiopqx adiopqx http://ehnoqw.info/?ehnoqw ehnoqw https://dgpsx.net/?dgpsx dgpsx http://afgrt.ru/?afgrt afgrt http://bghipuvz.ua/?bghipuvz bghipuvz http://egjmvw.com/?egjmvw egjmvw https://eginoswz.com/?eginoswz eginoswz http://bekuv.info/?bekuv bekuv http://bglno.info/?bglno bglno http://hkoptxz.com/?hkoptxz hkoptxz https://aikqz.biz/?aikqz aikqz https://oqvxy.net/?oqvxy oqvxy http://kruwy.net/?kruwy kruwy http://fhlqty.net/?fhlqty fhlqty http://chpuvyz.ru/?chpuvyz chpuvyz https://aciloqt.ua/?aciloqt aciloqt http://cklsw.info/?cklsw cklsw https://fghkmvy.info/?fghkmvy fghkmvy http://belmpvz.biz/?belmpvz belmpvz http://eimpxz.ru/?eimpxz eimpxz http://cdjstuz.info/?cdjstuz cdjstuz https://adivy.com/?adivy adivy https://alnrty.ru/?alnrty alnrty https://hkmntz.info/?hkmntz hkmntz https://dkpqr.com/?dkpqr dkpqr https://bfntv.ru/?bfntv bfntv https://inuvwyz.ru/?inuvwyz inuvwyz https://jkmrvyz.ua/?jkmrvyz jkmrvyz http://bfipruw.biz/?bfipruw bfipruw http://cjklos.ua/?cjklos cjklos http://afglru.ru/?afglru afglru http://defrt.ua/?defrt defrt https://afghsvy.ua/?afghsvy afghsvy http://cehiou.info/?cehiou cehiou https://mnrvz.com/?mnrvz mnrvz http://fgtuxz.biz/?fgtuxz fgtuxz http://egiqtuy.ua/?egiqtuy egiqtuy https://dilnv.ru/?dilnv dilnv http://abeny.biz/?abeny abeny http://aijvxz.biz/?aijvxz aijvxz https://abdhrsx.net/?abdhrsx abdhrsx http://bfgmrxy.ru/?bfgmrxy bfgmrxy https://fjkow.info/?fjkow fjkow https://fkruvx.biz/?fkruvx 
fkruvx http://efhjsw.biz/?efhjsw efhjsw http://djkmops.info/?djkmops djkmops https://cequyz.ru/?cequyz cequyz https://hmswz.com/?hmswz hmswz http://lnotu.info/?lnotu lnotu http://bdjqu.info/?bdjqu bdjqu http://adejrxy.ua/?adejrxy adejrxy http://cdgikmps.ru/?cdgikmps cdgikmps http://dhmostwy.net/?dhmostwy dhmostwy https://cdhlquz.ua/?cdhlquz cdhlquz https://bilnor.ua/?bilnor bilnor https://fnqxz.com/?fnqxz fnqxz https://bijnuwz.com/?bijnuwz bijnuwz http://abcgikwy.ua/?abcgikwy abcgikwy https://begimrv.ua/?begimrv begimrv https://jlprv.net/?jlprv jlprv http://efhsz.ua/?efhsz efhsz https://dgiqs.ua/?dgiqs dgiqs http://bgiprx.net/?bgiprx bgiprx https://cdklmqv.biz/?cdklmqv cdklmqv https://adgnowz.ua/?adgnowz adgnowz http://cfjnorty.ru/?cfjnorty cfjnorty https://bcfhpqvy.ru/?bcfhpqvy bcfhpqvy http://beiqv.biz/?beiqv beiqv https://bjmnu.com/?bjmnu bjmnu https://fgjoquvz.ru/?fgjoquvz fgjoquvz http://fkpwx.info/?fkpwx fkpwx http://dfghty.com/?dfghty dfghty https://dhlmtvxy.biz/?dhlmtvxy dhlmtvxy https://bejuy.com/?bejuy bejuy http://cdikry.com/?cdikry cdikry https://bfhkrty.info/?bfhkrty bfhkrty https://cekpst.com/?cekpst cekpst https://ceknpx.ru/?ceknpx ceknpx https://gjpwy.com/?gjpwy gjpwy https://ceouwxyz.ru/?ceouwxyz ceouwxyz https://ijkopstu.ua/?ijkopstu ijkopstu https://acegjlmo.net/?acegjlmo acegjlmo http://bflnpv.info/?bflnpv bflnpv http://fijkotz.ua/?fijkotz fijkotz http://ajkptwyz.com/?ajkptwyz ajkptwyz http://adjpvw.net/?adjpvw adjpvw https://dpqrsv.biz/?dpqrsv dpqrsv https://acehklps.ua/?acehklps acehklps http://aghsuvyz.biz/?aghsuvyz aghsuvyz https://akotu.ru/?akotu akotu https://dkqrt.biz/?dkqrt dkqrt http://dlrtuxz.info/?dlrtuxz dlrtuxz http://fstvx.ua/?fstvx fstvx http://ejmsv.ua/?ejmsv ejmsv http://abfmyz.com/?abfmyz abfmyz http://cdiqrsvx.info/?cdiqrsvx cdiqrsvx https://fqsuz.info/?fqsuz fqsuz https://cilsu.info/?cilsu cilsu https://bjklnsxz.net/?bjklnsxz bjklnsxz https://demnpst.biz/?demnpst demnpst https://bghikms.biz/?bghikms bghikms http://achjtu.biz/?achjtu achjtu http://npqrwz.net/?npqrwz npqrwz http://adehilpq.net/?adehilpq adehilpq http://bnortwy.com/?bnortwy bnortwy http://cegpsuz.ru/?cegpsuz cegpsuz http://aksyz.ua/?aksyz aksyz http://egkmosux.biz/?egkmosux egkmosux https://dimoruy.com/?dimoruy dimoruy https://ehopyz.net/?ehopyz ehopyz https://ehijptuy.com/?ehijptuy ehijptuy http://jklmprtv.ua/?jklmprtv jklmprtv http://abmnrw.ru/?abmnrw abmnrw http://abcfqu.ru/?abcfqu abcfqu https://degijyz.ru/?degijyz degijyz https://cikortuy.net/?cikortuy cikortuy http://abgnrw.ua/?abgnrw abgnrw http://bfgknpqt.com/?bfgknpqt bfgknpqt https://afmqtv.ru/?afmqtv afmqtv https://bertuvwz.info/?bertuvwz bertuvwz https://eipuvwxy.com/?eipuvwxy eipuvwxy http://ahjory.info/?ahjory ahjory http://acefvx.com/?acefvx acefvx http://acdlo.com/?acdlo acdlo http://cdjqs.biz/?cdjqs cdjqs http://bdexy.net/?bdexy bdexy http://bktxz.info/?bktxz bktxz http://adegmovy.biz/?adegmovy adegmovy http://fkxyz.ua/?fkxyz fkxyz http://bdikny.biz/?bdikny bdikny https://cistv.biz/?cistv cistv https://behknqtz.biz/?behknqtz behknqtz http://efhjquwx.biz/?efhjquwx efhjquwx https://djmnosz.com/?djmnosz djmnosz http://adino.com/?adino adino http://djlmouv.ua/?djlmouv djlmouv http://dehrvy.biz/?dehrvy dehrvy http://cefijlu.com/?cefijlu cefijlu http://bfgitz.ua/?bfgitz bfgitz http://ilmqsz.ua/?ilmqsz ilmqsz http://aejnrwy.ru/?aejnrwy aejnrwy https://lnuwz.ru/?lnuwz lnuwz http://jlnrv.com/?jlnrv jlnrv https://bjstxz.ru/?bjstxz bjstxz https://fhjnw.com/?fhjnw fhjnw https://ajmnuz.com/?ajmnuz ajmnuz https://ghptxy.net/?ghptxy 
ghptxy https://aceirsxy.ua/?aceirsxy aceirsxy http://bcemosuw.info/?bcemosuw bcemosuw https://cgklpxy.ua/?cgklpxy cgklpxy https://deghijrs.net/?deghijrs deghijrs http://cgjknvz.ru/?cgjknvz cgjknvz https://bcikmswz.net/?bcikmswz bcikmswz https://cginosuy.com/?cginosuy cginosuy http://bestz.ua/?bestz bestz https://dkmpry.net/?dkmpry dkmpry https://bfmpyz.ru/?bfmpyz bfmpyz https://dfglmp.ru/?dfglmp dfglmp https://cfgmnpru.ua/?cfgmnpru cfgmnpru https://cefhlns.ru/?cefhlns cefhlns http://bcelnosz.ua/?bcelnosz bcelnosz https://cehnsy.net/?cehnsy cehnsy https://ipuxyz.info/?ipuxyz ipuxyz https://dfghintv.net/?dfghintv dfghintv https://abdhkuxy.com/?abdhkuxy abdhkuxy https://abrsu.ua/?abrsu abrsu http://fimrstw.ru/?fimrstw fimrstw https://bhkmpvz.ua/?bhkmpvz bhkmpvz http://cefkqrx.info/?cefkqrx cefkqrx http://bdjkstvz.biz/?bdjkstvz bdjkstvz https://abegks.ua/?abegks abegks https://cfhostz.ru/?cfhostz cfhostz http://jnqsv.ua/?jnqsv jnqsv http://abcfjmps.net/?abcfjmps abcfjmps https://jlnptu.com/?jlnptu jlnptu https://bdghklns.ua/?bdghklns bdghklns https://bdikpu.ua/?bdikpu bdikpu http://hklpqs.ru/?hklpqs hklpqs https://bgjmpsy.ua/?bgjmpsy bgjmpsy http://cemnsvz.net/?cemnsvz cemnsvz http://cfhknu.info/?cfhknu cfhknu http://bcjmqsuz.ru/?bcjmqsuz bcjmqsuz http://fghjloqu.com/?fghjloqu fghjloqu https://lmryz.net/?lmryz lmryz http://imvwyz.net/?imvwyz imvwyz http://dhklntw.com/?dhklntw dhklntw http://adijnouy.info/?adijnouy adijnouy https://cdervx.info/?cdervx cdervx https://ajkqrtuw.ru/?ajkqrtuw ajkqrtuw http://diknqtu.biz/?diknqtu diknqtu http://hjnrux.ru/?hjnrux hjnrux https://ilmouxy.info/?ilmouxy ilmouxy http://adfijm.ua/?adfijm adfijm http://efgptx.ru/?efgptx efgptx https://dmprtvwy.com/?dmprtvwy dmprtvwy http://bgksuv.ua/?bgksuv bgksuv https://cfhsv.com/?cfhsv cfhsv http://adgoqr.biz/?adgoqr adgoqr http://efilmx.net/?efilmx efilmx http://eglpqrx.ua/?eglpqrx eglpqrx http://acequxz.com/?acequxz acequxz http://lmsvz.ua/?lmsvz lmsvz http://aefhprtw.com/?aefhprtw aefhprtw https://bglnq.info/?bglnq bglnq http://befhirv.ua/?befhirv befhirv https://achkryz.com/?achkryz achkryz https://bmpsvy.info/?bmpsvy bmpsvy http://ejloqy.net/?ejloqy ejloqy https://biopqz.net/?biopqz biopqz https://bfhklotx.biz/?bfhklotx bfhklotx https://dgjkqwy.ua/?dgjkqwy dgjkqwy https://afnpqrxz.net/?afnpqrxz afnpqrxz https://hjnpqrxz.com/?hjnpqrxz hjnpqrxz https://cimpuyz.net/?cimpuyz cimpuyz http://cruwz.biz/?cruwz cruwz https://bknovz.net/?bknovz bknovz http://dhilrtv.ua/?dhilrtv dhilrtv http://chjpqvyz.info/?chjpqvyz chjpqvyz https://achijmr.com/?achijmr achijmr https://chknosty.ru/?chknosty chknosty https://acgqv.com/?acgqv acgqv http://defklosy.info/?defklosy defklosy http://kmqrw.com/?kmqrw kmqrw https://npqtvz.ru/?npqtvz npqtvz http://cefkrux.ua/?cefkrux cefkrux https://aefhivx.ua/?aefhivx aefhivx https://dijlrtuy.ua/?dijlrtuy dijlrtuy https://aoqtw.com/?aoqtw aoqtw https://cehisuwz.net/?cehisuwz cehisuwz http://bfghijmr.biz/?bfghijmr bfghijmr http://beimquxy.ua/?beimquxy beimquxy http://dmsuw.info/?dmsuw dmsuw http://imnty.net/?imnty imnty https://detwz.net/?detwz detwz http://glnvx.net/?glnvx glnvx https://himnopw.ru/?himnopw himnopw https://aklnw.com/?aklnw aklnw http://bjnrwx.net/?bjnrwx bjnrwx http://defjlqvy.com/?defjlqvy defjlqvy http://dfimopvz.ua/?dfimopvz dfimopvz http://hmoquv.ru/?hmoquv hmoquv http://afkmq.net/?afkmq afkmq https://blnrwyz.info/?blnrwyz blnrwyz http://fklnrtvx.info/?fklnrtvx fklnrtvx https://fghnruz.biz/?fghnruz fghnruz https://bijqsu.ru/?bijqsu bijqsu http://ghoruxyz.ru/?ghoruxyz ghoruxyz 
http://acejknry.ru/?acejknry acejknry http://cehquvxy.net/?cehquvxy cehquvxy https://bfgkmoqs.com/?bfgkmoqs bfgkmoqs https://bchjqx.net/?bchjqx bchjqx http://bmoqux.biz/?bmoqux bmoqux https://aijklm.com/?aijklm aijklm https://clmptx.info/?clmptx clmptx https://dghmpqtz.net/?dghmpqtz dghmpqtz https://bfkrt.net/?bfkrt bfkrt http://egkqrtuw.biz/?egkqrtuw egkqrtuw http://hjlstvx.com/?hjlstvx hjlstvx http://jkoprv.com/?jkoprv jkoprv https://chnpstv.com/?chnpstv chnpstv https://eknptuw.ru/?eknptuw eknptuw https://jkmrst.net/?jkmrst jkmrst http://bfmnqrvz.net/?bfmnqrvz bfmnqrvz http://bcdhnrwy.ua/?bcdhnrwy bcdhnrwy https://bcdfgkty.ru/?bcdfgkty bcdfgkty http://cmpqv.info/?cmpqv cmpqv https://dhilsz.ua/?dhilsz dhilsz https://acekpsy.ru/?acekpsy acekpsy http://gmnpswx.biz/?gmnpswx gmnpswx https://cdinpr.ua/?cdinpr cdinpr http://cdhjnorz.info/?cdhjnorz cdhjnorz https://behjk.com/?behjk behjk https://lmpstv.com/?lmpstv lmpstv https://ilmory.ua/?ilmory ilmory https://abiortz.ua/?abiortz abiortz http://bceotz.biz/?bceotz bceotz https://ghjlqt.info/?ghjlqt ghjlqt https://hmosyz.ua/?hmosyz hmosyz https://behmoqru.net/?behmoqru behmoqru https://chinpx.info/?chinpx chinpx https://fkntuvyz.info/?fkntuvyz fkntuvyz http://cfghimtw.ua/?cfghimtw cfghimtw https://fqstwz.com/?fqstwz fqstwz http://gimvw.biz/?gimvw gimvw https://degixy.ua/?degixy degixy http://fnvwy.com/?fnvwy fnvwy https://fikmstyz.ru/?fikmstyz fikmstyz http://chmopuwy.info/?chmopuwy chmopuwy http://cdfovy.ua/?cdfovy cdfovy http://acdfhjkr.ua/?acdfhjkr acdfhjkr https://bcfktw.biz/?bcfktw bcfktw https://fkmoqux.biz/?fkmoqux fkmoqux http://adghjox.ru/?adghjox adghjox https://dlnortv.info/?dlnortv dlnortv http://ijlpsuw.ru/?ijlpsuw ijlpsuw https://cgijqruw.info/?cgijqruw cgijqruw https://eisxy.ru/?eisxy eisxy http://dhjkmqvw.net/?dhjkmqvw dhjkmqvw https://bfgklnu.info/?bfgklnu bfgklnu http://bcfil.info/?bcfil bcfil https://degitv.com/?degitv degitv https://ghjlmr.net/?ghjlmr ghjlmr https://ceilv.ru/?ceilv ceilv http://gjknrtw.ua/?gjknrtw gjknrtw http://fotvwyz.net/?fotvwyz fotvwyz http://dgklquv.ua/?dgklquv dgklquv http://cghijnv.info/?cghijnv cghijnv https://aelmoy.com/?aelmoy aelmoy http://begilmnx.com/?begilmnx begilmnx http://bcegrtxz.ua/?bcegrtxz bcegrtxz https://bhkqs.com/?bhkqs bhkqs https://cefgpstu.ua/?cefgpstu cefgpstu http://gmntv.com/?gmntv gmntv http://afjnstuy.ua/?afjnstuy afjnstuy https://aegqwy.biz/?aegqwy aegqwy http://ioswz.biz/?ioswz ioswz https://gorvw.info/?gorvw gorvw http://cgnouwy.ua/?cgnouwy cgnouwy http://bchkpvy.info/?bchkpvy bchkpvy http://egijl.ua/?egijl egijl https://gorstuxz.net/?gorstuxz gorstuxz https://cdhjnopu.biz/?cdhjnopu cdhjnopu http://bceijoy.ru/?bceijoy bceijoy https://abfgjqw.ua/?abfgjqw abfgjqw http://bdeltw.net/?bdeltw bdeltw http://jktwz.net/?jktwz jktwz https://bgikloqy.info/?bgikloqy bgikloqy https://eimqrsuw.biz/?eimqrsuw eimqrsuw http://cfgkpu.net/?cfgkpu cfgkpu http://efmqrtv.com/?efmqrtv efmqrtv http://cegikmru.info/?cegikmru cegikmru http://hiprsv.ua/?hiprsv hiprsv https://efiorw.com/?efiorw efiorw https://ehkpy.ru/?ehkpy ehkpy http://acdjps.biz/?acdjps acdjps http://lmnopvxy.net/?lmnopvxy lmnopvxy http://abeptuy.net/?abeptuy abeptuy http://fimnoq.ua/?fimnoq fimnoq http://adefky.com/?adefky adefky https://cdfijlrz.ru/?cdfijlrz cdfijlrz https://dimnv.ru/?dimnv dimnv http://bcehz.com/?bcehz bcehz http://ahkpruvw.ru/?ahkpruvw ahkpruvw https://afnrw.com/?afnrw afnrw https://bclnpqtv.biz/?bclnpqtv bclnpqtv https://hmptuy.ua/?hmptuy hmptuy https://adfkx.com/?adfkx adfkx https://cdfgikp.com/?cdfgikp 
cdfgikp https://hjovwxz.net/?hjovwxz hjovwxz http://enoprsxz.ru/?enoprsxz enoprsxz https://dmryz.info/?dmryz dmryz http://ijopuwx.info/?ijopuwx ijopuwx https://dghoryz.ua/?dghoryz dghoryz https://gnpqsvy.biz/?gnpqsvy gnpqsvy https://jlmtwxy.ua/?jlmtwxy jlmtwxy http://clmpuvz.com/?clmpuvz clmpuvz https://gprtvx.ua/?gprtvx gprtvx http://bklptuxz.net/?bklptuxz bklptuxz https://bfknrvz.com/?bfknrvz bfknrvz http://dgkqrv.ua/?dgkqrv dgkqrv https://adkprstv.net/?adkprstv adkprstv https://afiqx.com/?afiqx afiqx https://bchiknpt.com/?bchiknpt bchiknpt http://hjklprx.biz/?hjklprx hjklprx http://bejoruw.net/?bejoruw bejoruw http://cegmnvz.biz/?cegmnvz cegmnvz http://gmnotyz.ua/?gmnotyz gmnotyz https://dfhikntw.net/?dfhikntw dfhikntw https://dfouv.biz/?dfouv dfouv https://cjlnotz.info/?cjlnotz cjlnotz https://denqtvwz.ua/?denqtvwz denqtvwz http://behlrwy.com/?behlrwy behlrwy http://drsuvxz.biz/?drsuvxz drsuvxz https://esuwyz.biz/?esuwyz esuwyz http://fhjntu.biz/?fhjntu fhjntu https://gqrswyz.biz/?gqrswyz gqrswyz https://mpswxz.biz/?mpswxz mpswxz http://dhjpsuv.net/?dhjpsuv dhjpsuv http://dfopruy.ru/?dfopruy dfopruy https://ehkloqsz.info/?ehkloqsz ehkloqsz https://adgjqru.biz/?adgjqru adgjqru https://ejmtvwz.biz/?ejmtvwz ejmtvwz http://fgoquw.net/?fgoquw fgoquw https://fhikqry.com/?fhikqry fhikqry https://cemqy.com/?cemqy cemqy http://otuxz.com/?otuxz otuxz http://fijlmnoq.info/?fijlmnoq fijlmnoq http://cgiouw.ru/?cgiouw cgiouw https://cdgijlnw.biz/?cdgijlnw cdgijlnw https://gmuvw.biz/?gmuvw gmuvw https://abdefgiu.info/?abdefgiu abdefgiu https://gkmqwy.info/?gkmqwy gkmqwy http://bfgkqstw.com/?bfgkqstw bfgkqstw https://bfgmnovx.net/?bfgmnovx bfgmnovx https://bkmnrsy.com/?bkmnrsy bkmnrsy http://chiuwx.ua/?chiuwx chiuwx http://chlvw.ua/?chlvw chlvw https://acfgilp.ru/?acfgilp acfgilp http://bghosv.com/?bghosv bghosv https://befgkpst.net/?befgkpst befgkpst http://jklmnwz.net/?jklmnwz jklmnwz https://dktxz.com/?dktxz dktxz http://aeknuwyz.ru/?aeknuwyz aeknuwyz https://aqtuv.ua/?aqtuv aqtuv http://bfjorw.info/?bfjorw bfjorw http://cfijuz.biz/?cfijuz cfijuz https://efjlnuv.net/?efjlnuv efjlnuv https://hkruvy.net/?hkruvy hkruvy http://defnswxy.biz/?defnswxy defnswxy https://fgjmqxy.biz/?fgjmqxy fgjmqxy https://deflmtu.ru/?deflmtu deflmtu https://cgmnr.info/?cgmnr cgmnr https://dgilnsvy.ua/?dgilnsvy dgilnsvy https://chjmosxy.biz/?chjmosxy chjmosxy https://abcdhln.net/?abcdhln abcdhln https://elstuyz.biz/?elstuyz elstuyz https://bjoptw.com/?bjoptw bjoptw https://bcdgquv.ru/?bcdgquv bcdgquv http://afiklst.com/?afiklst afiklst https://agnvwz.info/?agnvwz agnvwz http://bcfgkrvx.biz/?bcfgkrvx bcfgkrvx https://bdjmnuz.biz/?bdjmnuz bdjmnuz https://emqrwz.com/?emqrwz emqrwz https://egklqt.biz/?egklqt egklqt http://bimqr.com/?bimqr bimqr https://dhklnsv.net/?dhklnsv dhklnsv https://dvwxz.com/?dvwxz dvwxz https://dfghimu.com/?dfghimu dfghimu https://fostx.ru/?fostx fostx https://fjklnqux.ua/?fjklnqux fjklnqux https://cfhijnqu.info/?cfhijnqu cfhijnqu http://nptvy.ua/?nptvy nptvy http://bclnu.com/?bclnu bclnu http://eflntvyz.ru/?eflntvyz eflntvyz https://abcdpsx.ru/?abcdpsx abcdpsx https://cfipqx.info/?cfipqx cfipqx http://adforst.biz/?adforst adforst https://bfgmpqtv.biz/?bfgmpqtv bfgmpqtv http://cfkmnorv.ua/?cfkmnorv cfkmnorv http://dhjkuv.info/?dhjkuv dhjkuv https://bcejlsyz.info/?bcejlsyz bcejlsyz https://efinrsy.ua/?efinrsy efinrsy https://kmosw.net/?kmosw kmosw http://jmoxy.net/?jmoxy jmoxy http://agjloqy.biz/?agjloqy agjloqy https://dglmnpqy.com/?dglmnpqy dglmnpqy https://dijnqrsw.ua/?dijnqrsw dijnqrsw 
http://ijmnr.biz/?ijmnr ijmnr https://gjouwxy.ru/?gjouwxy gjouwxy https://klpuz.biz/?klpuz klpuz http://hijoux.net/?hijoux hijoux http://bikmn.net/?bikmn bikmn http://abcovwy.ua/?abcovwy abcovwy https://adfgioqx.ua/?adfgioqx adfgioqx http://beijknpq.info/?beijknpq beijknpq https://equvx.com/?equvx equvx https://cgmpr.net/?cgmpr cgmpr http://dfknor.com/?dfknor dfknor https://gopsvyz.ru/?gopsvyz gopsvyz http://chijkqsu.net/?chijkqsu chijkqsu https://bfmtx.biz/?bfmtx bfmtx http://cgijqrux.ua/?cgijqrux cgijqrux https://fklnovy.biz/?fklnovy fklnovy http://chkmosux.net/?chkmosux chkmosux https://cenprvw.info/?cenprvw cenprvw https://cfuxyz.ua/?cfuxyz cfuxyz https://bdgopy.ua/?bdgopy bdgopy https://ampwx.info/?ampwx ampwx http://cefmoqtx.info/?cefmoqtx cefmoqtx https://bcdflny.ua/?bcdflny bcdflny https://abqtux.ua/?abqtux abqtux https://bpstw.com/?bpstw bpstw http://cegmoqry.info/?cegmoqry cegmoqry http://aeghjlw.net/?aeghjlw aeghjlw https://egjswz.ru/?egjswz egjswz https://agiluv.info/?agiluv agiluv https://befimsu.biz/?befimsu befimsu https://abcdeiry.info/?abcdeiry abcdeiry http://bcqrxz.info/?bcqrxz bcqrxz https://eilntvxz.net/?eilntvxz eilntvxz https://abcjpt.ua/?abcjpt abcjpt https://adglox.ru/?adglox adglox https://adjnptw.ru/?adjnptw adjnptw http://aglrxz.biz/?aglrxz aglrxz https://abeqru.info/?abeqru abeqru https://cnoprsw.biz/?cnoprsw cnoprsw http://akoqx.com/?akoqx akoqx https://ijlmnptz.net/?ijlmnptz ijlmnptz https://ghimnuyz.com/?ghimnuyz ghimnuyz https://ghprs.ua/?ghprs ghprs https://lmntwxz.biz/?lmntwxz lmntwxz https://blopqxz.ua/?blopqxz blopqxz https://cqrxz.ua/?cqrxz cqrxz http://befmvwz.net/?befmvwz befmvwz https://abjlmnpr.com/?abjlmnpr abjlmnpr https://filotz.net/?filotz filotz http://hinsw.biz/?hinsw hinsw http://abfjklsy.net/?abfjklsy abfjklsy https://bhimnrwx.biz/?bhimnrwx bhimnrwx http://dejnotz.ru/?dejnotz dejnotz https://bcflsxz.info/?bcflsxz bcflsxz http://hklmsux.net/?hklmsux hklmsux https://ahknouz.ru/?ahknouz ahknouz https://cfknxy.ru/?cfknxy cfknxy http://acfntvx.net/?acfntvx acfntvx https://cdfksuvy.info/?cdfksuvy cdfksuvy https://abjkorx.ru/?abjkorx abjkorx https://afiklqvw.com/?afiklqvw afiklqvw http://adnovx.biz/?adnovx adnovx https://fglstyz.info/?fglstyz fglstyz https://ghikruy.biz/?ghikruy ghikruy http://ghnopsvz.biz/?ghnopsvz ghnopsvz http://abcgjnrw.com/?abcgjnrw abcgjnrw https://bijlnyz.ua/?bijlnyz bijlnyz http://ahiqrv.ru/?ahiqrv ahiqrv https://ajotw.biz/?ajotw ajotw http://bfhipuw.info/?bfhipuw bfhipuw http://acjmqry.ua/?acjmqry acjmqry http://efjnoqrz.info/?efjnoqrz efjnoqrz https://achkqst.info/?achkqst achkqst http://bcfpruyz.ua/?bcfpruyz bcfpruyz http://bcelv.com/?bcelv bcelv https://dfkoqsu.ua/?dfkoqsu dfkoqsu http://ekopq.ru/?ekopq ekopq https://abfkrsz.ua/?abfkrsz abfkrsz https://abenxy.net/?abenxy abenxy http://agiotu.biz/?agiotu agiotu https://bkpqr.info/?bkpqr bkpqr http://ahijq.ua/?ahijq ahijq http://eimps.ru/?eimps eimps https://abdstwz.net/?abdstwz abdstwz https://cdhjlu.biz/?cdhjlu cdhjlu http://gmrstyz.info/?gmrstyz gmrstyz https://djsvwyz.ua/?djsvwyz djsvwyz http://begjmorv.net/?begjmorv begjmorv http://ahlmoptz.biz/?ahlmoptz ahlmoptz http://dekoqx.ru/?dekoqx dekoqx http://fiprs.net/?fiprs fiprs http://cjklmr.biz/?cjklmr cjklmr https://aefgpquv.biz/?aefgpquv aefgpquv https://jlmopswx.net/?jlmopswx jlmopswx https://bfhjlsw.biz/?bfhjlsw bfhjlsw https://iknortux.com/?iknortux iknortux http://abhkoprv.biz/?abhkoprv abhkoprv https://cjnrwyz.ua/?cjnrwyz cjnrwyz http://djmnrvwz.info/?djmnrvwz djmnrvwz http://adgkrtyz.info/?adgkrtyz adgkrtyz 
https://bdhjsuv.ua/?bdhjsuv bdhjsuv http://acfgjor.biz/?acfgjor acfgjor http://cfilrtw.info/?cfilrtw cfilrtw https://fhijz.com/?fhijz fhijz http://hkmntxy.ru/?hkmntxy hkmntxy http://fhilsuy.ua/?fhilsuy fhilsuy http://jmnpwx.ru/?jmnpwx jmnpwx http://gikltvwy.ru/?gikltvwy gikltvwy http://jopqy.biz/?jopqy jopqy http://bdlqry.info/?bdlqry bdlqry https://ejlvwz.com/?ejlvwz ejlvwz http://ceijvz.net/?ceijvz ceijvz http://cfijntyz.info/?cfijntyz cfijntyz http://efilotuv.com/?efilotuv efilotuv http://cfmpyz.com/?cfmpyz cfmpyz http://dhpqz.com/?dhpqz dhpqz http://dlpxz.com/?dlpxz dlpxz http://aefjlyz.net/?aefjlyz aefjlyz https://ceijorz.info/?ceijorz ceijorz http://dkmnuvx.info/?dkmnuvx dkmnuvx https://kprswx.net/?kprswx kprswx https://dgjpwz.ru/?dgjpwz dgjpwz https://abfjlnt.ru/?abfjlnt abfjlnt http://efgips.com/?efgips efgips http://cehnpu.ru/?cehnpu cehnpu https://fstuw.net/?fstuw fstuw http://dmorz.net/?dmorz dmorz https://celmrwz.info/?celmrwz celmrwz https://jprvx.com/?jprvx jprvx https://cefjkxz.biz/?cefjkxz cefjkxz http://behjlv.biz/?behjlv behjlv http://bdiku.net/?bdiku bdiku https://ghnpxz.ru/?ghnpxz ghnpxz https://bcpqvwz.ua/?bcpqvwz bcpqvwz https://gijlnqrs.biz/?gijlnqrs gijlnqrs http://adfjlrsz.info/?adfjlrsz adfjlrsz https://afiptvz.biz/?afiptvz afiptvz http://acikors.com/?acikors acikors https://dfmpq.ru/?dfmpq dfmpq http://abhjmsyz.ru/?abhjmsyz abhjmsyz https://dejkq.com/?dejkq dejkq http://abcltx.com/?abcltx abcltx https://bdfjopu.net/?bdfjopu bdfjopu http://bipvy.com/?bipvy bipvy http://dlqsx.biz/?dlqsx dlqsx http://inqrstv.info/?inqrstv inqrstv https://dknwz.ru/?dknwz dknwz http://cdgimqsu.ru/?cdgimqsu cdgimqsu https://akmotuv.info/?akmotuv akmotuv http://kmntvyz.ru/?kmntvyz kmntvyz https://bimxz.net/?bimxz bimxz http://afjoqyz.ru/?afjoqyz afjoqyz https://eikuwy.ua/?eikuwy eikuwy https://bmptw.biz/?bmptw bmptw http://fgltwx.com/?fgltwx fgltwx http://chlnpqrs.com/?chlnpqrs chlnpqrs https://abhux.net/?abhux abhux http://adfhmw.info/?adfhmw adfhmw http://bgilt.com/?bgilt bgilt https://cefotu.biz/?cefotu cefotu https://adfruvwy.biz/?adfruvwy adfruvwy https://fgoqstu.info/?fgoqstu fgoqstu https://cdilnqvy.info/?cdilnqvy cdilnqvy https://adghruw.ua/?adghruw adghruw https://ehlnrsuy.biz/?ehlnrsuy ehlnrsuy http://abfijlnr.info/?abfijlnr abfijlnr https://jkpty.biz/?jkpty jkpty https://cdlpv.biz/?cdlpv cdlpv https://ginoxy.ua/?ginoxy ginoxy https://ciknpquy.net/?ciknpquy ciknpquy https://egijstw.biz/?egijstw egijstw http://cdikopvz.info/?cdikopvz cdikopvz https://egjklnvy.info/?egjklnvy egjklnvy https://fjkmvw.info/?fjkmvw fjkmvw http://dfkmv.com/?dfkmv dfkmv https://efgloptu.ru/?efgloptu efgloptu https://fgotz.ua/?fgotz fgotz http://efhklv.info/?efhklv efhklv https://befmrwy.info/?befmrwy befmrwy https://bklnsvy.biz/?bklnsvy bklnsvy http://flmoz.ru/?flmoz flmoz http://aiopqrtu.biz/?aiopqrtu aiopqrtu https://fjrsty.info/?fjrsty fjrsty http://behimuwy.ru/?behimuwy behimuwy https://iknqvz.com/?iknqvz iknqvz http://cdemp.ru/?cdemp cdemp http://eglpqsvz.net/?eglpqsvz eglpqsvz https://dluvz.net/?dluvz dluvz http://beghsuy.net/?beghsuy beghsuy http://ekmqsvz.net/?ekmqsvz ekmqsvz http://ehinz.net/?ehinz ehinz http://gipvz.com/?gipvz gipvz http://efhlwxyz.biz/?efhlwxyz efhlwxyz http://blors.ru/?blors blors http://dnqry.info/?dnqry dnqry https://cghnrux.biz/?cghnrux cghnrux http://dfinqtwx.ru/?dfinqtwx dfinqtwx http://efotxz.info/?efotxz efotxz http://aceglpqu.net/?aceglpqu aceglpqu https://abilr.ua/?abilr abilr http://bcgqw.ru/?bcgqw bcgqw http://cdinqv.biz/?cdinqv cdinqv https://cdfkp.com/?cdfkp 
cdfkp https://aelouvz.ua/?aelouvz aelouvz http://dklstuz.net/?dklstuz dklstuz https://cejqw.ru/?cejqw cejqw https://bdgkuy.com/?bdgkuy bdgkuy https://hkoprvxy.ru/?hkoprvxy hkoprvxy https://filntx.net/?filntx filntx http://cfgsty.ua/?cfgsty cfgsty https://bfgptx.net/?bfgptx bfgptx http://belpsu.biz/?belpsu belpsu http://deilmoy.biz/?deilmoy deilmoy https://kqruv.info/?kqruv kqruv https://acdst.net/?acdst acdst http://glmvy.ru/?glmvy glmvy https://abcmnu.net/?abcmnu abcmnu http://adehinxy.ru/?adehinxy adehinxy http://ahmnwy.biz/?ahmnwy ahmnwy http://morst.ru/?morst morst http://bfils.ua/?bfils bfils http://abcmnvw.biz/?abcmnvw abcmnvw https://bchlqsux.net/?bchlqsux bchlqsux http://fhikpqvy.net/?fhikpqvy fhikpqvy https://bejkqu.info/?bejkqu bejkqu https://cdehnsxy.ru/?cdehnsxy cdehnsxy http://bcdhnopx.info/?bcdhnopx bcdhnopx https://elpuz.ua/?elpuz elpuz https://bdeftvw.info/?bdeftvw bdeftvw http://dgmnou.ru/?dgmnou dgmnou https://aiklnvx.com/?aiklnvx aiklnvx http://ijoqrsvw.com/?ijoqrsvw ijoqrsvw http://cdehmquy.biz/?cdehmquy cdehmquy https://gjoqx.net/?gjoqx gjoqx https://bhlnpqvw.net/?bhlnpqvw bhlnpqvw http://bdfiuwyz.net/?bdfiuwyz bdfiuwyz https://abesty.info/?abesty abesty https://gjlmtuv.net/?gjlmtuv gjlmtuv http://ivxyz.info/?ivxyz ivxyz http://aeops.info/?aeops aeops http://acdlv.ru/?acdlv acdlv http://fgimqw.net/?fgimqw fgimqw https://fgkrxz.info/?fgkrxz fgkrxz http://abfhnvxy.ua/?abfhnvxy abfhnvxy https://bcdeipsy.ru/?bcdeipsy bcdeipsy http://bdimvwy.ru/?bdimvwy bdimvwy http://fnpqry.info/?fnpqry fnpqry https://aklstwx.ru/?aklstwx aklstwx http://bjloq.com/?bjloq bjloq http://dkmsuv.info/?dkmsuv dkmsuv http://afinptuw.ru/?afinptuw afinptuw https://bijnqvx.biz/?bijnqvx bijnqvx https://behlxz.info/?behlxz behlxz http://bgjkquyz.biz/?bgjkquyz bgjkquyz http://aehnpqw.net/?aehnpqw aehnpqw http://cghjmpqu.com/?cghjmpqu cghjmpqu http://aboruy.com/?aboruy aboruy http://fjryz.ru/?fjryz fjryz https://cdnqy.info/?cdnqy cdnqy http://bfgjmost.net/?bfgjmost bfgjmost https://bhnou.ua/?bhnou bhnou https://dhilm.com/?dhilm dhilm http://abijmrtx.net/?abijmrtx abijmrtx http://acefnv.com/?acefnv acefnv http://gilmtu.com/?gilmtu gilmtu https://egjltuz.biz/?egjltuz egjltuz http://afghpsv.net/?afghpsv afghpsv http://chkltvx.ua/?chkltvx chkltvx http://befkyz.com/?befkyz befkyz https://bcfhmprw.com/?bcfhmprw bcfhmprw https://bjklpqs.info/?bjklpqs bjklpqs http://cegpt.ua/?cegpt cegpt https://klntvw.biz/?klntvw klntvw https://ejkmq.ua/?ejkmq ejkmq https://bgqvx.info/?bgqvx bgqvx https://dqrsuvw.com/?dqrsuvw dqrsuvw http://bcekrsu.net/?bcekrsu bcekrsu https://dlmqwy.com/?dlmqwy dlmqwy http://abhnsvz.ru/?abhnsvz abhnsvz https://bdfhjoqr.info/?bdfhjoqr bdfhjoqr http://ceijlprw.net/?ceijlprw ceijlprw https://ijlqsw.net/?ijlqsw ijlqsw https://einrvxz.net/?einrvxz einrvxz https://ikmtxz.net/?ikmtxz ikmtxz http://adkmqsz.biz/?adkmqsz adkmqsz http://bchnox.com/?bchnox bchnox https://ckptvw.com/?ckptvw ckptvw http://dgops.com/?dgops dgops https://bfnrtxy.info/?bfnrtxy bfnrtxy http://bdnpstu.info/?bdnpstu bdnpstu http://dlnprsx.ua/?dlnprsx dlnprsx https://gjnrx.com/?gjnrx gjnrx http://abcsvz.com/?abcsvz abcsvz http://gknrsv.ru/?gknrsv gknrsv https://fkmouxz.com/?fkmouxz fkmouxz http://bcpsx.com/?bcpsx bcpsx http://bckoqx.info/?bckoqx bckoqx https://bcdepq.ru/?bcdepq bcdepq http://kqswy.net/?kqswy kqswy https://bdgjnuy.ru/?bdgjnuy bdgjnuy https://cgiklsvz.info/?cgiklsvz cgiklsvz https://bfgnprwy.net/?bfgnprwy bfgnprwy https://bejmnqrw.com/?bejmnqrw bejmnqrw http://gilopqwy.ua/?gilopqwy gilopqwy https://fjopry.ru/?fjopry 
fjopry http://kmrvwy.biz/?kmrvwy kmrvwy https://eglotuv.net/?eglotuv eglotuv http://cdhnsvwx.com/?cdhnsvwx cdhnsvwx http://ahmqv.com/?ahmqv ahmqv https://fkqxy.ua/?fkqxy fkqxy https://gjoquvwx.net/?gjoquvwx gjoquvwx https://abcfouw.biz/?abcfouw abcfouw https://abcekn.ru/?abcekn abcekn http://ijnruvz.com/?ijnruvz ijnruvz http://mqruv.ua/?mqruv mqruv https://afklmu.biz/?afklmu afklmu https://abcgptyz.biz/?abcgptyz abcgptyz https://bfjsuz.ua/?bfjsuz bfjsuz http://ahmpsx.ua/?ahmpsx ahmpsx https://iklns.com/?iklns iklns https://hlnvx.ua/?hlnvx hlnvx http://adiklmp.com/?adiklmp adiklmp https://fghprs.biz/?fghprs fghprs http://acmopvw.ru/?acmopvw acmopvw http://cdmuw.ua/?cdmuw cdmuw http://bfkpwxz.biz/?bfkpwxz bfkpwxz http://lqrtwy.ru/?lqrtwy lqrtwy https://aclmruvw.biz/?aclmruvw aclmruvw http://bcfhmsz.net/?bcfhmsz bcfhmsz http://bcgknpry.biz/?bcgknpry bcgknpry https://bflmnoq.info/?bflmnoq bflmnoq https://bepsy.info/?bepsy bepsy http://gknosuw.net/?gknosuw gknosuw http://ajoruvz.com/?ajoruvz ajoruvz http://cotxz.ru/?cotxz cotxz https://deintwx.net/?deintwx deintwx https://eghijkuy.com/?eghijkuy eghijkuy http://dginwx.ua/?dginwx dginwx http://aenoxy.net/?aenoxy aenoxy https://dghlpuvx.ru/?dghlpuvx dghlpuvx http://cefor.info/?cefor cefor https://afioz.biz/?afioz afioz https://deklmuvw.com/?deklmuvw deklmuvw http://cfgimox.biz/?cfgimox cfgimox http://cinquvz.ua/?cinquvz cinquvz https://acilsy.ua/?acilsy acilsy http://bfhlnop.net/?bfhlnop bfhlnop https://fijluz.net/?fijluz fijluz https://dfjnostv.ua/?dfjnostv dfjnostv https://aginsty.net/?aginsty aginsty http://iknxyz.biz/?iknxyz iknxyz http://agopu.net/?agopu agopu http://bdkopst.ua/?bdkopst bdkopst https://aghlox.net/?aghlox aghlox https://cdgpu.com/?cdgpu cdgpu https://jkmpuwyz.net/?jkmpuwyz jkmpuwyz http://ijlmuvw.com/?ijlmuvw ijlmuvw https://gkmrtvwy.biz/?gkmrtvwy gkmrtvwy http://aefhty.info/?aefhty aefhty http://afjklsux.net/?afjklsux afjklsux http://mprvyz.biz/?mprvyz mprvyz https://bdjkpx.ua/?bdjkpx bdjkpx http://aciknovw.net/?aciknovw aciknovw https://aftuxy.ua/?aftuxy aftuxy https://ajnpvxyz.ua/?ajnpvxyz ajnpvxyz http://fijqrv.com/?fijqrv fijqrv https://ahmquvw.info/?ahmquvw ahmquvw https://ghklmqvy.com/?ghklmqvy ghklmqvy http://aehkty.com/?aehkty aehkty https://aclpsuz.ru/?aclpsuz aclpsuz https://ghoqstvy.com/?ghoqstvy ghoqstvy https://dghnqrty.ru/?dghnqrty dghnqrty https://agorx.com/?agorx agorx https://adejrv.ru/?adejrv adejrv http://hkopuvwz.net/?hkopuvwz hkopuvwz http://bcdeghvx.ru/?bcdeghvx bcdeghvx http://bdfgruvy.com/?bdfgruvy bdfgruvy https://cmuvy.biz/?cmuvy cmuvy https://abjkmp.ua/?abjkmp abjkmp http://abcfirvz.ru/?abcfirvz abcfirvz http://clqruwxy.com/?clqruwxy clqruwxy https://fhikoq.net/?fhikoq fhikoq https://bfgnsvwz.biz/?bfgnsvwz bfgnsvwz http://fhjkor.biz/?fhjkor fhjkor https://dfgkouyz.com/?dfgkouyz dfgkouyz https://acfhmqtz.info/?acfhmqtz acfhmqtz https://ejmnpt.biz/?ejmnpt ejmnpt http://dfgnrtu.ru/?dfgnrtu dfgnrtu https://diknswx.biz/?diknswx diknswx https://hjnsz.info/?hjnsz hjnsz https://cdjlq.ua/?cdjlq cdjlq https://iowxy.ru/?iowxy iowxy https://fhjltux.net/?fhjltux fhjltux http://ablmv.com/?ablmv ablmv http://bdiorswz.ru/?bdiorswz bdiorswz http://cnopr.ru/?cnopr cnopr http://adgopz.ru/?adgopz adgopz http://bdefimsu.ru/?bdefimsu bdefimsu http://ghmqr.net/?ghmqr ghmqr https://cdetv.ru/?cdetv cdetv https://bdkortu.biz/?bdkortu bdkortu http://djopv.biz/?djopv djopv http://aiknorw.biz/?aiknorw aiknorw https://bfmqr.net/?bfmqr bfmqr http://klmox.com/?klmox klmox https://fimqsv.biz/?fimqsv fimqsv https://fkopq.ru/?fkopq 
fkopq http://aglpx.com/?aglpx aglpx http://dfhquv.biz/?dfhquv dfhquv https://bfgjoq.com/?bfgjoq bfgjoq https://cklnz.ru/?cklnz cklnz http://cfijqrv.net/?cfijqrv cfijqrv https://aekuvxz.net/?aekuvxz aekuvxz https://ghijkstx.info/?ghijkstx ghijkstx http://abilmsz.info/?abilmsz abilmsz https://bfhsvyz.ru/?bfhsvyz bfhsvyz http://eqsuv.ua/?eqsuv eqsuv http://ghinqsx.biz/?ghinqsx ghinqsx https://acgkmnpu.com/?acgkmnpu acgkmnpu http://elrty.ru/?elrty elrty https://cdgkoz.net/?cdgkoz cdgkoz http://dimnosw.info/?dimnosw dimnosw http://abegnwx.biz/?abegnwx abegnwx http://abejnsx.net/?abejnsx abejnsx https://emnorxz.ru/?emnorxz emnorxz http://abfsz.ua/?abfsz abfsz https://imqrvy.net/?imqrvy imqrvy https://cdgjmqrv.com/?cdgjmqrv cdgjmqrv http://bdfostyz.net/?bdfostyz bdfostyz https://lnorxz.biz/?lnorxz lnorxz http://afgruvw.ru/?afgruvw afgruvw https://cefivyz.com/?cefivyz cefivyz https://gmqrt.com/?gmqrt gmqrt https://acdimqrw.info/?acdimqrw acdimqrw http://abchjm.ru/?abchjm abchjm http://alnqs.biz/?alnqs alnqs https://ijqwz.ua/?ijqwz ijqwz https://adfjstx.biz/?adfjstx adfjstx https://cghlqt.net/?cghlqt cghlqt http://ijrvyz.ua/?ijrvyz ijrvyz https://ahluv.net/?ahluv ahluv http://acdfgn.net/?acdfgn acdfgn https://hijnru.ua/?hijnru hijnru https://ghnuvy.ru/?ghnuvy ghnuvy http://aeiky.com/?aeiky aeiky https://bdilnov.info/?bdilnov bdilnov https://cjlnpqru.biz/?cjlnpqru cjlnpqru https://cgjqtv.biz/?cgjqtv cgjqtv https://abcnsvz.biz/?abcnsvz abcnsvz http://abegopsu.biz/?abegopsu abegopsu https://bcmstuvy.net/?bcmstuvy bcmstuvy https://bhjknsuz.net/?bhjknsuz bhjknsuz http://aceklno.info/?aceklno aceklno http://diotvyz.ru/?diotvyz diotvyz https://aefkmruw.info/?aefkmruw aefkmruw https://cdghlmnz.info/?cdghlmnz cdghlmnz https://bcgijmow.net/?bcgijmow bcgijmow http://moruwx.ua/?moruwx moruwx https://bhjnrt.ru/?bhjnrt bhjnrt https://eimoptu.ua/?eimoptu eimoptu https://achptv.info/?achptv achptv https://boptux.biz/?boptux boptux http://acmovz.ua/?acmovz acmovz https://abfmnpw.info/?abfmnpw abfmnpw https://afiknvz.biz/?afiknvz afiknvz http://ehnps.ua/?ehnps ehnps https://dnpuw.biz/?dnpuw dnpuw https://abflmns.biz/?abflmns abflmns https://fmuyz.biz/?fmuyz fmuyz http://bejkvz.info/?bejkvz bejkvz https://dfkmnyz.ua/?dfkmnyz dfkmnyz http://cfgnsvx.biz/?cfgnsvx cfgnsvx http://abdjsw.ru/?abdjsw abdjsw https://abelrvx.biz/?abelrvx abelrvx https://abjmoy.ru/?abjmoy abjmoy http://hmnqz.net/?hmnqz hmnqz https://dhkntv.ru/?dhkntv dhkntv http://bgqsyz.biz/?bgqsyz bgqsyz http://eghkmsux.net/?eghkmsux eghkmsux https://acehlrsv.info/?acehlrsv acehlrsv https://lnpruvy.com/?lnpruvy lnpruvy http://cefmtu.ua/?cefmtu cefmtu https://hjnqrt.net/?hjnqrt hjnqrt http://ceiktvw.ru/?ceiktvw ceiktvw http://cfjmu.info/?cfjmu cfjmu https://cfhmn.com/?cfhmn cfhmn http://cdfqr.net/?cdfqr cdfqr http://adelmvz.ru/?adelmvz adelmvz http://abflnsvx.com/?abflnsvx abflnsvx http://bjpqv.biz/?bjpqv bjpqv https://fjpqux.biz/?fjpqux fjpqux http://fpqtvyz.net/?fpqtvyz fpqtvyz http://cdflstwz.net/?cdflstwz cdflstwz https://egjostux.ua/?egjostux egjostux https://cjlnpy.ua/?cjlnpy cjlnpy https://bcfgmz.com/?bcfgmz bcfgmz http://dfnouz.ua/?dfnouz dfnouz http://cfikqv.net/?cfikqv cfikqv https://dehkqw.net/?dehkqw dehkqw http://fhkqs.com/?fhkqs fhkqs http://dijkl.com/?dijkl dijkl http://aeglu.com/?aeglu aeglu http://dgikmpy.info/?dgikmpy dgikmpy https://brsuz.info/?brsuz brsuz http://acfglopu.ru/?acfglopu acfglopu https://cdjqx.net/?cdjqx cdjqx https://aefimst.biz/?aefimst aefimst http://bfikmuw.net/?bfikmuw bfikmuw http://abhioqvw.ua/?abhioqvw abhioqvw 
http://abdilmo.net/?abdilmo abdilmo https://bhlovx.info/?bhlovx bhlovx https://fgvwxz.info/?fgvwxz fgvwxz http://afhjpqsv.biz/?afhjpqsv afhjpqsv http://deilrsv.com/?deilrsv deilrsv http://cijlmnt.ru/?cijlmnt cijlmnt https://bfkmtw.ua/?bfkmtw bfkmtw https://adkxy.ru/?adkxy adkxy https://blpqxy.com/?blpqxy blpqxy https://efgilor.net/?efgilor efgilor https://abcdky.info/?abcdky abcdky https://befgikqu.ru/?befgikqu befgikqu https://ipsuvwy.com/?ipsuvwy ipsuvwy https://fjlsu.biz/?fjlsu fjlsu https://kpquw.com/?kpquw kpquw https://bfjkoqvy.net/?bfjkoqvy bfjkoqvy http://dijopsz.biz/?dijopsz dijopsz http://acfhrsz.ua/?acfhrsz acfhrsz https://bfhrstvy.ua/?bfhrstvy bfhrstvy http://afglw.info/?afglw afglw https://bcjqrvw.net/?bcjqrvw bcjqrvw https://hiotz.ru/?hiotz hiotz https://abefmswx.biz/?abefmswx abefmswx http://acgkopyz.ua/?acgkopyz acgkopyz https://fijln.com/?fijln fijln http://befmquwz.info/?befmquwz befmquwz https://hijory.ua/?hijory hijory http://cdfhr.info/?cdfhr cdfhr https://ahiksx.ua/?ahiksx ahiksx https://aghmxy.ru/?aghmxy aghmxy https://hnopqt.net/?hnopqt hnopqt https://fiklnrsw.ru/?fiklnrsw fiklnrsw http://hknpv.ru/?hknpv hknpv https://abdfmotw.net/?abdfmotw abdfmotw http://bcfhpvx.biz/?bcfhpvx bcfhpvx http://beimrty.net/?beimrty beimrty http://dgnsvwy.ua/?dgnsvwy dgnsvwy https://aghjstvw.com/?aghjstvw aghjstvw http://gpqsuxy.ru/?gpqsuxy gpqsuxy https://finop.info/?finop finop http://bcdhsy.biz/?bcdhsy bcdhsy http://cfhmnvy.net/?cfhmnvy cfhmnvy https://hjopuw.net/?hjopuw hjopuw https://akmou.biz/?akmou akmou https://fgkntuw.biz/?fgkntuw fgkntuw https://adfjsw.com/?adfjsw adfjsw https://bdjmrsty.biz/?bdjmrsty bdjmrsty http://grtuz.ru/?grtuz grtuz https://cgiqw.info/?cgiqw cgiqw http://acenpvw.ru/?acenpvw acenpvw https://dmnrz.ru/?dmnrz dmnrz http://deglqtu.com/?deglqtu deglqtu http://ilpqrtuw.com/?ilpqrtuw ilpqrtuw https://gnoqvxz.com/?gnoqvxz gnoqvxz https://hijkmsu.ru/?hijkmsu hijkmsu http://cistxyz.biz/?cistxyz cistxyz https://adklmns.ru/?adklmns adklmns https://hmqrwy.ua/?hmqrwy hmqrwy https://crsuv.ru/?crsuv crsuv http://abejkmoz.ua/?abejkmoz abejkmoz http://abjpqu.ua/?abjpqu abjpqu https://ehjkopuz.info/?ehjkopuz ehjkopuz http://iovwz.ua/?iovwz iovwz https://emntvwx.net/?emntvwx emntvwx http://eghksx.info/?eghksx eghksx https://bmnps.com/?bmnps bmnps https://cinopuv.net/?cinopuv cinopuv http://efijnptz.biz/?efijnptz efijnptz http://cehilnp.net/?cehilnp cehilnp http://denprw.ru/?denprw denprw http://adglpwz.net/?adglpwz adglpwz http://dehimosu.biz/?dehimosu dehimosu https://abchprwy.com/?abchprwy abchprwy http://dlnory.info/?dlnory dlnory http://cehnwyz.com/?cehnwyz cehnwyz http://aghlopy.ua/?aghlopy aghlopy http://fhjksuvz.com/?fhjksuvz fhjksuvz http://bijmy.com/?bijmy bijmy https://adhjw.com/?adhjw adhjw https://emnprwyz.info/?emnprwyz emnprwyz http://dgmnw.ru/?dgmnw dgmnw https://cdgimruv.com/?cdgimruv cdgimruv http://bcdjqst.biz/?bcdjqst bcdjqst https://lmquv.ru/?lmquv lmquv http://degiptv.info/?degiptv degiptv http://hlnvy.ru/?hlnvy hlnvy http://gklmvx.info/?gklmvx gklmvx http://behlpstz.com/?behlpstz behlpstz https://efginz.com/?efginz efginz https://deilmnru.biz/?deilmnru deilmnru http://bdehoqux.ru/?bdehoqux bdehoqux http://doqvy.info/?doqvy doqvy http://jlorw.info/?jlorw jlorw https://cfmtvy.net/?cfmtvy cfmtvy http://agkmqvxy.biz/?agkmqvxy agkmqvxy https://ehnrux.ru/?ehnrux ehnrux https://ahlmpyz.biz/?ahlmpyz ahlmpyz https://dhilqxyz.com/?dhilqxyz dhilqxyz http://beosz.biz/?beosz beosz http://jkmtvx.ru/?jkmtvx jkmtvx http://aefiuw.ua/?aefiuw aefiuw 
From mailinglists at xgm.de Thu Nov 29 02:12:58 2018
From: mailinglists at xgm.de (Florian Lindner)
Date: Thu, 29 Nov 2018 09:12:58 +0100
Subject: [petsc-users] IS Invert Not Permutation
In-Reply-To: 
References: <63cd6b99-5b2a-1b61-a006-d9b96bde9830@xgm.de> <67b84dd1-1e2a-a06b-c5d7-d4d5697ef1bc@xgm.de>
Message-ID: <25a69b1e-3aca-fb53-79ff-8062849e29ab@xgm.de>

On 28.11.18 at 18:12, Matthew Knepley wrote:
> On Wed, Nov 28, 2018 at 12:06 PM Florian Lindner wrote:
> 
>     Hey,
> 
>     thanks for your quick reply!
> 
>     As far as I understand how MatSetLocalToGlobalMapping works, there is no way to create such a mapping that can be used for MatSetValuesLocal?
> 
>     What I need is basically
> 
>     MatSetValuesLocal row/col argument = 3 / 4 / 6 maps to local row/col 0 / 1 / 2
> 
> You are inverting the meaning of MatSetValuesLocal(). It takes in local indices (0, 1, 2) and sets the
> values into a global matrix using global indices (3, 4, 6) which it gets from a compact table.
> 
> If you have global indices (3, 4, 6), use MatSetValues() directly.
> 
> Why are you trying to convert global indices to local indices, which requires a search?

Hey,

we are building a matrix for RBF interpolation. Given two sets of vertices, this requires evaluating a function on every pair of vertices. The mesh is decomposed. The vertices have a globally unique, but not necessarily contiguous, globalIndex.

Rank 1 has the local portion of the mesh with the globalIndizes = {3, 4, 6}.

| (1,1) (1,2) | (1,3) (1,4) (1,6) (1,7) (1,8)    (off rank row)
| (2,1) (2,2) | (2,3) (2,4) (2,6) (2,7) (2,8)    (off rank row)
--
  (3,1) (3,2) | (3,3) (3,4) (3,6) | (3,7) (3,8)
  (4,1) (4,2) | (4,3) (4,4) (4,6) | (4,7) (4,8)
  (6,1) (6,2) | (6,3) (6,4) (6,6) | (6,7) (6,8)
--
(more off rank rows)

(processor boundaries marked with | and --)

Going through local vertices, I could just count and use MatSetValuesLocal without a mapping and set the rows as {0, 1, 2}.

But for the cols I need a global enumeration. Just counting is not an option. While the locally owned vertices are disjoint on each rank, each rank has some additional (halo-like) vertices. It does not own these (it doesn't build rows for them, but uses them when building rows for its own vertices). The size of these halos depends on the basis function used. I can't use globalIndex directly, as it may contain holes and so on.

For that I go through all local vertices and collect their globalIndex in an std::vector.
My original implementation was to create an IS from the globalIndizes, then use ISInvertPermutation, ISAllGather and ISLocalToGlobalMappingCreateIS. My problem is that ISInvertPermutation does not allow the IS to have holes in it.

Following your advice, I now use an AO instead and translate the rowIdx (which maybe I can omit) and the colIdx using AOApplicationToPetsc before calling MatSetValues, see below.
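A minimal sketch of that translation (illustrative only: the sizes are hard-coded for the {3, 4, 6} example above, error checking is omitted, and I believe passing NULL for the PETSc ordering lets AOCreateMapping choose a contiguous numbering; 'matrix' is the Mat as in the loop further below):

    #include <petscao.h>
    #include <petscmat.h>

    AO       ao;
    PetscInt appIdx[3] = {3, 4, 6};  /* globally unique, possibly non-contiguous application indices */

    /* Build the translation table once for the locally owned vertices. */
    AOCreateMapping(PETSC_COMM_WORLD, 3, appIdx, NULL, &ao);

    /* Translate application indices to PETSc indices right before insertion. */
    PetscInt    rows[1] = {3}, cols[1] = {6};
    PetscScalar v       = 1.0;
    AOApplicationToPetsc(ao, 1, rows);
    AOApplicationToPetsc(ao, 1, cols);
    MatSetValues(matrix, 1, rows, 1, cols, &v, INSERT_VALUES);

    AODestroy(&ao);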
Any thoughts on that matter are deeply appreciated!

Best,
Florian

> >   Thanks,
> >
> >     Matt
>
> It seems to me that if I set an index set with MatSetLocalToGlobalMapping like {3, 4, 6}, it does the opposite, i.e. maps local {0, 1, 2} to global {3, 4, 6}.
>
> Therefore I probably need to create my own translation of {3, 4, 6} to {0, 1, 2}.
>
> A first sketch looks like this:
>
> mapping[rank] = {3, 4, 6}
>
> for (auto i : mapping[rank]) {
>     for (auto j : mapping[rank]) {
>       cout << "Setting " << i << ", " << j << endl;
>       PetscScalar v[] = {i*10.0 + j};
>       PetscInt rows[] = {i};
>       PetscInt cols[] = {j};
>       AOApplicationToPetsc(ao, 1, rows);
>       AOApplicationToPetsc(ao, 1, cols);
>       cout << "Really setting " << rows[0] << ", " << cols[0] << endl << endl;
>       ierr = MatSetValues(matrix, 1, rows, 1, cols, v, INSERT_VALUES); CHKERRQ(ierr);
>     }
>   }
>
> which seems to work so far.
>
> Any comments from your side are appreciated!
>
> Of course, I should set as many values as possible per MatSetValues call...
>
> Best,
> Florian
>
> On 27.11.18 at 16:33, Matthew Knepley wrote:
> > On Tue, Nov 27, 2018 at 10:17 AM Florian Lindner via petsc-users wrote:
> >
> >     Hello,
> >
> >     I have a range of local input data indices that I want to use for row indexing, say {3, 4, 6}.
> >
> >     For that, I create a matrix with a local number of rows of 3 and map the indices {3, 4, 5} to these rows.
> >
> > It seems like you are trying to create a mapping and its inverse. We do that in
> >
> > https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/AO/AOCreateMapping.html
> >
> > Note that this is not really scalable. Usually there is a way to restructure the code to avoid this.
> >
> >   Thanks,
> >
> >     Matt
> >
> >     I create an index set:
> >
> >     ISCreateGeneral(comm, myIndizes.size(), myIndizes.data(), PETSC_COPY_VALUES, &ISlocal);
> >     ISSetPermutation(ISlocal);
> >     ISInvertPermutation(ISlocal, myIndizes.size(), &ISlocalInv);
> >     ISAllGather(ISlocalInv, &ISglobal); // Gather the IS from all processors
> >     ISLocalToGlobalMappingCreateIS(ISglobal, &ISmapping); // Make it a mapping
> >
> >     MatSetLocalToGlobalMapping(matrix, ISmapping, ISmapping); // Set mapping for rows and cols
> >
> >     The InvertPermutation is required because, well, that seems to be the direction the mapping stuff works in PETSc.
> >
> >     Now I can use MatSetValuesLocal to set rows {3, 4, 5} and actually set the local rows {1, 2, 3}.
> >
> >     The problem is that the range of data vertices is not always contiguous, i.e. it could be {3, 4, 6}. This seems not to be a permutation for PETSc, therefore ISSetPermutation fails and subsequently ISInvertPermutation.
> >
> >     How can I still create such a mapping, so that:
> >
> >     3 -> 1
> >     4 -> 2
> >     6 -> 6
> >
> >     row 5 is unassigned and will never be addressed.
> >
> >     Thanks!
> >     Florian
> >
> > --
> > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> > -- Norbert Wiener
> >
> > https://www.cse.buffalo.edu/~knepley/

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

From edoardo.alinovi at gmail.com Thu Nov 29 03:12:09 2018
From: edoardo.alinovi at gmail.com (Edoardo alinovi)
Date: Thu, 29 Nov 2018 10:12:09 +0100
Subject: [petsc-users] Compile petsc using intel mpi
In-Reply-To: 
References: 
Message-ID: 

Hello guys,

thank you very much for your suggestions and sorry for getting back to you late. Unfortunately, my attempts to compile with Intel have not been successful so far.

Here is my command:

sudo ./configure --prefix=/home/edo/software/petsc-3.10.2 PETSC_ARCH=arch-intel-opt --with-cc=mpiicc --with-cxx=mpiiccp --with-fc=mpiifort FOPTFLAGS='-O3' COPTFLAGS='-O3' CXXOPTFLAGS='-O3' --with-blas-lapack-dir=/home/edo/intel/mkl/lib/intel64/ --with-debugging=no --download-fblaslapack=1 --download-superlu_dist --download-mumps --download-hypre --download-metis --download-parmetis

The log file states:

------------------ LOG ------------------

TEST checkCCompiler from config.setCompilers(/home/edo/software/petsc_intel/config/BuildSystem/config/setCompilers.py:587)
TESTING: checkCCompiler from config.setCompilers(config/BuildSystem/config/setCompilers.py:587)
  Locate a functional C compiler
Checking for program /usr/sbin/mpiicc...not found
Checking for program /usr/bin/mpiicc...not found
Checking for program /sbin/mpiicc...not found
Checking for program /bin/mpiicc...not found
Checking for program /home/edo/software/petsc_intel/lib/petsc/bin/win32fe/mpiicc...not found
*******************************************************************************
         UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details):
-------------------------------------------------------------------------------
C compiler you provided with -with-cc=mpiicc does not work.

------------------------------------

It seems that PETSc is not able to find mpiicc. However, the path to Intel is well defined in my .bashrc and I can easily compile a test file with those compilers.

If I put the full path in --with-cc= ...
then I get: ------------------ LOG ------------------ TEST checkCCompiler from config.setCompilers(/home/edo/software/petsc_intel/config/BuildSystem/config/setCompilers.py:587) TESTING: checkCCompiler from config.setCompilers(config/BuildSystem/config/setCompilers.py:587) Locate a functional C compiler Checking for program /home/edo/intel/compilers_and_libraries_2019/linux/mpi/intel64/bin/mpiicc...found Defined make macro "CC" to "/home/edo/intel/compilers_and_libraries_2019/linux/mpi/intel64/bin/mpiicc" Pushing language C All intermediate test results are stored in /tmp/petsc-ZLGfap All intermediate test results are stored in /tmp/petsc-ZLGfap/config.setCompilers Executing: /home/edo/intel/compilers_and_libraries_2019/linux/mpi/intel64/bin/mpiicc -c -o /tmp/petsc-ZLGfap/config.setCompilers/conftest.o -I/tmp/petsc-ZLGfap/config.setCompilers /tmp/petsc-ZLGfap/config.setCompilers/conftest.c Possible ERROR while running compiler: exit code 32512 stderr: /home/edo/intel/compilers_and_libraries_2019/linux/mpi/intel64/bin/mpiicc: line 557: icc: command not found Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Popping language C Error testing C compiler: Cannot compile C with /home/edo/intel/compilers_and_libraries_2019/linux/mpi/intel64/bin/mpiicc. Deleting "CC" ------------------------------------ Do you see any error in my setup? Thank you very much! ------ Edoardo Alinovi, Ph.D. DICCA, Scuola Politecnica, Universita' degli Studi di Genova, 1, via Montallegro, 16145 Genova, Italy Il giorno mar 27 nov 2018 alle ore 17:16 Matthew Knepley ha scritto: > On Tue, Nov 27, 2018 at 6:25 AM Edoardo alinovi via petsc-users < > petsc-users at mcs.anl.gov> wrote: > >> Dear users, >> >> I have installed intel parallel studio on my workstation and thus I >> would like to take advantage of intel compiler. >> >> Before messing up my installation, have you got some guidelines to >> survive at this attempt? I have found here in the mailing list the >> following instructions: >> >> --with-cc=icc --with-fc=ifort --with-mpi-include=/path-to-intel >> --with-mpi-lib=/path-to-intel >> > > Yes, this looks right. > > >> Are they correct? >> >> Also I have an already existing and clean installation of petsc using >> openmpi. I would like to retain this installtion since it is working very >> well and switching between the two somehow. Any tips on this? >> > > When you do the Intel configuration, use a different name like > --PETSC_ARCH=arch-intelps-opt > > Thanks, > > Matt > > >> I will never stop to say thank you for your precious support! >> >> Edoardo >> >> ------ >> >> Edoardo Alinovi, Ph.D. >> >> DICCA, Scuola Politecnica, >> Universita' degli Studi di Genova, >> 1, via Montallegro, >> 16145 Genova, Italy >> >> >> > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From knepley at gmail.com Thu Nov 29 06:35:16 2018 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 29 Nov 2018 07:35:16 -0500 Subject: [petsc-users] IS Invert Not Permutation In-Reply-To: <25a69b1e-3aca-fb53-79ff-8062849e29ab@xgm.de> References: <63cd6b99-5b2a-1b61-a006-d9b96bde9830@xgm.de> <67b84dd1-1e2a-a06b-c5d7-d4d5697ef1bc@xgm.de> <25a69b1e-3aca-fb53-79ff-8062849e29ab@xgm.de> Message-ID: On Thu, Nov 29, 2018 at 3:12 AM Florian Lindner wrote: > > > Am 28.11.18 um 18:12 schrieb Matthew Knepley: > > On Wed, Nov 28, 2018 at 12:06 PM Florian Lindner > wrote: > > > > Hey, > > > > thanks for your quick reply! > > > > As far as I understand how MatSetLocalToGlobalMapping works, there > is no way to create such a mapping that can be used for MatSetValuesLocal? > > > > What I need is basically > > > > MatSetValuesLocal row/col argument = 3 / 4 / 6 maps to local row/col > 0 / 1 / 2 > > > > > > You are inverting the meaning of MatSetValuesLocal(). It takes in local > indices (0, 1, 2) and sets the > > values into a global matrix using global indices (3, 4, 6) which it gets > from a compact table. > > > > It you have global indices (3, 4, 6) use MatSetValues() directly. > > > > Why are you trying to convert global indices to local indices, which > requires a search. > > Hey, > > we are building a matrix for RBF interpolation. Given two sets of > vertices, this requires to evaluate a function on every pair of vertices. > The mesh is decomposed. The vertices have a globally unique, but not > necessarily continous globalIndex. > > A rank 1 has the local portion of the mesh with the globalIndizes = {3, 4, > 6}. > > | (1,1) (1,2) | (1,3) (1,4) (1,6) (1,7) (1,8) (off rank row) > | (2,1) (2,2) | (2,3) (2,4) (2,6) (2,7) (3,8) (off rank row) > -- > (3,1) (3,2) | (3,3) (3,4) (3,6) | (3,7) (3,8) > (4,1) (4,2) | (4,3) (4,4) (4,6) | (4,7) (4,8) > (6,1) (6,2) | (6,3) (6,4) (6,6) | (6,7) (6,8) > -- > (more off rank rows) > > (marked processor boundaries with | and --) > > Going through local vertices, I could just count and use MatSetValuesLocal > without a mapping and set the rows as {0, 1, 2}. > > But for the cols I need a global enumeration. Just counting is not an > option. While the locally owned vertices are disjoint on each rank, each > rank has some additional (halo like) vertices vertices. It does not own > these (it doesn't build rows for them) but uses them building rows for its > own vertices). The size of these halos depend on the basis function used. I > can't use globalIndex directly, as it may contain holes and so on. > > For that I go through all local vertices and collect their globalIndex in > an std::vector. My original implementation was to create an IS from the > globalIndizes, ISInvertPermutation, ISAllGather and > ISLocalToGlobalMappingCreateIS. > > My problem is, that InvertPermutation does not allow the IS to have holes > in it. > > Following your advice, I use an AO instead and translate the rowIdx (that > maybe I can omit) and the colIdx using AOApplicationToPetsc before calling > MatSetValues, see below. > > Any thoughts on that matter are deeply appreciated! > Renumber the vertices to be contiguous. This is what we do when repartitioning meshes. Thanks, Matt > Best, > Florian > > > > For that, my original implementation was to go ro > > > > > Thanks, > > > > Matt > > > > > > I seems to me, that if I set an index set with > MatSetLocalToGlobalMapping like {3, 4, 6} it does the opposite. i.e. 
maps > local {0, 1, 2] to global {3, 4, 6} > > > > Therefore I probably need to create my own translation of {3, 4, 6} > to {0, 1, 2} > > > > A first sketch looks like that > > > > mapping[rank] = {3, 4, 6} > > > > for (auto i : mapping[rank]) { > > for (auto j : mapping[rank]) { > > cout << "Setting " << i << ", " << j << endl; > > PetscScalar v[] = {i*10.0 + j}; > > PetscInt rows[] = {i}; > > PetscInt cols[] = {j}; > > AOApplicationToPetsc(ao, 1, rows); > > AOApplicationToPetsc(ao, 1, cols); > > cout << "Really setting " << rows[0] << ", " << cols[0] << > endl << endl; > > ierr = MatSetValues(matrix, 1, rows, 1, cols, v, > INSERT_VALUES); CHKERRQ(ierr); > > } > > } > > > > which seems to work so far. > > > > Any comments from your side are appreciated! > > > > Of course, I should set as much values as possible using > MatSetValues... > > > > Best, > > Florian > > > > > > Am 27.11.18 um 16:33 schrieb Matthew Knepley: > > > On Tue, Nov 27, 2018 at 10:17 AM Florian Lindner via petsc-users < > petsc-users at mcs.anl.gov petsc-users at mcs.anl.gov >> wrote: > > > > > > Hello, > > > > > > I have a range of local input data indices that I want to use > for row indexing, say { 3, 4, 6}. > > > > > > For that, I create a matrix with a local number of rows of 3 > and map the indices {3, 4, 5} to these rows. > > > > > > > > > It seems like you are trying to create a mapping and its inverse. > We do that in > > > > > > > https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/AO/AOCreateMapping.html > > > > > > Note that this is not really scalable. Usually there is a way to > restructure the code to avoid this. > > > > > > Thanks, > > > > > > Matt > > > > > > > > > I create an index set: > > > > > > ISCreateGeneral(comm, myIndizes.size(), myIndizes.data(), > PETSC_COPY_VALUES, &ISlocal); > > > ISSetPermutation(ISlocal); > > > ISInvertPermutation(ISlocal, myIndizes.size(), &ISlocalInv); > > > ISAllGather(ISlocalInv, &ISglobal); // Gather the IS from all > processors > > > ISLocalToGlobalMappingCreateIS(ISglobal, &ISmapping); // Make > it a mapping > > > > > > MatSetLocalToGlobalMapping(matrix, ISmapping, ISmapping); // > Set mapping for rows and cols > > > > > > The InvertPermutation is required because, well, it seems that > the direction the mapping stuff works in PETSc. > > > > > > Now I can use MatSetValues local to set rows {3, 4, 5} and > actually set the local rows {1, 2, 3}. > > > > > > The problem is that the range of data vertices is not always > contiguous, i.e. could be {3, 4, 6}. This seems to be not a permutation of > PETSc, therefore ISSetPermutation fails and subsequently > ISInvertPermutation. > > > > > > How can I still create such a mapping, so that: > > > > > > 3 -> 1 > > > 4 -> 2 > > > 6 -> 6 > > > > > > row 5 is unassigned and will never be addressed. > > > > > > Thanks! > > > Florian > > > > > > > > > > > > > > > -- > > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > > > -- Norbert Wiener > > > > > > https://www.cse.buffalo.edu/~knepley/ < > http://www.cse.buffalo.edu/~knepley/> > > > > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. 
> > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ < > http://www.cse.buffalo.edu/~knepley/> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Nov 29 07:10:09 2018 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 29 Nov 2018 08:10:09 -0500 Subject: [petsc-users] Compile petsc using intel mpi In-Reply-To: References: Message-ID: On Thu, Nov 29, 2018 at 4:23 AM Edoardo alinovi wrote: > Hello guys, > > thank you very much for you suggestions and sorry for getting back you > late. Unfortunately, actually my attempts to compile with intel are not > successful. > > Here my command: > Do NOT sudo the configure. This is really dangerous, and as you see 'root' has a different path than you do. Run configure normally and make, and only use 'sudo' for 'make install'. Thanks, Matt > sudo ./configure --prefix=/home/edo/software/petsc-3.10.2 > PETSC_ARCH=arch-intel-opt --with-cc=mpiicc --with-cxx=mpiiccp > --with-fc=mpiifort FOPTFLAGS='-O3' COPTFLAGS='-O3' CXXOPTFLAGS='-O3' > --with-blas-lapack-dir=/home/edo/intel/mkl/lib/intel64/ --with-debugging=no > --download-fblaslapack=1 --download-superlu_dist --download-mumps > --download-hypre --download-metis --download-parmetis > > The log file states that: > > ------------------ LOG ------------------ > > TEST checkCCompiler from > config.setCompilers(/home/edo/software/petsc_intel/config/BuildSystem/config/setCompilers.py:587) > TESTING: checkCCompiler from > config.setCompilers(config/BuildSystem/config/setCompilers.py:587) > Locate a functional C compiler > Checking for program /usr/sbin/mpiicc...not found > Checking for program /usr/bin/mpiicc...not found > Checking for program /sbin/mpiicc...not found > Checking for program /bin/mpiicc...not found > Checking for program > /home/edo/software/petsc_intel/lib/petsc/bin/win32fe/mpiicc...not found > > ******************************************************************************* > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for > details): > > ------------------------------------------------------------------------------- > C compiler you provided with -with-cc=mpiicc does not work. > > ------------------------------------ > > It seems that petsc is not able to find mpiicc. However, the path to intel > is well defined in my .bashrc and I can easily compile a test file with > those compiler. > > if I put the full path in --with-cc= ... 
then I get: > > ------------------ LOG ------------------ > > TEST checkCCompiler from > config.setCompilers(/home/edo/software/petsc_intel/config/BuildSystem/config/setCompilers.py:587) > TESTING: checkCCompiler from > config.setCompilers(config/BuildSystem/config/setCompilers.py:587) > Locate a functional C compiler > Checking for program > /home/edo/intel/compilers_and_libraries_2019/linux/mpi/intel64/bin/mpiicc...found > Defined make macro "CC" to > "/home/edo/intel/compilers_and_libraries_2019/linux/mpi/intel64/bin/mpiicc" > Pushing language C > All intermediate test results are stored in > /tmp/petsc-ZLGfap > All intermediate test results are stored in > /tmp/petsc-ZLGfap/config.setCompilers > Executing: > /home/edo/intel/compilers_and_libraries_2019/linux/mpi/intel64/bin/mpiicc > -c -o /tmp/petsc-ZLGfap/config.setCompilers/conftest.o > -I/tmp/petsc-ZLGfap/config.setCompilers > /tmp/petsc-ZLGfap/config.setCompilers/conftest.c > Possible ERROR while running compiler: exit code 32512 > stderr: > /home/edo/intel/compilers_and_libraries_2019/linux/mpi/intel64/bin/mpiicc: > line 557: icc: command not found > Source: > #include "confdefs.h" > #include "conffix.h" > > int main() { > ; > return 0; > } > Popping language C > Error testing C compiler: Cannot compile C with > /home/edo/intel/compilers_and_libraries_2019/linux/mpi/intel64/bin/mpiicc. > Deleting "CC" > > ------------------------------------ > > Do you see any error in my setup? > > Thank you very much! > > ------ > > Edoardo Alinovi, Ph.D. > > DICCA, Scuola Politecnica, > Universita' degli Studi di Genova, > 1, via Montallegro, > 16145 Genova, Italy > > > Il giorno mar 27 nov 2018 alle ore 17:16 Matthew Knepley < > knepley at gmail.com> ha scritto: > >> On Tue, Nov 27, 2018 at 6:25 AM Edoardo alinovi via petsc-users < >> petsc-users at mcs.anl.gov> wrote: >> >>> Dear users, >>> >>> I have installed intel parallel studio on my workstation and thus I >>> would like to take advantage of intel compiler. >>> >>> Before messing up my installation, have you got some guidelines to >>> survive at this attempt? I have found here in the mailing list the >>> following instructions: >>> >>> --with-cc=icc --with-fc=ifort --with-mpi-include=/path-to-intel >>> --with-mpi-lib=/path-to-intel >>> >> >> Yes, this looks right. >> >> >>> Are they correct? >>> >>> Also I have an already existing and clean installation of petsc using >>> openmpi. I would like to retain this installtion since it is working very >>> well and switching between the two somehow. Any tips on this? >>> >> >> When you do the Intel configuration, use a different name like >> --PETSC_ARCH=arch-intelps-opt >> >> Thanks, >> >> Matt >> >> >>> I will never stop to say thank you for your precious support! >>> >>> Edoardo >>> >>> ------ >>> >>> Edoardo Alinovi, Ph.D. >>> >>> DICCA, Scuola Politecnica, >>> Universita' degli Studi di Genova, >>> 1, via Montallegro, >>> 16145 Genova, Italy >>> >>> >>> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From edoardo.alinovi at gmail.com Thu Nov 29 07:07:00 2018 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Thu, 29 Nov 2018 14:07:00 +0100 Subject: [petsc-users] Compile petsc using intel mpi In-Reply-To: References: Message-ID: Ok, this makes sense, but the reason to use sudo was this one: [Errno 13] Permission denied: 'RDict.log' File "./config/configure.py", line 391, in petsc_configure framework = config.framework.Framework(['--configModules=PETSc.Configure','--optionsModule=config.compilerOptions']+sys.argv[1:], loadArgDB = 0) File "/home/edo/software/petsc_intel/config/BuildSystem/config/framework.py", line 80, in __init__ argDB = RDict.RDict(load = loadArgDB) File "/home/edo/software/petsc_intel/config/BuildSystem/RDict.py", line 90, in __init__ self.setupLogFile() File "/home/edo/software/petsc_intel/config/BuildSystem/RDict.py", line 145, in setupLogFile self.logFile = file(filename, 'a') ------ Edoardo Alinovi, Ph.D. DICCA, Scuola Politecnica, Universita' degli Studi di Genova, 1, via Montallegro, 16145 Genova, Italy Il giorno gio 29 nov 2018 alle ore 14:10 Matthew Knepley ha scritto: > On Thu, Nov 29, 2018 at 4:23 AM Edoardo alinovi > wrote: > >> Hello guys, >> >> thank you very much for you suggestions and sorry for getting back you >> late. Unfortunately, actually my attempts to compile with intel are not >> successful. >> >> Here my command: >> > > Do NOT sudo the configure. This is really dangerous, and as you see 'root' > has a different path than you do. Run configure normally > and make, and only use 'sudo' for 'make install'. > > Thanks, > > Matt > > >> sudo ./configure --prefix=/home/edo/software/petsc-3.10.2 >> PETSC_ARCH=arch-intel-opt --with-cc=mpiicc --with-cxx=mpiiccp >> --with-fc=mpiifort FOPTFLAGS='-O3' COPTFLAGS='-O3' CXXOPTFLAGS='-O3' >> --with-blas-lapack-dir=/home/edo/intel/mkl/lib/intel64/ --with-debugging=no >> --download-fblaslapack=1 --download-superlu_dist --download-mumps >> --download-hypre --download-metis --download-parmetis >> >> The log file states that: >> >> ------------------ LOG ------------------ >> >> TEST checkCCompiler from >> config.setCompilers(/home/edo/software/petsc_intel/config/BuildSystem/config/setCompilers.py:587) >> TESTING: checkCCompiler from >> config.setCompilers(config/BuildSystem/config/setCompilers.py:587) >> Locate a functional C compiler >> Checking for program /usr/sbin/mpiicc...not found >> Checking for program /usr/bin/mpiicc...not found >> Checking for program /sbin/mpiicc...not found >> Checking for program /bin/mpiicc...not found >> Checking for program >> /home/edo/software/petsc_intel/lib/petsc/bin/win32fe/mpiicc...not found >> >> ******************************************************************************* >> UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for >> details): >> >> ------------------------------------------------------------------------------- >> C compiler you provided with -with-cc=mpiicc does not work. >> >> ------------------------------------ >> >> It seems that petsc is not able to find mpiicc. However, the path to >> intel is well defined in my .bashrc and I can easily compile a test file >> with those compiler. >> >> if I put the full path in --with-cc= ... 
then I get: >> >> ------------------ LOG ------------------ >> >> TEST checkCCompiler from >> config.setCompilers(/home/edo/software/petsc_intel/config/BuildSystem/config/setCompilers.py:587) >> TESTING: checkCCompiler from >> config.setCompilers(config/BuildSystem/config/setCompilers.py:587) >> Locate a functional C compiler >> Checking for program >> /home/edo/intel/compilers_and_libraries_2019/linux/mpi/intel64/bin/mpiicc...found >> Defined make macro "CC" to >> "/home/edo/intel/compilers_and_libraries_2019/linux/mpi/intel64/bin/mpiicc" >> Pushing language C >> All intermediate test results are stored in >> /tmp/petsc-ZLGfap >> All intermediate test results are stored in >> /tmp/petsc-ZLGfap/config.setCompilers >> Executing: >> /home/edo/intel/compilers_and_libraries_2019/linux/mpi/intel64/bin/mpiicc >> -c -o /tmp/petsc-ZLGfap/config.setCompilers/conftest.o >> -I/tmp/petsc-ZLGfap/config.setCompilers >> /tmp/petsc-ZLGfap/config.setCompilers/conftest.c >> Possible ERROR while running compiler: exit code 32512 >> stderr: >> /home/edo/intel/compilers_and_libraries_2019/linux/mpi/intel64/bin/mpiicc: >> line 557: icc: command not found >> Source: >> #include "confdefs.h" >> #include "conffix.h" >> >> int main() { >> ; >> return 0; >> } >> Popping language C >> Error testing C compiler: Cannot compile C with >> /home/edo/intel/compilers_and_libraries_2019/linux/mpi/intel64/bin/mpiicc. >> Deleting "CC" >> >> ------------------------------------ >> >> Do you see any error in my setup? >> >> Thank you very much! >> >> ------ >> >> Edoardo Alinovi, Ph.D. >> >> DICCA, Scuola Politecnica, >> Universita' degli Studi di Genova, >> 1, via Montallegro, >> 16145 Genova, Italy >> >> >> Il giorno mar 27 nov 2018 alle ore 17:16 Matthew Knepley < >> knepley at gmail.com> ha scritto: >> >>> On Tue, Nov 27, 2018 at 6:25 AM Edoardo alinovi via petsc-users < >>> petsc-users at mcs.anl.gov> wrote: >>> >>>> Dear users, >>>> >>>> I have installed intel parallel studio on my workstation and thus I >>>> would like to take advantage of intel compiler. >>>> >>>> Before messing up my installation, have you got some guidelines to >>>> survive at this attempt? I have found here in the mailing list the >>>> following instructions: >>>> >>>> --with-cc=icc --with-fc=ifort --with-mpi-include=/path-to-intel >>>> --with-mpi-lib=/path-to-intel >>>> >>> >>> Yes, this looks right. >>> >>> >>>> Are they correct? >>>> >>>> Also I have an already existing and clean installation of petsc using >>>> openmpi. I would like to retain this installtion since it is working very >>>> well and switching between the two somehow. Any tips on this? >>>> >>> >>> When you do the Intel configuration, use a different name like >>> --PETSC_ARCH=arch-intelps-opt >>> >>> Thanks, >>> >>> Matt >>> >>> >>>> I will never stop to say thank you for your precious support! >>>> >>>> Edoardo >>>> >>>> ------ >>>> >>>> Edoardo Alinovi, Ph.D. >>>> >>>> DICCA, Scuola Politecnica, >>>> Universita' degli Studi di Genova, >>>> 1, via Montallegro, >>>> 16145 Genova, Italy >>>> >>>> >>>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >>> https://www.cse.buffalo.edu/~knepley/ >>> >>> >> > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. 
> -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Nov 29 07:19:35 2018 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 29 Nov 2018 08:19:35 -0500 Subject: [petsc-users] Compile petsc using intel mpi In-Reply-To: References: Message-ID: On Thu, Nov 29, 2018 at 8:17 AM Edoardo alinovi wrote: > Ok, this makes sense, but the reason to use sudo was this one: > sudo rm RDict.log Matt > [Errno 13] Permission denied: 'RDict.log' > File "./config/configure.py", line 391, in petsc_configure > framework = > config.framework.Framework(['--configModules=PETSc.Configure','--optionsModule=config.compilerOptions']+sys.argv[1:], > loadArgDB = 0) > File > "/home/edo/software/petsc_intel/config/BuildSystem/config/framework.py", > line 80, in __init__ > argDB = RDict.RDict(load = loadArgDB) > File "/home/edo/software/petsc_intel/config/BuildSystem/RDict.py", line > 90, in __init__ > self.setupLogFile() > File "/home/edo/software/petsc_intel/config/BuildSystem/RDict.py", line > 145, in setupLogFile > self.logFile = file(filename, 'a') > > ------ > > Edoardo Alinovi, Ph.D. > > DICCA, Scuola Politecnica, > Universita' degli Studi di Genova, > 1, via Montallegro, > 16145 Genova, Italy > > > Il giorno gio 29 nov 2018 alle ore 14:10 Matthew Knepley < > knepley at gmail.com> ha scritto: > >> On Thu, Nov 29, 2018 at 4:23 AM Edoardo alinovi < >> edoardo.alinovi at gmail.com> wrote: >> >>> Hello guys, >>> >>> thank you very much for you suggestions and sorry for getting back you >>> late. Unfortunately, actually my attempts to compile with intel are not >>> successful. >>> >>> Here my command: >>> >> >> Do NOT sudo the configure. This is really dangerous, and as you see >> 'root' has a different path than you do. Run configure normally >> and make, and only use 'sudo' for 'make install'. >> >> Thanks, >> >> Matt >> >> >>> sudo ./configure --prefix=/home/edo/software/petsc-3.10.2 >>> PETSC_ARCH=arch-intel-opt --with-cc=mpiicc --with-cxx=mpiiccp >>> --with-fc=mpiifort FOPTFLAGS='-O3' COPTFLAGS='-O3' CXXOPTFLAGS='-O3' >>> --with-blas-lapack-dir=/home/edo/intel/mkl/lib/intel64/ --with-debugging=no >>> --download-fblaslapack=1 --download-superlu_dist --download-mumps >>> --download-hypre --download-metis --download-parmetis >>> >>> The log file states that: >>> >>> ------------------ LOG ------------------ >>> >>> TEST checkCCompiler from >>> config.setCompilers(/home/edo/software/petsc_intel/config/BuildSystem/config/setCompilers.py:587) >>> TESTING: checkCCompiler from >>> config.setCompilers(config/BuildSystem/config/setCompilers.py:587) >>> Locate a functional C compiler >>> Checking for program /usr/sbin/mpiicc...not found >>> Checking for program /usr/bin/mpiicc...not found >>> Checking for program /sbin/mpiicc...not found >>> Checking for program /bin/mpiicc...not found >>> Checking for program >>> /home/edo/software/petsc_intel/lib/petsc/bin/win32fe/mpiicc...not found >>> >>> ******************************************************************************* >>> UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log >>> for details): >>> >>> ------------------------------------------------------------------------------- >>> C compiler you provided with -with-cc=mpiicc does not work. >>> >>> ------------------------------------ >>> >>> It seems that petsc is not able to find mpiicc. 
However, the path to >>> intel is well defined in my .bashrc and I can easily compile a test file >>> with those compiler. >>> >>> if I put the full path in --with-cc= ... then I get: >>> >>> ------------------ LOG ------------------ >>> >>> TEST checkCCompiler from >>> config.setCompilers(/home/edo/software/petsc_intel/config/BuildSystem/config/setCompilers.py:587) >>> TESTING: checkCCompiler from >>> config.setCompilers(config/BuildSystem/config/setCompilers.py:587) >>> Locate a functional C compiler >>> Checking for program >>> /home/edo/intel/compilers_and_libraries_2019/linux/mpi/intel64/bin/mpiicc...found >>> Defined make macro "CC" to >>> "/home/edo/intel/compilers_and_libraries_2019/linux/mpi/intel64/bin/mpiicc" >>> Pushing language C >>> All intermediate test results are stored in >>> /tmp/petsc-ZLGfap >>> All intermediate test results are stored in >>> /tmp/petsc-ZLGfap/config.setCompilers >>> Executing: >>> /home/edo/intel/compilers_and_libraries_2019/linux/mpi/intel64/bin/mpiicc >>> -c -o /tmp/petsc-ZLGfap/config.setCompilers/conftest.o >>> -I/tmp/petsc-ZLGfap/config.setCompilers >>> /tmp/petsc-ZLGfap/config.setCompilers/conftest.c >>> Possible ERROR while running compiler: exit code 32512 >>> stderr: >>> /home/edo/intel/compilers_and_libraries_2019/linux/mpi/intel64/bin/mpiicc: >>> line 557: icc: command not found >>> Source: >>> #include "confdefs.h" >>> #include "conffix.h" >>> >>> int main() { >>> ; >>> return 0; >>> } >>> Popping language C >>> Error testing C compiler: Cannot compile C with >>> /home/edo/intel/compilers_and_libraries_2019/linux/mpi/intel64/bin/mpiicc. >>> Deleting "CC" >>> >>> ------------------------------------ >>> >>> Do you see any error in my setup? >>> >>> Thank you very much! >>> >>> ------ >>> >>> Edoardo Alinovi, Ph.D. >>> >>> DICCA, Scuola Politecnica, >>> Universita' degli Studi di Genova, >>> 1, via Montallegro, >>> 16145 Genova, Italy >>> >>> >>> Il giorno mar 27 nov 2018 alle ore 17:16 Matthew Knepley < >>> knepley at gmail.com> ha scritto: >>> >>>> On Tue, Nov 27, 2018 at 6:25 AM Edoardo alinovi via petsc-users < >>>> petsc-users at mcs.anl.gov> wrote: >>>> >>>>> Dear users, >>>>> >>>>> I have installed intel parallel studio on my workstation and thus I >>>>> would like to take advantage of intel compiler. >>>>> >>>>> Before messing up my installation, have you got some guidelines to >>>>> survive at this attempt? I have found here in the mailing list the >>>>> following instructions: >>>>> >>>>> --with-cc=icc --with-fc=ifort --with-mpi-include=/path-to-intel >>>>> --with-mpi-lib=/path-to-intel >>>>> >>>> >>>> Yes, this looks right. >>>> >>>> >>>>> Are they correct? >>>>> >>>>> Also I have an already existing and clean installation of petsc using >>>>> openmpi. I would like to retain this installtion since it is working very >>>>> well and switching between the two somehow. Any tips on this? >>>>> >>>> >>>> When you do the Intel configuration, use a different name like >>>> --PETSC_ARCH=arch-intelps-opt >>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>>>> I will never stop to say thank you for your precious support! >>>>> >>>>> Edoardo >>>>> >>>>> ------ >>>>> >>>>> Edoardo Alinovi, Ph.D. >>>>> >>>>> DICCA, Scuola Politecnica, >>>>> Universita' degli Studi di Genova, >>>>> 1, via Montallegro, >>>>> 16145 Genova, Italy >>>>> >>>>> >>>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. 
>>>> -- Norbert Wiener
>>>>
>>>> https://www.cse.buffalo.edu/~knepley/
>>>>
>>>
>>
>> --
>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>>
>>
>
--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From natacha.bereux at gmail.com Thu Nov 29 07:41:41 2018
From: natacha.bereux at gmail.com (Natacha BEREUX)
Date: Thu, 29 Nov 2018 14:41:41 +0100
Subject: [petsc-users] Fortran interface of some petsc routines seem to be missing
In-Reply-To: 
References: 
Message-ID: 

Dear Barry,

thank you very much for this very detailed answer. It's perfectly clear now. I'll start writing the missing interfaces. I'll wait until I have made progress on writing the interfaces before deciding whether to create a pull request or send you part of the interface definitions.

In any case, thank you again for your help,
Best regards,
Natacha

On Tue, Nov 27, 2018 at 10:49 PM Smith, Barry F. wrote:

>     PETSc uses the Sowing package's bfort tool for automatically generating "Fortran stub functions" and interface definitions. There are certain C functions that bfort cannot handle, including (at least)
>
> 1) functions with character string arguments
> 2) functions with function pointer arguments (such as SNESSetFunction())
> 3) functions that support polymorphism between array and non-array arguments (for example VecSetValues(), where the ix and y arguments can be either scalars or arrays)
> 4) functions that can take a NULL argument.
>
>     Functions for which bfort can automatically generate the stub and the interface definition are marked with /*@ at the beginning of the comment above the function (see for example src/ksp/ksp/interface/itcreate.c KSPCreate()). If bfort cannot work, then the comment begins with /*@C (see for example src/sys/objects/pinit.c PetscInitialize()).
>
>     It is our intention to provide manual Fortran stubs and interfaces for functions that bfort cannot handle.
>
>     As you note, we do not currently provide all the interface definitions; many, even basic ones, are missing. This is because we wrote the manual stub functions many, many years ago when we wrote the C functions, but only in the past couple of years have we been providing interfaces, and no one went back and added all the required manual interface definitions.
>
>     I did not know about the gfortran option -Wimplicit-interface; this is a great tool for finding missing interfaces. Thanks for letting us know about it.
>
>     The manually generated stub functions are stored in files in the ftn-custom subdirectories next to where the C function is defined. The filename is the C file name prepended with a z and appended with an f. For example snes.c becomes ftn-custom/zsnesf.c. The manually generated interface definitions should go into the file src/XXX/f90-mod/petscYYY.h90; for example, the interfaces for VecSetValues are in src/vec/f90-mod/petscvec.h90.
>
>     If you are ambitious and know about pull requests, we'd be very happy to accept a pull request that provided the missing manual interface definitions.
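For concreteness, the two marker styles look like this in the C sources (a heavily abridged sketch; the real comment blocks carry the full manual-page text). The trailing C in /*@C is what tells bfort to skip the function, so its Fortran stub and interface must then be written by hand:

    /*@
       KSPCreate - Creates the default KSP context.
       ...
    @*/
    PetscErrorCode KSPCreate(MPI_Comm comm, KSP *inksp)
    { /* ... bfort generates the Fortran stub and interface for this one ... */ }

    /*@C
       PetscInitialize - Initializes the PETSc database and MPI.
       ...
    @*/
    PetscErrorCode PetscInitialize(int *argc, char ***args, const char file[], const char help[])
    { /* ... skipped by bfort; stub lives in a ftn-custom/z*f.c file ... */ }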
If you are less ambitious you could send us the interface > definitions that you create and we'll add them ourselves to the PETSc > repository. Or eventually we'll provide more and more of the interface > definitions as time permits. > > > Barry > > > > > > > > > On Nov 27, 2018, at 9:28 AM, Natacha BEREUX via petsc-users < > petsc-users at mcs.anl.gov> wrote: > > > > Hello, > > I work on a Fortran software that uses PETSc for linear solvers. > Therefore, we have a PETSc interface to convert our matrices to PETSc Mat. > > I have noticed several compiler warnings ( I use gfortran with > -Wimplicit-interface) during compilation. The warnings point out that some > (but not all) fortran interfaces are missing. > > > > The behaviour is the same when I compile a PETSC example. Below is > src/vec/vec/examples/tutorials/ex9f.F compiled with gfortran. > > > > mpif90 -c -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument > -Wimplicit-interface -g -O > -I/home/H03755/Librairies/petsc-3.10.1/include > -I/home/H03755/Librairies/petsc-3.10.1/linux-opt-mumps-ml-hypre-superlu/include > -I/home/H03755/dev/codeaster-prerequisites/v14/prerequisites/Mumps-512_consortium_aster3/MPI/include > -I/home/H03755/local/petsc/petsc-3.10.1/include -o ex9f.o ex9f.F90 > > ex9f.F90:34.53: > > > > call PetscInitialize(PETSC_NULL_CHARACTER,ierr) > > 1 > > Warning: Procedure 'petscinitialize' called with an implicit interface > at (1) > > ex9f.F90:42.91: > > > > if (size .ne. 2) then; call > PetscError(PETSC_COMM_WORLD,1,0,'Requires 2 processors'); call > MPIU_Abort(PETSC_COMM_WORLD,1); endif > > > 1 > > Warning: Procedure 'petscerror' called with an implicit interface at (1) > > ex9f.F90:42.128: > > > > if (size .ne. 2) then; call > PetscError(PETSC_COMM_WORLD,1,0,'Requires 2 processors'); call > MPIU_Abort(PETSC_COMM_WORLD,1); endif > > > 1 > > Warning: Procedure 'mpiu_abort' called with an implicit interface at (1) > > ex9f.F90:81.56: > > > > & PETSC_DECIDE,nghost,ifrom,tarray,gxs,ierr) > > 1 > > Warning: Procedure 'veccreateghostwitharray' called with an implicit > interface at (1) > > ex9f.F90:99.53: > > > > call VecGetOwnershipRange(gx,rstart,rend,ierr) > > 1 > > Warning: Procedure 'vecgetownershiprange' called with an implicit > interface at (1) > > ex9f.F90:115.93: > > > > call > PetscViewerGetSubViewer(PETSC_VIEWER_STDOUT_WORLD,PETSC_COMM_SELF,subviewer,ierr) > > > 1 > > Warning: Procedure 'petscviewergetsubviewer' called with an implicit > interface at (1) > > ex9f.F90:117.97: > > > > call > PetscViewerRestoreSubViewer(PETSC_VIEWER_STDOUT_WORLD,PETSC_COMM_SELF,subviewer,ierr) > > > 1 > > Warning: Procedure 'petscviewerrestoresubviewer' called with an implicit > interface at (1) > > ex9f.F90:121.31: > > > > call PetscFinalize(ierr) > > 1 > > Warning: Procedure 'petscfinalize' called with an implicit interface at > (1) > > > > > > Why does the compiler complain ? > > Did I miss something when I compiled PETSc library ? Is there a way to > properly generate all the Fortran interfaces in the compiled library ? > > Or is it normal that PETSc only generates some interfaces but not all ? > > In this case, is there a way to know which interfaces are explicitly and > automatically defined in PETSc library ? So that I can provide manually > the missing ones in my code ? > > > > Thanks a lot for your help ! > > Best regards, > > Natacha > > > > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: 

From overholt at capesim.com Thu Nov 29 09:18:31 2018
From: overholt at capesim.com (Matthew Overholt)
Date: Thu, 29 Nov 2018 10:18:31 -0500
Subject: [petsc-users] Compile petsc using intel mpi
In-Reply-To: 
References: 
Message-ID: 

Hi Edoardo,

I also have the Intel Parallel Studio XE compilers and MPI installed, and I use them to build PETSc as follows.

# Either add these to your .bashrc or run them on the command line before beginning the PETSc installation
source /opt/intel/parallel_studio_xe_2018/bin/psxevars.sh intel64
export PETSC_DIR=/opt/petsc/petsc-3.10.2
export PETSC_ARCH=arch-intel-opt
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$PETSC_DIR/$PETSC_ARCH/lib

# Next I create a configure script file in the PETSc directory, /opt/petsc/petsc-3.10.2/config-3.10.2opt
echo
echo Optimized Configure with Intel 2018 compilers, MKL PARDISO-CPARDISO, and Intel MPI
echo No need to edit .bashrc except for PETSC_DIR and PETSC_ARCH
echo Run psxevars first
echo
./configure PETSC_ARCH=arch-intel-opt --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort --with-clanguage=cxx --with-debugging=0 COPTFLAGS='-ipo -O3 -xHost' CXXOPTFLAGS='-ipo -O3 -xHost' FOPTFLAGS='-ipo -O3 -xHost' --download-scalapack=yes --with-blas-lapack-dir=/opt/intel/mkl --with-mkl_pardiso-dir=/opt/intel/mkl --with-mkl_cpardiso-dir=/opt/intel/mkl

# Then I run this configure script as a regular user. I make and check it as a regular user as well.
./config-3.10.2opt
make all
make check

# My suggestion would be to completely delete your failed build, extract fresh source files from the downloaded tarfile, then try the above.

Good luck,
Matt Overholt
CapeSym, Inc.
(508) 653-7100 x204
overholt at capesim.com


On Thu, Nov 29, 2018 at 8:19 AM Matthew Knepley via petsc-users <petsc-users at mcs.anl.gov> wrote:

> On Thu, Nov 29, 2018 at 8:17 AM Edoardo alinovi wrote:
>
>> Ok, this makes sense, but the reason to use sudo was this one:
>
> sudo rm RDict.log
>
>    Matt
>
>> [Errno 13] Permission denied: 'RDict.log'
>> File "./config/configure.py", line 391, in petsc_configure
>>   framework = config.framework.Framework(['--configModules=PETSc.Configure','--optionsModule=config.compilerOptions']+sys.argv[1:], loadArgDB = 0)
>> File "/home/edo/software/petsc_intel/config/BuildSystem/config/framework.py", line 80, in __init__
>>   argDB = RDict.RDict(load = loadArgDB)
>> File "/home/edo/software/petsc_intel/config/BuildSystem/RDict.py", line 90, in __init__
>>   self.setupLogFile()
>> File "/home/edo/software/petsc_intel/config/BuildSystem/RDict.py", line 145, in setupLogFile
>>   self.logFile = file(filename, 'a')
>>
>> ------
>>
>> Edoardo Alinovi, Ph.D.
>>
>> DICCA, Scuola Politecnica,
>> Universita' degli Studi di Genova,
>> 1, via Montallegro,
>> 16145 Genova, Italy
>>
>>
>> On Thu, 29 Nov 2018 at 14:10, Matthew Knepley <knepley at gmail.com> wrote:
>>
>>> On Thu, Nov 29, 2018 at 4:23 AM Edoardo alinovi <edoardo.alinovi at gmail.com> wrote:
>>>
>>>> Hello guys,
>>>>
>>>> thank you very much for your suggestions and sorry for getting back to you late. Unfortunately, my attempts to compile with Intel have not been successful so far.
>>>>
>>>> Here is my command:
>>>
>>> Do NOT sudo the configure. This is really dangerous, and as you see 'root' has a different path than you do. Run configure normally and make, and only use 'sudo' for 'make install'. 
>>> >>> Thanks, >>> >>> Matt >>> >>> >>>> sudo ./configure --prefix=/home/edo/software/petsc-3.10.2 >>>> PETSC_ARCH=arch-intel-opt --with-cc=mpiicc --with-cxx=mpiiccp >>>> --with-fc=mpiifort FOPTFLAGS='-O3' COPTFLAGS='-O3' CXXOPTFLAGS='-O3' >>>> --with-blas-lapack-dir=/home/edo/intel/mkl/lib/intel64/ --with-debugging=no >>>> --download-fblaslapack=1 --download-superlu_dist --download-mumps >>>> --download-hypre --download-metis --download-parmetis >>>> >>>> The log file states that: >>>> >>>> ------------------ LOG ------------------ >>>> >>>> TEST checkCCompiler from >>>> config.setCompilers(/home/edo/software/petsc_intel/config/BuildSystem/config/setCompilers.py:587) >>>> TESTING: checkCCompiler from >>>> config.setCompilers(config/BuildSystem/config/setCompilerspy:587) >>>> Locate a functional C compiler >>>> Checking for program /usr/sbin/mpiicc...not found >>>> Checking for program /usr/bin/mpiicc...not found >>>> Checking for program /sbin/mpiicc...not found >>>> Checking for program /bin/mpiicc...not found >>>> Checking for program >>>> /home/edo/software/petsc_intel/lib/petsc/bin/win32fe/mpiicc...not found >>>> >>>> ******************************************************************************* >>>> UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log >>>> for details): >>>> >>>> ------------------------------------------------------------------------------- >>>> C compiler you provided with -with-cc=mpiicc does not work. >>>> >>>> ------------------------------------ >>>> >>>> It seems that petsc is not able to find mpiicc. However, the path to >>>> intel is well defined in my .bashrc and I can easily compile a test file >>>> with those compiler. >>>> >>>> if I put the full path in --with-cc= ... then I get: >>>> >>>> ------------------ LOG ------------------ >>>> >>>> TEST checkCCompiler from >>>> config.setCompilers(/home/edo/software/petsc_intel/config/BuildSystem/config/setCompilers.py:587) >>>> TESTING: checkCCompiler from >>>> config.setCompilers(config/BuildSystem/config/setCompilers.py:587) >>>> Locate a functional C compiler >>>> Checking for program >>>> /home/edo/intel/compilers_and_libraries_2019/linux/mpi/intel64/bin/mpiicc...found >>>> Defined make macro "CC" to >>>> "/home/edo/intel/compilers_and_libraries_2019/linux/mpi/intel64/bin/mpiicc" >>>> Pushing language C >>>> All intermediate test results are stored in >>>> /tmp/petsc-ZLGfap >>>> All intermediate test results are stored in >>>> /tmp/petsc-ZLGfap/config.setCompilers >>>> Executing: >>>> /home/edo/intel/compilers_and_libraries_2019/linux/mpi/intel64/bin/mpiicc >>>> -c -o /tmp/petsc-ZLGfap/config.setCompilers/conftest.o >>>> -I/tmp/petsc-ZLGfap/config.setCompilers >>>> /tmp/petsc-ZLGfap/config.setCompilers/conftest.c >>>> Possible ERROR while running compiler: exit code 32512 >>>> stderr: >>>> /home/edo/intel/compilers_and_libraries_2019/linux/mpi/intel64/bin/mpiicc: >>>> line 557: icc: command not found >>>> Source: >>>> #include "confdefs.h" >>>> #include "conffix.h" >>>> >>>> int main() { >>>> ; >>>> return 0; >>>> } >>>> Popping language C >>>> Error testing C compiler: Cannot compile C with >>>> /home/edo/intel/compilers_and_libraries_2019/linux/mpi/intel64/bin/mpiicc. >>>> Deleting "CC" >>>> >>>> ------------------------------------ >>>> >>>> Do you see any error in my setup? >>>> >>>> Thank you very much! >>>> >>>> ------ >>>> >>>> Edoardo Alinovi, Ph.D. 
>>>> >>>> DICCA, Scuola Politecnica, >>>> Universita' degli Studi di Genova, >>>> 1, via Montallegro, >>>> 16145 Genova, Italy >>>> >>>> >>>> Il giorno mar 27 nov 2018 alle ore 17:16 Matthew Knepley < >>>> knepley at gmail.com> ha scritto: >>>> >>>>> On Tue, Nov 27, 2018 at 6:25 AM Edoardo alinovi via petsc-users < >>>>> petsc-users at mcs.anl.gov> wrote: >>>>> >>>>>> Dear users, >>>>>> >>>>>> I have installed intel parallel studio on my workstation and thus I >>>>>> would like to take advantage of intel compiler. >>>>>> >>>>>> Before messing up my installation, have you got some guidelines to >>>>>> survive at this attempt? I have found here in the mailing list the >>>>>> following instructions: >>>>>> >>>>>> --with-cc=icc --with-fc=ifort --with-mpi-include=/path-to-intel >>>>>> --with-mpi-lib=/path-to-intel >>>>>> >>>>> >>>>> Yes, this looks right. >>>>> >>>>> >>>>>> Are they correct? >>>>>> >>>>>> Also I have an already existing and clean installation of petsc using >>>>>> openmpi. I would like to retain this installtion since it is working very >>>>>> well and switching between the two somehow. Any tips on this? >>>>>> >>>>> >>>>> When you do the Intel configuration, use a different name like >>>>> --PETSC_ARCH=arch-intelps-opt >>>>> >>>>> Thanks, >>>>> >>>>> Matt >>>>> >>>>> >>>>>> I will never stop to say thank you for your precious support! >>>>>> >>>>>> Edoardo >>>>>> >>>>>> ------ >>>>>> >>>>>> Edoardo Alinovi, Ph.D. >>>>>> >>>>>> DICCA, Scuola Politecnica, >>>>>> Universita' degli Studi di Genova, >>>>>> 1, via Montallegro, >>>>>> 16145 Genova, Italy >>>>>> >>>>>> >>>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>>> https://www.cse.buffalo.edu/~knepley/ >>>>> >>>>> >>>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >>> https://www.cse.buffalo.edu/~knepley/ >>> >>> >> > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From hzhang at mcs.anl.gov Thu Nov 29 09:38:49 2018 From: hzhang at mcs.anl.gov (Zhang, Hong) Date: Thu, 29 Nov 2018 15:38:49 +0000 Subject: [petsc-users] A question regarding a potential use case for DMNetwork In-Reply-To: References: Message-ID: A simple example of DMNetwork, an electric circuit: petsc/src/ksp/ksp/examples/tutorials/network/ex1.c Hong On Wed, Nov 28, 2018 at 5:08 PM Matthew Knepley via petsc-users > wrote: On Wed, Nov 28, 2018 at 3:51 PM Markus Lohmayer via petsc-users > wrote: Dear PETSc users and developers, in particular those experienced with the relatively new DMNetwork object, I would like to get some advice on wether it makes sense for my application to be built using this PETSc feature or if I am equally well served if I use plain Vec and Mat objects. In the latter case, you might nevertheless have some good advice for a novice PETSc user or you might know about something that helps me to come up with a well architected solution. 
So the application context is closely linked to LTI state-space models
(and in particular two-port / n-port network models and their
interconnections):

x,t = A x + B u
y   = C x + D u

More specifically, input 'u' and output 'y' are vectors (of the same
length for all components). Different components will have different
dimension of state 'x' (and hence also 'A', 'B', 'C').

These component models then have to be interconnected according to a given
topology:

If the interconnection topology is 1D, then yes DMNetwork is designed to do
this. At the simplest level, you could just use a 1D DMPlex, which is inside
DMNetwork, and hand code the Section (data layout) and residual/Jacobian
assembly. DMNetwork is a higher level around this that lets you put
"components" down on vertices and edges that translate to data layout and
residual evaluation. I confess to not understanding the residual part all
the way. If you start trying to modify an example, we can help you I think.

  Thanks,

     Matt

Some pair of outputs of model 1 feeds into the corresponding pair of
inputs of model 2 (and also the other way round by symmetry), etc.

After most (or even all) of the original inputs 'u_i' / outputs 'y_i' have
been eliminated (based on the given interconnection structure amongst
components), it will be necessary to use an iterative eigenvalue solver to
obtain eigenvectors for some interesting part of the spectrum.

The models will probably not be "very large" in the foreseeable future but
this still doesn't make e.g. MATLAB's control toolbox an option.

I have seen the presentation by Hong Zhang (1) at this year's user meeting
and I have looked at the paper "Scalable Multiphysics Network Simulation
Using PETSc DMNetwork". The use cases and concept of network presented
therein were slightly different from this one of interconnected multi-ports
in state-space formulation.

Thank you very much for your advice and reading this post.

Best,
Markus

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
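[For concreteness, the elimination Markus describes can be written out for
the simplest case. A minimal sketch, assuming just two subsystems with no
feedthrough (D_1 = D_2 = 0) wired in mutual feedback (u_2 = y_1, u_1 = y_2);
the iterative eigenvalue solver would then target the block operator on the
right:

    \frac{d}{dt}
    \begin{pmatrix} x_1 \\ x_2 \end{pmatrix}
    =
    \begin{pmatrix} A_1 & B_1 C_2 \\ B_2 C_1 & A_2 \end{pmatrix}
    \begin{pmatrix} x_1 \\ x_2 \end{pmatrix}

With nonzero D blocks the same elimination goes through whenever
(I - D_2 D_1) is invertible, at the cost of denser coupling blocks.]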
From tempohoper at gmail.com Thu Nov 29 09:45:35 2018
From: tempohoper at gmail.com (Sal Am)
Date: Thu, 29 Nov 2018 15:45:35 +0000
Subject: [petsc-users] Solving complex linear sparse matrix in parallel + external library
In-Reply-To: References: Message-ID:

It seems to be working now,

$ mpiexec -n 4 ./solveSystem -ksp_type gmres -pc_type lu -pc_factor_mat_solver_type mumps -ksp_max_it 1 -ksp_monitor_true_residual
  0 KSP preconditioned resid norm 2.536556771280e+03 true resid norm 4.215007800243e+05 ||r(i)||/||b|| 1.000000000000e+00
  1 KSP preconditioned resid norm 7.020671994801e-11 true resid norm 5.239122706565e-08 ||r(i)||/||b|| 1.242968685909e-13

Thanks

On Wed, Nov 28, 2018 at 3:59 PM Matthew Knepley wrote:

> On Wed, Nov 28, 2018 at 10:26 AM Sal Am wrote:
>
>> Thank you indeed --download-mpich and using PETSC_ARCH/bin/mpiexec seems
>> to work.
>>
>> Now I am wondering about the other problem namely getting the residual,
>> is the residual only computed when using iterative solvers? Cause using
>> richardson or gmres with 1 iteration while using MUMPS prints me nothing.
>
> We had an optimization which did not print the residual with richardson,
> since it is used often inside of other solvers where you do not want this.
> Are you sure that the residual does not print with GMRES?
>
> Thanks,
>
>    Matt
>
>> On Tue, Nov 27, 2018 at 10:53 AM Matthew Knepley wrote:
>>
>>> On Tue, Nov 27, 2018 at 4:29 AM Sal Am wrote:
>>>
>>>>> This can happen if you use an 'mpiexec' which is from a different MPI
>>>>> than the one you compiled PETSc with.
>>>>
>>>> that is odd, I tried removing the --download-mpich from the config and
>>>> tried --with-mpi=1 (which should be default anyways) and retried it with
>>>> --with-mpich=1.
>>>
>>> Your MPI is still broken. Some default MPIs, like those installed with
>>> Apple, are broken. If you are on Apple
>>> I would recommend using --download-mpich, but make sure you use
>>> $PETSC_DIR/$PETSC_ARCH/bin/mpiexec.
>>>
>>> If you reconfigure, you either have to specify a different --PETSC_ARCH,
>>> or delete the $PETSC_DIR/$PETSC_ARCH completely. Otherwise, you can get
>>> bad interaction with the libraries already sitting there.
>>>
>>> Thanks,
>>>
>>>    Matt
>>>
>>>> Current reconfig file:
>>>>   '--download-mpich',
>>>>   '--download-mumps',
>>>>   '--download-scalapack',
>>>>   '--download-superlu_dist',
>>>>   '--with-cc=gcc',
>>>>   '--with-clanguage=cxx',
>>>>   '--with-cxx=g++',
>>>>   '--with-debugging=no',
>>>>   '--with-fc=gfortran',
>>>>   '--with-mpi=1',
>>>>   '--with-mpich=1',
>>>>   '--with-scalar-type=complex',
>>>>   'PETSC_ARCH=linux-opt'
>>>> Still does not work though, I tried executing ex11 in
>>>> ksp/ksp/example/tutorial/ex11 which should solve a linear system in
>>>> parallel by running mpiexec -n 2 but it prints out
>>>> Mat Object: 1 MPI processes
>>>> ...
>>>> ..
>>>> twice.
>>>> What am I missing?
>>>>
>>>>> Why would you want a minimum iterations? Why not set a tolerance and a
>>>>> max? What would you achieve with a minimum?
>>>>
>>>> I thought PETSc might not be iterating far enough, but after having had
>>>> a look at KSPSetTolerances it makes more sense.
>>>>
>>>>> Instead of 'preonly', where you do not want a residual, use
>>>>> 'richardson' or 'gmres' with a max of 1 iterate (-ksp_max_it 1)
>>>>
>>>> So I tried that by executing: mpiexec -n 4 ./test -ksp_type richardson
>>>> -pc_type lu -pc_factor_mat_solver_type mumps -ksp_max_it 1
>>>> on the command line. However, nothing prints out on the terminal (aside
>>>> from PetscPrintf if I have them enabled).
>>>>
>>>> Thank you.
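[In case it helps other readers: KSPSetTolerances is the knob being referred
to above. A minimal sketch with illustrative values only, not
recommendations; per Matt's comment, a tolerance plus a maximum iteration
count is the intended control, rather than a minimum:

    /* rtol 1e-9, keep default abstol and dtol, stop after 500 iterations */
    ierr = KSPSetTolerances(ksp,1.e-9,PETSC_DEFAULT,PETSC_DEFAULT,500);CHKERRQ(ierr);
]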
>>>>
>>>> On Fri, Nov 16, 2018 at 12:00 PM Matthew Knepley wrote:
>>>>
>>>>> On Fri, Nov 16, 2018 at 4:23 AM Sal Am via petsc-users <
>>>>> petsc-users at mcs.anl.gov> wrote:
>>>>>
>>>>>> Hi,
>>>>>>
>>>>>> I have a few questions:
>>>>>>
>>>>>> 1. The following issue/misunderstanding:
>>>>>> My code reads in two files, one PETSc vector and one PETSc matrix (b
>>>>>> and A from Ax=b, size ~65000x65000), and then calls the KSP solver to
>>>>>> solve it by running the following in the terminal:
>>>>>>
>>>>>> mpiexec -n 2 ./SolveSys -ksp_type preonly -pc_type lu
>>>>>> -pc_factor_mat_solver mumps
>>>>>>
>>>>>> Now mumps is supposed to work in parallel and complex, but the code
>>>>>> is not solved in parallel it seems. It just prints the result twice.
>>>>>> Adding -log_view gives me
>>>>>>
>>>>>> "./SolveSys on a linux-opt named F8434 with 1 processor..." printed twice.
>>>>>
>>>>> This can happen if you use an 'mpiexec' which is from a different MPI
>>>>> than the one you compiled PETSc with.
>>>>>
>>>>>> 2. Using iterative solvers, I am having difficulty getting
>>>>>> convergence. I found that there is a way to set the maximum number of
>>>>>> iterations, but is there a minimum I can increase?
>>>>>
>>>>> Why would you want a minimum iterations? Why not set a tolerance and a
>>>>> max? What would you achieve with a minimum?
>>>>>
>>>>>> 3. The residual is not computed when using direct external solvers.
>>>>>> What is the proper PETSc way of doing this?
>>>>>
>>>>> Instead of 'preonly', where you do not want a residual, use
>>>>> 'richardson' or 'gmres' with a max of 1 iterate (-ksp_max_it 1)
>>>>>
>>>>> Thanks,
>>>>>
>>>>>    Matt
>>>>>
>>>>>> -----------------------------------------------------The code---------------------------------------
>>>>>> #include <petscksp.h>
>>>>>> #include <petscviewer.h>  /* headers restored; the originals were lost when the post was scrubbed */
>>>>>> static char help[] = "";  /* PetscInitialize below references 'help'; declaration assumed lost in the same way */
>>>>>> int main(int argc,char **args)
>>>>>> {
>>>>>>   Vec            x,b;      /* approx solution, RHS */
>>>>>>   Mat            A;        /* linear system matrix */
>>>>>>   KSP            ksp;      /* linear solver context */
>>>>>>   PetscReal      norm;     /* norm of solution error */
>>>>>>   PC             pc;
>>>>>>   PetscMPIInt    rank, size;
>>>>>>   PetscViewer    viewer;
>>>>>>   PetscInt       its, i;
>>>>>>   PetscErrorCode ierr;
>>>>>>   PetscScalar    *xa;
>>>>>>   PetscBool      flg = PETSC_FALSE;
>>>>>>
>>>>>>   ierr = PetscInitialize(&argc,&args,(char*)0,help);if (ierr) return ierr;
>>>>>>   MPI_Comm_rank(PETSC_COMM_WORLD,&rank);
>>>>>>   MPI_Comm_size(PETSC_COMM_WORLD,&size);
>>>>>>
>>>>>> #if !defined(PETSC_USE_COMPLEX)
>>>>>>   SETERRQ(PETSC_COMM_WORLD,1,"This example requires complex numbers");
>>>>>> #endif
>>>>>>   /* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
>>>>>>      Compute the matrix and right-hand-side vector that define
>>>>>>      the linear system, Ax = b.
>>>>>>      - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - */
>>>>>>   ierr = PetscPrintf(PETSC_COMM_WORLD,"reading vector in binary from Vector_b.dat ...\n");CHKERRQ(ierr);
>>>>>>   ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"../../python/petscpy/Vector_b.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr);
>>>>>>   ierr = VecCreate(PETSC_COMM_WORLD, &b);CHKERRQ(ierr);
>>>>>>   ierr = VecLoad(b,viewer); CHKERRQ(ierr);
>>>>>>
>>>>>>   ierr = PetscPrintf(PETSC_COMM_WORLD,"reading matrix in binary from Matrix_A.dat ...\n");CHKERRQ(ierr);
>>>>>>   ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"../../python/petscpy/Matrix_A.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr);
>>>>>>   ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
>>>>>>   ierr = MatLoad(A,viewer);CHKERRQ(ierr);
>>>>>>   ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
>>>>>>
>>>>>>   ierr = VecDuplicate(b,&x);CHKERRQ(ierr);
>>>>>>
>>>>>>   /* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
>>>>>>      Create the linear solver and set various options
>>>>>>      - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - */
>>>>>>   PetscPrintf(PETSC_COMM_WORLD, "Creating KSP\n");
>>>>>>   ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
>>>>>>
>>>>>>   PetscPrintf(PETSC_COMM_WORLD, "KSP Operators\n");
>>>>>>   ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);
>>>>>>
>>>>>>   ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
>>>>>>   ierr = PCSetType(pc, PCLU);CHKERRQ(ierr);
>>>>>>   ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
>>>>>>
>>>>>>   /* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
>>>>>>      Solve the linear system
>>>>>>      - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - */
>>>>>>   ierr = KSPSetUp(ksp);CHKERRQ(ierr);
>>>>>>   ierr = KSPSetUpOnBlocks(ksp);CHKERRQ(ierr);
>>>>>>   ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
>>>>>>   PetscPrintf(PETSC_COMM_WORLD, "Solved");
>>>>>>
>>>>>>   /*
>>>>>>     Free work space.
>>>>>>     All PETSc objects should be destroyed when they are no longer needed.
>>>>>>   */
>>>>>>   ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
>>>>>>   ierr = VecDestroy(&x);CHKERRQ(ierr);
>>>>>>   ierr = VecDestroy(&b);CHKERRQ(ierr);
>>>>>>   ierr = MatDestroy(&A);CHKERRQ(ierr);
>>>>>>   ierr = PetscFinalize();
>>>>>>   return ierr;
>>>>>> }
>>>>>>
>>>>>> Kind regards,
>>>>>> Sal
>>>>>
>>>>> --
>>>>> What most experimenters take for granted before they begin their
>>>>> experiments is infinitely more interesting than any results to which their
>>>>> experiments lead.
>>>>> -- Norbert Wiener
>>>>>
>>>>> https://www.cse.buffalo.edu/~knepley/
>>>
>>> --
>>> What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>> -- Norbert Wiener
>>>
>>> https://www.cse.buffalo.edu/~knepley/
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/

From tempohoper at gmail.com Thu Nov 29 09:49:01 2018
From: tempohoper at gmail.com (Sal Am)
Date: Thu, 29 Nov 2018 15:49:01 +0000
Subject: [petsc-users] RAW binary write
Message-ID:

Is there a way to write the solution from the system Ax=b in raw binary
instead of PETSc binary format?

Currently I am doing:
ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "../../python/petscpy/Vector_x_petsc.dat", FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
ierr = VecView(x,viewer);CHKERRQ(ierr);
ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

And then use PetscBinaryIO to read it back and save it using
write('newformat', 'wb') to get to raw... however this approach is not good
it seems as there are some troubles with little/big endian when using the
resulting converted file on other systems for post-processing.

Thanks,

From knepley at gmail.com Thu Nov 29 10:39:13 2018
From: knepley at gmail.com (Matthew Knepley)
Date: Thu, 29 Nov 2018 11:39:13 -0500
Subject: [petsc-users] RAW binary write
In-Reply-To: References: Message-ID:

On Thu, Nov 29, 2018 at 10:50 AM Sal Am via petsc-users <
petsc-users at mcs.anl.gov> wrote:

> Is there a way to write the solution from the system Ax=b in raw binary
> instead of PETSc binary format?

What is "raw binary"? You have to have some format.

   Matt

> Currently I am doing:
> ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "../../python/petscpy/Vector_x_petsc.dat", FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
> ierr = VecView(x,viewer);CHKERRQ(ierr);
> ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
>
> And then use PetscBinaryIO to read it back and save it using
> write('newformat', 'wb') to get to raw... however this approach is not good
> it seems as there are some troubles with little/big endian when using the
> resulting converted file on other systems for post-processing.
>
> Thanks,

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
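[One possible answer, sketched here rather than taken from the thread:
choose the byte layout explicitly and write it yourself. This assumes a
vector small enough to gather onto rank 0 with VecScatterCreateToZero, and
it writes PetscScalar values in native byte order, so the endianness
question is pushed to the reading side. WriteVecRaw is a made-up helper
name:

    #include <petscvec.h>

    /* Sketch: gather a parallel Vec onto rank 0 and fwrite() the values
       with no header. Note PetscScalar is complex in a complex build. */
    PetscErrorCode WriteVecRaw(Vec x, const char *filename)
    {
      VecScatter         ctx;
      Vec                xseq;
      const PetscScalar *a;
      PetscInt           n;
      PetscMPIInt        rank;
      PetscErrorCode     ierr;

      PetscFunctionBeginUser;
      ierr = MPI_Comm_rank(PetscObjectComm((PetscObject)x),&rank);CHKERRQ(ierr);
      ierr = VecScatterCreateToZero(x,&ctx,&xseq);CHKERRQ(ierr);
      ierr = VecScatterBegin(ctx,x,xseq,INSERT_VALUES,SCATTER_FORWARD);CHKERRQ(ierr);
      ierr = VecScatterEnd(ctx,x,xseq,INSERT_VALUES,SCATTER_FORWARD);CHKERRQ(ierr);
      if (!rank) {
        FILE *fp = fopen(filename,"wb");
        if (!fp) SETERRQ(PETSC_COMM_SELF,PETSC_ERR_FILE_OPEN,"Cannot open output file");
        ierr = VecGetLocalSize(xseq,&n);CHKERRQ(ierr);
        ierr = VecGetArrayRead(xseq,&a);CHKERRQ(ierr);
        fwrite(a,sizeof(PetscScalar),(size_t)n,fp);  /* native byte order */
        ierr = VecRestoreArrayRead(xseq,&a);CHKERRQ(ierr);
        fclose(fp);
      }
      ierr = VecScatterDestroy(&ctx);CHKERRQ(ierr);
      ierr = VecDestroy(&xseq);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }

Staying inside PETSc, PetscViewerBinarySetSkipHeader(viewer,PETSC_TRUE) is
another route to a headerless file written by VecView.]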
From balay at mcs.anl.gov Thu Nov 29 11:24:12 2018
From: balay at mcs.anl.gov (Balay, Satish)
Date: Thu, 29 Nov 2018 17:24:12 +0000
Subject: [petsc-users] SPAM on mailing lists
Message-ID:

With the increase in SPAM on the mailing lists (petsc-dev and petsc-users)
- I've disabled posting by non-subscribers [which was previously enabled
for user convenience].

My current plan is to use a white-list for valid non-subscribers. This way
- valid non-subscribers can continue participating in the mailing lists.

Satish

From danyang.su at gmail.com Thu Nov 29 12:14:37 2018
From: danyang.su at gmail.com (Danyang Su)
Date: Thu, 29 Nov 2018 10:14:37 -0800
Subject: [petsc-users] Error: DM global to natural SF was not created when DMSetUseNatural has already been called
In-Reply-To: References: <99807fb6-29a1-6396-6ee6-ee7401e2802f@gmail.com>
Message-ID: <1b41b57d-18c6-de86-97e3-cf41af4b1d2e@gmail.com>

Hi Josh,

Sorry to bother you again. I followed your suggestion but still cannot
figure out what is wrong in the following code. The global vector
vec_global is non-zero. The migration SF looks fine. But the vector after
calling DMPlexGlobalToNaturalEnd is always zero. Attached is the mesh I
use and screen output for the following code section.

      !c check vec_global, all vec_global is non-zero
      if (rank == 0) then
        write(*,*) "vector view - petsc order"
      end if
      CALL VecView(vec_global,PETSC_VIEWER_STDOUT_WORLD,ierr)
      CHKERRQ(ierr)

      !c check migration sf, looks correct.
      if (rank == 0) then
        write(*,*) "check migration SF"
      end if
      call DMPlexGetMigrationSF(dmda_flow%da,sf,ierr)
      CHKERRQ(ierr)
      call PetscSFView(sf,PETSC_VIEWER_STDOUT_WORLD,ierr)
      CHKERRQ(ierr)

      !c global to natural ordering
      call DMPlexGlobalToNaturalBegin(dmda_flow%da,vec_global,         &
                                      vec_natural,ierr)
      CHKERRQ(ierr)
      call DMPlexGlobalToNaturalEnd(dmda_flow%da,vec_global,           &
                                    vec_natural,ierr)
      CHKERRQ(ierr)

      !c check vec_natural, vec_natural is always zero
      if (rank == 0) then
        write(*,*) "vector view - natural order"
      end if
      CALL VecView(vec_natural,PETSC_VIEWER_STDOUT_WORLD,ierr)
      CHKERRQ(ierr)

Thanks,

Danyang

On 2018-11-28 9:58 p.m., Josh L wrote:
> you don't have to call DMPlexGetGlobalToNaturalSF
>
> DMPlexDistribute gives you one petscSF, and you will just use this
> petscSF in DMPlexSetMigrationSF
>
> it will look like this
>
>   If (nsize /= 1) then
>     CALL DMSetUseNatural(dm,PETSC_TRUE,ierr)
>     CALL DMPlexDistribute(dm,0,pointSF,dmmpi,ierr);CHKERRQ(ierr)
>     CALL DMPlexSetMigrationSF(dmmpi,pointSF,ierr)
>     CALL DMDestroy(dm,ierr)
>     dm=dmmpi
>     If (rank == 0) write(*,'(///,A)') "Distributed mesh"
>     CALL DMView(dm,PETSC_VIEWER_STDOUT_WORLD,ierr)
>   Endif
>
> then you use DMPlexGlobalToNatural or NaturalToGlobal
>
> Danyang Su wrote on Wed, Nov 28, 2018 at 11:11 PM:
>
> Hi Josh,
>
> Thanks. DMPlexSetMigrationSF requires naturalSF as the input
> parameter. Is there any easy way to get naturalSF after
> DMPlexDistribute?
>
> I tried to use
>
>       call DMPlexGetGlobalToNaturalSF(dmda_flow%da,naturalSF,ierr)
>       CHKERRQ(ierr)
>
>       call DMPlexSetMigrationSF(dmda_flow%da,naturalSF,ierr)
>       CHKERRQ(ierr)
>
> after DMPlexDistribute but this does not solve the problem.
>
> I am not familiar with graph theory, so I am a little lost here.
> Would you please give me a little more detail.
>
> Thanks,
>
> Danyang
>
> On 18-11-28 07:52 PM, Josh L wrote:
>> Hi Danyang,
>>
>> You have to call DMPlexSetMigrationSF after DMPlexDistribute.
>> Moreover, if you only run the code with 1 processor, then
>> natural ordering is global ordering and the pointSF is not created.
>>
>> Thanks,
>> Josh
>>
>> petsc-users-request at mcs.anl.gov wrote on Wed, Nov 28, 2018 at 9:17 PM:
>>
>> Send petsc-users mailing list submissions to
>> petsc-users at mcs.anl.gov
>>
>> To subscribe or unsubscribe via the World Wide Web, visit
>> https://lists.mcs.anl.gov/mailman/listinfo/petsc-users
>> or, via email, send a message with subject or body 'help' to
>> petsc-users-request at mcs.anl.gov
>>
>> You can reach the person managing the list at
>> petsc-users-owner at mcs.anl.gov
>>
>> When replying, please edit your Subject line so it is more specific
>> than "Re: Contents of petsc-users digest..."
>>
>> Today's Topics:
>>
>>    1. Re: Do we support to use NORM_X instead of NORM_2 to check
>>       the convergence of KSP and SNES? (Smith, Barry F.)
>>    2. Re: A question regarding a potential use case for DMNetwork
>>       (Matthew Knepley)
>>    3. Error: DM global to natural SF was not created when
>>       DMSetUseNatural has already been called (Danyang Su)
>>
>> ----------------------------------------------------------------------
>>
>> Message: 1
>> Date: Wed, 28 Nov 2018 22:03:26 +0000
>> From: "Smith, Barry F."
>> To: Fande Kong
>> Cc: PETSc users list
>> Subject: Re: [petsc-users] Do we support to use NORM_X instead of
>>         NORM_2 to check the convergence of KSP and SNES?
>> Message-ID: <9A9B128F-AA95-40C7-BB28-31F687E9E65A at anl.gov>
>> Content-Type: text/plain; charset="us-ascii"
>>
>> > On Nov 28, 2018, at 3:20 PM, Fande Kong via petsc-users wrote:
>> >
>> > Hi Developers,
>> >
>> > I just checked into the SNES and KSP code. We always hard code the
>> > Vec Norm as NORM_2 when computing the linear and nonlinear residuals.
>> >
>> > Does this mean we have to use norm_2 to check the convergence for
>> > both SNES and KSP?
>>
>>    No, you can in theory use some other norm in a custom convergence
>> test. BUT you get the NORM_2 essentially "for free" while using some
>> other norm requires you to compute that norm; for SNES it is no big
>> deal, just the cost of a VecNorm(NORM_INFINITY) for example. But for
>> GMRES it is very expensive (one must compute the current solution then
>> compute the residual then compute the residual's norm; because GMRES
>> uses a recursive formula for the NORM_2 and does not actually compute
>> the solution or the residual at each iteration, only at the end).
>>
>>    Barry
>>
>> >
>> > Fande,
>>
>> ------------------------------
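[As an aside on Barry's point: a minimal sketch of such a custom test,
assuming an absolute infinity-norm criterion with an illustrative 1e-8
threshold. KSPBuildResidual assembles the residual at the current iterate,
which is exactly the per-iteration cost Barry warns about for GMRES:

    #include <petscksp.h>

    /* Sketch: declare convergence when ||r||_inf < 1e-8 (illustrative).
       The Vec created by KSPBuildResidual(.,NULL,NULL,.) is ours to destroy. */
    static PetscErrorCode MyConvergedInf(KSP ksp,PetscInt it,PetscReal rnorm2,
                                         KSPConvergedReason *reason,void *ctx)
    {
      Vec            r;
      PetscReal      rinf;
      PetscErrorCode ierr;

      PetscFunctionBeginUser;
      ierr = KSPBuildResidual(ksp,NULL,NULL,&r);CHKERRQ(ierr);
      ierr = VecNorm(r,NORM_INFINITY,&rinf);CHKERRQ(ierr);
      ierr = VecDestroy(&r);CHKERRQ(ierr);
      *reason = (rinf < 1.e-8) ? KSP_CONVERGED_ATOL : KSP_CONVERGED_ITERATING;
      PetscFunctionReturn(0);
    }

It would be installed with KSPSetConvergenceTest(ksp,MyConvergedInf,NULL,NULL);]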
>> Message: 2
>> Date: Wed, 28 Nov 2018 18:06:45 -0500
>> From: Matthew Knepley
>> To: markus.lohmayer at fau.de
>> Cc: PETSc
>> Subject: Re: [petsc-users] A question regarding a potential use case
>>         for DMNetwork
>> Message-ID:
>> Content-Type: text/plain; charset="utf-8"
>>
>> On Wed, Nov 28, 2018 at 3:51 PM Markus Lohmayer via petsc-users <
>> petsc-users at mcs.anl.gov> wrote:
>>
>> > Dear PETSc users and developers,
>> >
>> > in particular those experienced with the relatively new DMNetwork object,
>> > I would like to get some advice on whether it makes sense for my
>> > application to be built using this PETSc feature
>> > or if I am equally well served if I use plain Vec and Mat objects.
>> >
>> > In the latter case, you might nevertheless have some good advice for a
>> > novice PETSc user
>> > or you might know about something that helps me to come up with a well
>> > architected solution.
>> >
>> > So the application context is closely linked to LTI state-space models
>> > (and in particular two-port / n-port network models and their
>> > interconnections):
>> > x,t = A x + B u
>> > y   = C x + D u
>> >
>> > More specifically, input 'u' and output 'y' are vectors (of the same
>> > length for all components).
>> > Different components will have different dimension of state 'x' (and
>> > hence also 'A', 'B', 'C').
>> >
>> > These component models then have to be interconnected according to a
>> > given topology:
>>
>> If the interconnection topology is 1D, then yes DMNetwork is designed to
>> do this. At the simplest level, you
>> could just use a 1D DMPlex, which is inside DMNetwork, and hand code the
>> Section (data layout) and residual/Jacobian
>> assembly. DMNetwork is a higher level around this that lets you put
>> "components" down on vertices and edges
>> that translate to data layout and residual evaluation. I confess to not
>> understanding the residual part all the way.
>> If you start trying to modify an example, we can help you I think.
>>
>>   Thanks,
>>
>>      Matt
>>
>> > Some pair of outputs of model 1 feeds into the corresponding pair of
>> > inputs of model 2 (and also the other way round by symmetry), etc.
>> >
>> > After most (or even all) of the original inputs 'u_i' / outputs 'y_i'
>> > have been eliminated (based on the given interconnection structure
>> > amongst components),
>> > it will be necessary to use an iterative eigenvalue solver to obtain
>> > eigenvectors for some interesting part of the spectrum.
>> >
>> > The models will probably not be "very large" in the foreseeable future
>> > but this still doesn't make e.g. MATLAB's control toolbox an option.
>> >
>> > I have seen the presentation by Hong Zhang (1) at this year's user
>> > meeting and I have looked at the paper "Scalable Multiphysics Network
>> > Simulation Using PETSc DMNetwork".
>> > The use cases and concept of network presented therein were slightly
>> > different from this one of interconnected multi-ports in state-space
>> > formulation.
>> >
>> > Thank you very much for your advice and reading this post.
>> >
>> > Best,
>> > Markus
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>>
>> ------------------------------
>>
>> Message: 3
>> Date: Wed, 28 Nov 2018 17:56:56 -0800
>> From: Danyang Su
>> To: PETSc
>> Subject: [petsc-users] Error: DM global to natural SF was not created
>>         when DMSetUseNatural has already been called
>> Message-ID:
>> Content-Type: text/plain; charset=utf-8; format=flowed
>>
>> Dear All,
>>
>> I got the following error when using the DMPlexGlobalToNatural functions
>> using 1 processor.
>>
>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>> [0]PETSC ERROR: Object is in wrong state
>> [0]PETSC ERROR: DM global to natural SF was not created.
>> You must call DMSetUseNatural() before DMPlexDistribute().
>>
>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
>> for trouble shooting.
>> [0]PETSC ERROR: Petsc Release Version 3.10.2, Oct, 09, 2018
>>
>> The same code does not return an error when using more than 2 processors;
>> however, the vec_natural is always zero after calling
>> DMPlexGlobalToNaturalEnd.
>>
>> DMSetUseNatural() has already been used before calling DMPlexDistribute.
>> The code section looks like below
>>
>>       if (rank == 0) then
>>
>>         call DMPlexCreateFromCellList(Petsc_Comm_World,ndim,0,0,       &
>>              num_nodes_per_cell,                                       &
>>              Petsc_False,dmplex_cells,ndim,                            &  !use Petsc_True to create intermediate mesh entities (faces, edges)
>>              dmplex_verts,dmda_flow%da,ierr)
>>         CHKERRQ(ierr)
>>
>>       end if
>>
>>       !c Set the flag for creating a mapping to the natural order on
>>       !c distribution
>>       call DMSetUseNatural(dmda_flow%da,PETSC_TRUE,ierr)
>>       CHKERRQ(ierr)
>>
>>       !c distribute mesh over processes
>>       call DMPlexDistribute(dmda_flow%da,stencil_width,                &
>>                             PETSC_NULL_SF,distributedMesh,ierr)
>>       CHKERRQ(ierr)
>>
>>       !c destroy original global mesh after distribution
>>       if (distributedMesh /= PETSC_NULL_DM) then
>>         call DMDestroy(dmda_flow%da,ierr)
>>         CHKERRQ(ierr)
>>         !c set the global mesh as distributed mesh
>>         dmda_flow%da = distributedMesh
>>       end if
>>
>>       ...
>>
>>         call DMPlexCreateSection(dmda_flow%da,dmda_flow%dim,           &
>>                                  numFields,pNumComp,pNumDof,           &
>>                                  numBC,pBcField,                       &
>>                                  pBcCompIS,pBcPointIS,                 &
>>                                  PETSC_NULL_IS,                        &
>>                                  section,ierr)
>>         CHKERRQ(ierr)
>>
>>         call PetscSectionSetFieldName(section,0,'flow',ierr)
>>         CHKERRQ(ierr)
>>
>>         call DMSetSection(dmda_flow%da,section,ierr)
>>         CHKERRQ(ierr)
>>
>>         call PetscSectionDestroy(section,ierr)
>>         CHKERRQ(ierr)
>>
>>         call DMSetUp(dmda_flow%da,ierr)
>>         CHKERRQ(ierr)
>>
>>       ...
>>
>>       !c global - natural order
>>
>>       call DMCreateLocalVector(dmda_flow%da,vec_loc,ierr)
>>       CHKERRQ(ierr)
>>
>>       call DMCreateGlobalVector(dmda_flow%da,vec_global,ierr)
>>       CHKERRQ(ierr)
>>
>>       call DMCreateGlobalVector(dmda_flow%da,vec_natural,ierr)
>>       CHKERRQ(ierr)
>>
>>       !c zero entries
>>       call VecZeroEntries(vec_loc,ierr)
>>       CHKERRQ(ierr)
>>
>>       !Get a pointer to vector data when you need access to the array
>>       call VecGetArrayF90(vec_loc,vecpointer,ierr)
>>       CHKERRQ(ierr)
>>
>>       do inode = 1, num_nodes
>>         vecpointer(inode) = node_idx_lg2pg(inode) !vector value using PETSc global order, negative ghost index has been reversed
>>       end do
>>
>>       !Restore the vector when you no longer need access to the array
>>       call VecRestoreArrayF90(vec_loc,vecpointer,ierr)
>>       CHKERRQ(ierr)
>>
>>       !Insert values into global vector
>>       call DMLocalToGlobalBegin(dmda_flow%da,vec_loc,INSERT_VALUES,    &
>>                                 vec_global,ierr)
>>       CHKERRQ(ierr)
>>
>>       call DMLocalToGlobalEnd(dmda_flow%da,vec_loc,INSERT_VALUES,      &
>>                               vec_global,ierr)
>>       CHKERRQ(ierr)
>>
>>       !c global to natural ordering
>>       call DMPlexGlobalToNaturalBegin(dmda_flow%da,vec_global,         &
>>                                       vec_natural,ierr)
>>       CHKERRQ(ierr)
>>
>>       call DMPlexGlobalToNaturalEnd(dmda_flow%da,vec_global,           &
>>                                     vec_natural,ierr)
>>       CHKERRQ(ierr)
>>
>> Is there anything missing in the code such that DMPlexGlobalToNatural...
>> does not work properly?
>>
>> Thanks,
>>
>> Danyang
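[For readers following this thread, the sequence Josh describes condenses to
the sketch below, written in C rather than the thread's Fortran; the names
dmDist and migrationSF are placeholders, error checking is trimmed, and the
destroy of the PetscSF is omitted since its ownership details vary across
PETSc versions:

    DM      dm, dmDist;
    PetscSF migrationSF;
    Vec     vglobal, vnatural;

    DMSetUseNatural(dm, PETSC_TRUE);                 /* before distribution */
    DMPlexDistribute(dm, 0, &migrationSF, &dmDist);  /* keep the migration SF */
    if (dmDist) {
      DMPlexSetMigrationSF(dmDist, migrationSF);     /* enables global-to-natural maps */
      DMDestroy(&dm);
      dm = dmDist;
    }
    /* ... set the PetscSection on dm, create vglobal and vnatural ... */
    DMPlexGlobalToNaturalBegin(dm, vglobal, vnatural);
    DMPlexGlobalToNaturalEnd(dm, vglobal, vnatural);

On a single process DMPlexDistribute returns a NULL parallel DM, which
matches Josh's remark that the natural and global orderings coincide there.]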
http://cgiouw.ru/?cgiouw cgiouw >> https://cdgijlnw.biz/?cdgijlnw cdgijlnw >> https://gmuvw.biz/?gmuvw gmuvw >> https://abdefgiu.info/?abdefgiu abdefgiu >> https://gkmqwy.info/?gkmqwy gkmqwy >> http://bfgkqstw.com/?bfgkqstw bfgkqstw >> https://bfgmnovx.net/?bfgmnovx bfgmnovx >> https://bkmnrsy.com/?bkmnrsy bkmnrsy http://chiuwx.ua/?chiuwx >> chiuwx http://chlvw.ua/?chlvw chlvw >> https://acfgilp.ru/?acfgilp acfgilp http://bghosv.com/?bghosv >> bghosv https://befgkpst.net/?befgkpst befgkpst >> http://jklmnwz.net/?jklmnwz jklmnwz https://dktxz.com/?dktxz >> dktxz http://aeknuwyz.ru/?aeknuwyz aeknuwyz >> https://aqtuv.ua/?aqtuv aqtuv http://bfjorw.info/?bfjorw >> bfjorw http://cfijuz.biz/?cfijuz cfijuz >> https://efjlnuv.net/?efjlnuv efjlnuv >> https://hkruvy.net/?hkruvy hkruvy >> http://defnswxy.biz/?defnswxy defnswxy >> https://fgjmqxy.biz/?fgjmqxy fgjmqxy >> https://deflmtu.ru/?deflmtu deflmtu https://cgmnr.info/?cgmnr >> cgmnr https://dgilnsvy.ua/?dgilnsvy dgilnsvy htt >> ?ps://chjmosxy.biz/?chjmosxy >> chjmosxy https://abcdhln.net/?abcdhln abcdhln >> https://elstuyz.biz/?elstuyz elstuyz >> https://bjoptw.com/?bjoptw bjoptw https://bcdgquv.ru/?bcdgquv >> bcdgquv http://afiklst.com/?afiklst afiklst >> https://agnvwz.info/?agnvwz agnvwz >> http://bcfgkrvx.biz/?bcfgkrvx bcfgkrvx >> https://bdjmnuz.biz/?bdjmnuz bdjmnuz >> https://emqrwz.com/?emqrwz emqrwz https://egklqt.biz/?egklqt >> egklqt http://bimqr.com/?bimqr bimqr >> https://dhklnsv.net/?dhklnsv dhklnsv https://dvwxz.com/?dvwxz >> dvwxz https://dfghimu.com/?dfghimu dfghimu >> https://fostx.ru/?fostx fostx https://fjklnqux.ua/?fjklnqux >> fjklnqux https://cfhijnqu.info/?cfhijnqu cfhijnqu >> http://nptvy.ua/?nptvy nptvy http://bclnu.com/?bclnu bclnu >> http://eflntvyz.ru/?eflntvyz eflntvyz >> https://abcdpsx.ru/?abcdpsx abcdpsx >> https://cfipqx.info/?cfipqx cfipqx >> http://adforst.biz/?adforst adforst >> https://bfgmpqtv.biz/?bfgmpqtv bfgmpqtv >> http://cfkmnorv.ua/?cfkmnorv cfkmnorv >> http://dhjkuv.info/?dhjkuv dhjkuv >> https://bcejlsyz.info/?bcejlsyz >> ?bcejlsyz https://efinrsy.ua/?efinrsy efinrsy >> https://kmosw.net/?kmosw kmosw http://jmoxy.net/?jmoxy jmoxy >> http://agjloqy.biz/?agjloqy agjloqy >> https://dglmnpqy.com/?dglmnpqy dglmnpqy >> https://dijnqrsw.ua/?dijnqrsw dijnqrsw >> http://ijmnr.biz/?ijmnr ijmnr https://gjouwxy.ru/?gjouwxy >> gjouwxy https://klpuz.biz/?klpuz klpuz >> http://hijoux.net/?hijoux hijoux http://bikmn.net/?bikmn >> bikmn http://abcovwy.ua/?abcovwy abcovwy >> https://adfgioqx.ua/?adfgioqx adfgioqx >> http://beijknpq.info/?beijknpq beijknpq >> https://equvx.com/?equvx equvx https://cgmpr.net/?cgmpr cgmpr >> http://dfknor.com/?dfknor dfknor https://gopsvyz.ru/?gopsvyz >> gopsvyz http://chijkqsu.net/?chijkqsu chijkqsu >> https://bfmtx.biz/?bfmtx bfmtx http://cgijqrux.ua/?cgijqrux >> cgijqrux https://fklnovy.biz/?fklnovy fklnovy >> http://chkmosux.net/?chkmosux chkmosux >> https://cenprvw.info/?cenprvw cenprvw >> https://cfuxyz.ua/?cfuxyz cfuxyz https://bdgopy.ua/?bdgopy >> bdgopy https://ampwx.info/?ampwx ampwx >> http://cefmoqtx.info/?cefmoqtx cefmoqtx htt >> ?ps://bcdflny.ua/?bcdflny >> bcdflny https://abqtux.ua/?abqtux abqtux >> https://bpstw.com/?bpstw bpstw http://cegmoqry.info/?cegmoqry >> cegmoqry http://aeghjlw.net/?aeghjlw aeghjlw >> https://egjswz.ru/?egjswz egjswz https://agiluv.info/?agiluv >> agiluv https://befimsu.biz/?befimsu befimsu >> https://abcdeiry.info/?abcdeiry abcdeiry >> http://bcqrxz.info/?bcqrxz bcqrxz >> https://eilntvxz.net/?eilntvxz eilntvxz >> https://abcjpt.ua/?abcjpt abcjpt 
https://adglox.ru/?adglox >> adglox https://adjnptw.ru/?adjnptw adjnptw >> http://aglrxz.biz/?aglrxz aglrxz https://abeqru.info/?abeqru >> abeqru https://cnoprsw.biz/?cnoprsw cnoprsw >> http://akoqx.com/?akoqx akoqx https://ijlmnptz.net/?ijlmnptz >> ijlmnptz https://ghimnuyz.com/?ghimnuyz ghimnuyz >> https://ghprs.ua/?ghprs ghprs https://lmntwxz.biz/?lmntwxz >> lmntwxz https://blopqxz.ua/?blopqxz blopqxz >> https://cqrxz.ua/?cqrxz cqrxz http://befmvwz.net/?befmvwz >> befmvwz https://abjlmnpr.com/?abjlmnpr abjlmnpr >> https://filotz.net/?filotz filotz http://hinsw.biz/?hinsw >> hinsw http:/ >> ?/abfjklsy.net/?abfjklsy >> abfjklsy https://bhimnrwx.biz/?bhimnrwx bhimnrwx >> http://dejnotz.ru/?dejnotz dejnotz >> https://bcflsxz.info/?bcflsxz bcflsxz >> http://hklmsux.net/?hklmsux hklmsux >> https://ahknouz.ru/?ahknouz ahknouz https://cfknxy.ru/?cfknxy >> cfknxy http://acfntvx.net/?acfntvx acfntvx >> https://cdfksuvy.info/?cdfksuvy cdfksuvy >> https://abjkorx.ru/?abjkorx abjkorx >> https://afiklqvw.com/?afiklqvw afiklqvw >> http://adnovx.biz/?adnovx adnovx >> https://fglstyz.info/?fglstyz fglstyz >> https://ghikruy.biz/?ghikruy ghikruy >> http://ghnopsvz.biz/?ghnopsvz ghnopsvz >> http://abcgjnrw.com/?abcgjnrw abcgjnrw >> https://bijlnyz.ua/?bijlnyz bijlnyz http://ahiqrv.ru/?ahiqrv >> ahiqrv https://ajotw.biz/?ajotw ajotw >> http://bfhipuw.info/?bfhipuw bfhipuw >> http://acjmqry.ua/?acjmqry acjmqry >> http://efjnoqrz.info/?efjnoqrz efjnoqrz >> https://achkqst.info/?achkqst achkqst >> http://bcfpruyz.ua/?bcfpruyz bcfpruyz http://bcelv.com/?bcelv >> bcelv https://dfkoqsu.ua/?dfkoqsu dfkoqsu >> http://ekopq.ru/?ekopq ekopq https://abfkrsz.u >> ?a/?abfkrsz abfkrsz https://abenxy.net/?abenxy abenxy >> http://agiotu.biz/?agiotu agiotu https://bkpqr.info/?bkpqr >> bkpqr http://ahijq.ua/?ahijq ahijq http://eimps.ru/?eimps >> eimps https://abdstwz.net/?abdstwz abdstwz >> https://cdhjlu.biz/?cdhjlu cdhjlu >> http://gmrstyz.info/?gmrstyz gmrstyz >> https://djsvwyz.ua/?djsvwyz djsvwyz >> http://begjmorv.net/?begjmorv begjmorv >> http://ahlmoptz.biz/?ahlmoptz ahlmoptz >> http://dekoqx.ru/?dekoqx dekoqx http://fiprs.net/?fiprs fiprs >> http://cjklmr.biz/?cjklmr cjklmr >> https://aefgpquv.biz/?aefgpquv aefgpquv >> https://jlmopswx.net/?jlmopswx jlmopswx >> https://bfhjlsw.biz/?bfhjlsw bfhjlsw >> https://iknortux.com/?iknortux iknortux >> http://abhkoprv.biz/?abhkoprv abhkoprv >> https://cjnrwyz.ua/?cjnrwyz cjnrwyz >> http://djmnrvwz.info/?djmnrvwz djmnrvwz >> http://adgkrtyz.info/?adgkrtyz adgkrtyz >> https://bdhjsuv.ua/?bdhjsuv bdhjsuv >> http://acfgjor.biz/?acfgjor acfgjor >> http://cfilrtw.info/?cfilrtw cfilrtw https://fhijz.com/?fhijz >> fhijz http://hkmntxy.ru/?hkmntxy hkmntxy http://fh >> ilsuy.ua/?fhilsuy fhilsuy >> http://jmnpwx.ru/?jmnpwx jmnpwx http://gikltvwy.ru/?gikltvwy >> gikltvwy http://jopqy.biz/?jopqy jopqy >> http://bdlqry.info/?bdlqry bdlqry https://ejlvwz.com/?ejlvwz >> ejlvwz http://ceijvz.net/?ceijvz ceijvz >> http://cfijntyz.info/?cfijntyz cfijntyz >> http://efilotuv.com/?efilotuv efilotuv >> http://cfmpyz.com/?cfmpyz cfmpyz http://dhpqz.com/?dhpqz >> dhpqz http://dlpxz.com/?dlpxz dlpxz >> http://aefjlyz.net/?aefjlyz aefjlyz >> https://ceijorz.info/?ceijorz ceijorz >> http://dkmnuvx.info/?dkmnuvx dkmnuvx >> https://kprswx.net/?kprswx kprswx https://dgjpwz.ru/?dgjpwz >> dgjpwz https://abfjlnt.ru/?abfjlnt abfjlnt >> http://efgips.com/?efgips efgips http://cehnpu.ru/?cehnpu >> cehnpu https://fstuw.net/?fstuw fstuw http://dmorz.net/?dmorz >> dmorz https://celmrwz.info/?celmrwz celmrwz >> 
https://jprvx.com/?jprvx jprvx https://cefjkxz.biz/?cefjkxz >> cefjkxz http://behjlv.biz/?behjlv behjlv >> http://bdiku.net/?bdiku bdiku https://ghnpxz.ru/?ghnpxz >> ghnpxz https://bcpqvwz.ua/?bcpqvwz bcpqvwz https://gijl >> nqrs.biz/?gijlnqrs gijlnqrs >> http://adfjlrsz.info/?adfjlrsz adfjlrsz >> https://afiptvz.biz/?afiptvz afiptvz >> http://acikors.com/?acikors acikors https://dfmpq.ru/?dfmpq >> dfmpq http://abhjmsyz.ru/?abhjmsyz abhjmsyz >> https://dejkq.com/?dejkq dejkq http://abcltx.com/?abcltx >> abcltx https://bdfjopu.net/?bdfjopu bdfjopu >> http://bipvy.com/?bipvy bipvy http://dlqsx.biz/?dlqsx dlqsx >> http://inqrstv.info/?inqrstv inqrstv https://dknwz.ru/?dknwz >> dknwz http://cdgimqsu.ru/?cdgimqsu cdgimqsu >> https://akmotuv.info/?akmotuv akmotuv >> http://kmntvyz.ru/?kmntvyz kmntvyz https://bimxz.net/?bimxz >> bimxz http://afjoqyz.ru/?afjoqyz afjoqyz >> https://eikuwy.ua/?eikuwy eikuwy https://bmptw.biz/?bmptw >> bmptw http://fgltwx.com/?fgltwx fgltwx >> http://chlnpqrs.com/?chlnpqrs chlnpqrs >> https://abhux.net/?abhux abhux http://adfhmw.info/?adfhmw >> adfhmw http://bgilt.com/?bgilt bgilt >> https://cefotu.biz/?cefotu cefotu >> https://adfruvwy.biz/?adfruvwy adfruvwy >> https://fgoqstu.info/?fgoqstu fgoqstu >> https://cdilnqvy.info/?cdilnqvy >> ?cdilnqvy https://adghruw.ua/?adghruw adghruw >> https://ehlnrsuy.biz/?ehlnrsuy ehlnrsuy >> http://abfijlnr.info/?abfijlnr abfijlnr >> https://jkpty.biz/?jkpty jkpty https://cdlpv.biz/?cdlpv cdlpv >> https://ginoxy.ua/?ginoxy ginoxy >> https://ciknpquy.net/?ciknpquy ciknpquy >> https://egijstw.biz/?egijstw egijstw >> http://cdikopvz.info/?cdikopvz cdikopvz >> https://egjklnvy.info/?egjklnvy egjklnvy >> https://fjkmvw.info/?fjkmvw fjkmvw http://dfkmv.com/?dfkmv >> dfkmv https://efgloptu.ru/?efgloptu efgloptu >> https://fgotz.ua/?fgotz fgotz http://efhklv.info/?efhklv >> efhklv https://befmrwy.info/?befmrwy befmrwy >> https://bklnsvy.biz/?bklnsvy bklnsvy http://flmoz.ru/?flmoz >> flmoz http://aiopqrtu.biz/?aiopqrtu aiopqrtu >> https://fjrsty.info/?fjrsty fjrsty >> http://behimuwy.ru/?behimuwy behimuwy >> https://iknqvz.com/?iknqvz iknqvz http://cdemp.ru/?cdemp >> cdemp http://eglpqsvz.net/?eglpqsvz eglpqsvz >> https://dluvz.net/?dluvz dluvz http://beghsuy.net/?beghsuy >> beghsuy http://ekmqsvz.net/?ekmqsvz ekmqsvz >> http://ehinz.net/?ehin >> ?z ehinz http://gipvz.com/?gipvz gipvz >> http://efhlwxyz.biz/?efhlwxyz efhlwxyz http://blors.ru/?blors >> blors http://dnqry.info/?dnqry dnqry >> https://cghnrux.biz/?cghnrux cghnrux >> http://dfinqtwx.ru/?dfinqtwx dfinqtwx >> http://efotxz.info/?efotxz efotxz >> http://aceglpqu.net/?aceglpqu aceglpqu >> https://abilr.ua/?abilr abilr http://bcgqw.ru/?bcgqw bcgqw >> http://cdinqv.biz/?cdinqv cdinqv https://cdfkp.com/?cdfkp >> cdfkp https://aelouvz.ua/?aelouvz aelouvz >> http://dklstuz.net/?dklstuz dklstuz https://cejqw.ru/?cejqw >> cejqw https://bdgkuy.com/?bdgkuy bdgkuy >> https://hkoprvxy.ru/?hkoprvxy hkoprvxy >> https://filntx.net/?filntx filntx http://cfgsty.ua/?cfgsty >> cfgsty https://bfgptx.net/?bfgptx bfgptx >> http://belpsu.biz/?belpsu belpsu http://deilmoy.biz/?deilmoy >> deilmoy https://kqruv.info/?kqruv kqruv >> https://acdst.net/?acdst acdst http://glmvy.ru/?glmvy glmvy >> https://abcmnu.net/?abcmnu abcmnu >> http://adehinxy.ru/?adehinxy adehinxy >> http://ahmnwy.biz/?ahmnwy ahmnwy http://morst.ru/?morst morst >> http://bfil >> s.ua/?bfils bfils >> http://abcmnvw.biz/?abcmnvw abcmnvw >> https://bchlqsux.net/?bchlqsux bchlqsux >> http://fhikpqvy.net/?fhikpqvy fhikpqvy >> 
https://bejkqu.info/?bejkqu bejkqu >> https://cdehnsxy.ru/?cdehnsxy cdehnsxy >> http://bcdhnopx.info/?bcdhnopx bcdhnopx >> https://elpuz.ua/?elpuz elpuz https://bdeftvw.info/?bdeftvw >> bdeftvw http://dgmnou.ru/?dgmnou dgmnou >> https://aiklnvx.com/?aiklnvx aiklnvx >> http://ijoqrsvw.com/?ijoqrsvw ijoqrsvw >> http://cdehmquy.biz/?cdehmquy cdehmquy >> https://gjoqx.net/?gjoqx gjoqx https://bhlnpqvw.net/?bhlnpqvw >> bhlnpqvw http://bdfiuwyz.net/?bdfiuwyz bdfiuwyz >> https://abesty.info/?abesty abesty >> https://gjlmtuv.net/?gjlmtuv gjlmtuv http://ivxyz.info/?ivxyz >> ivxyz http://aeops.info/?aeops aeops http://acdlv.ru/?acdlv >> acdlv http://fgimqw.net/?fgimqw fgimqw >> https://fgkrxz.info/?fgkrxz fgkrxz >> http://abfhnvxy.ua/?abfhnvxy abfhnvxy >> https://bcdeipsy.ru/?bcdeipsy bcdeipsy >> http://bdimvwy.ru/?bdimvwy bdimvwy http://fnpqry.info/?fnpqry >> fnpqry https://aklstwx.ru/?aklstwx aklstwx http >> ?://bjloq.com/?bjloq bjloq >> http://dkmsuv.info/?dkmsuv dkmsuv >> http://afinptuw.ru/?afinptuw afinptuw >> https://bijnqvx.biz/?bijnqvx bijnqvx >> https://behlxz.info/?behlxz behlxz >> http://bgjkquyz.biz/?bgjkquyz bgjkquyz >> http://aehnpqw.net/?aehnpqw aehnpqw >> http://cghjmpqu.com/?cghjmpqu cghjmpqu >> http://aboruy.com/?aboruy aboruy http://fjryz.ru/?fjryz fjryz >> https://cdnqy.info/?cdnqy cdnqy http://bfgjmost.net/?bfgjmost >> bfgjmost https://bhnou.ua/?bhnou bhnou >> https://dhilm.com/?dhilm dhilm http://abijmrtx.net/?abijmrtx >> abijmrtx http://acefnv.com/?acefnv acefnv >> http://gilmtu.com/?gilmtu gilmtu https://egjltuz.biz/?egjltuz >> egjltuz http://afghpsv.net/?afghpsv afghpsv >> http://chkltvx.ua/?chkltvx chkltvx http://befkyz.com/?befkyz >> befkyz https://bcfhmprw.com/?bcfhmprw bcfhmprw >> https://bjklpqs.info/?bjklpqs bjklpqs http://cegpt.ua/?cegpt >> cegpt https://klntvw.biz/?klntvw klntvw >> https://ejkmq.ua/?ejkmq ejkmq https://bgqvx.info/?bgqvx bgqvx >> https://dqrsuvw.com/?dqrsuvw dqrsuvw http://bcekrsu.net/?bcekr >> ?su bcekrsu https://dlmqwy.com/?dlmqwy dlmqwy >> http://abhnsvz.ru/?abhnsvz abhnsvz >> https://bdfhjoqr.info/?bdfhjoqr bdfhjoqr >> http://ceijlprw.net/?ceijlprw ceijlprw >> https://ijlqsw.net/?ijlqsw ijlqsw >> https://einrvxz.net/?einrvxz einrvxz >> https://ikmtxz.net/?ikmtxz ikmtxz http://adkmqsz.biz/?adkmqsz >> adkmqsz http://bchnox.com/?bchnox bchnox >> https://ckptvw.com/?ckptvw ckptvw http://dgops.com/?dgops >> dgops https://bfnrtxy.info/?bfnrtxy bfnrtxy >> http://bdnpstu.info/?bdnpstu bdnpstu >> http://dlnprsx.ua/?dlnprsx dlnprsx https://gjnrx.com/?gjnrx >> gjnrx http://abcsvz.com/?abcsvz abcsvz >> http://gknrsv.ru/?gknrsv gknrsv https://fkmouxz.com/?fkmouxz >> fkmouxz http://bcpsx.com/?bcpsx bcpsx >> http://bckoqx.info/?bckoqx bckoqx https://bcdepq.ru/?bcdepq >> bcdepq http://kqswy.net/?kqswy kqswy >> https://bdgjnuy.ru/?bdgjnuy bdgjnuy >> https://cgiklsvz.info/?cgiklsvz cgiklsvz >> https://bfgnprwy.net/?bfgnprwy bfgnprwy >> https://bejmnqrw.com/?bejmnqrw bejmnqrw >> http://gilopqwy.ua/?gilopqwy gilopqwy https://fjopry.ru/?fjopry >> ?fjopry http://kmrvwy.biz/?kmrvwy kmrvwy >> https://eglotuv.net/?eglotuv eglotuv >> http://cdhnsvwx.com/?cdhnsvwx cdhnsvwx >> http://ahmqv.com/?ahmqv ahmqv https://fkqxy.ua/?fkqxy fkqxy >> https://gjoquvwx.net/?gjoquvwx gjoquvwx >> https://abcfouw.biz/?abcfouw abcfouw >> https://abcekn.ru/?abcekn abcekn http://ijnruvz.com/?ijnruvz >> ijnruvz http://mqruv.ua/?mqruv mqruv >> https://afklmu.biz/?afklmu afklmu >> https://abcgptyz.biz/?abcgptyz abcgptyz >> https://bfjsuz.ua/?bfjsuz bfjsuz http://ahmpsx.ua/?ahmpsx >> ahmpsx 
https://iklns.com/?iklns iklns https://hlnvx.ua/?hlnvx >> hlnvx http://adiklmp.com/?adiklmp adiklmp >> https://fghprs.biz/?fghprs fghprs http://acmopvw.ru/?acmopvw >> acmopvw http://cdmuw.ua/?cdmuw cdmuw >> http://bfkpwxz.biz/?bfkpwxz bfkpwxz http://lqrtwy.ru/?lqrtwy >> lqrtwy https://aclmruvw.biz/?aclmruvw aclmruvw >> http://bcfhmsz.net/?bcfhmsz bcfhmsz >> http://bcgknpry.biz/?bcgknpry bcgknpry >> https://bflmnoq.info/?bflmnoq bflmnoq >> https://bepsy.info/?bepsy bepsy http://gknosuw.net/?gknosuw >> gknosuw http://ajoruvz. >> ?com/?ajoruvz ajoruvz http://cotxz.ru/?cotxz cotxz >> https://deintwx.net/?deintwx deintwx >> https://eghijkuy.com/?eghijkuy eghijkuy >> http://dginwx.ua/?dginwx dginwx http://aenoxy.net/?aenoxy >> aenoxy https://dghlpuvx.ru/?dghlpuvx dghlpuvx >> http://cefor.info/?cefor cefor https://afioz.biz/?afioz afioz >> https://deklmuvw.com/?deklmuvw deklmuvw >> http://cfgimox.biz/?cfgimox cfgimox >> http://cinquvz.ua/?cinquvz cinquvz https://acilsy.ua/?acilsy >> acilsy http://bfhlnop.net/?bfhlnop bfhlnop >> https://fijluz.net/?fijluz fijluz >> https://dfjnostv.ua/?dfjnostv dfjnostv >> https://aginsty.net/?aginsty aginsty >> http://iknxyz.biz/?iknxyz iknxyz http://agopu.net/?agopu >> agopu http://bdkopst.ua/?bdkopst bdkopst >> https://aghlox.net/?aghlox aghlox https://cdgpu.com/?cdgpu >> cdgpu https://jkmpuwyz.net/?jkmpuwyz jkmpuwyz >> http://ijlmuvw.com/?ijlmuvw ijlmuvw >> https://gkmrtvwy.biz/?gkmrtvwy gkmrtvwy >> http://aefhty.info/?aefhty aefhty >> http://afjklsux.net/?afjklsux afjklsux >> http://mprvyz.biz/?mprvyz mprvyz https://bdjkpx.ua/?bd >> ?jkpx bdjkpx http://aciknovw.net/?aciknovw aciknovw >> https://aftuxy.ua/?aftuxy aftuxy >> https://ajnpvxyz.ua/?ajnpvxyz ajnpvxyz >> http://fijqrv.com/?fijqrv fijqrv >> https://ahmquvw.info/?ahmquvw ahmquvw >> https://ghklmqvy.com/?ghklmqvy ghklmqvy >> http://aehkty.com/?aehkty aehkty https://aclpsuz.ru/?aclpsuz >> aclpsuz https://ghoqstvy.com/?ghoqstvy ghoqstvy >> https://dghnqrty.ru/?dghnqrty dghnqrty >> https://agorx.com/?agorx agorx https://adejrv.ru/?adejrv >> adejrv http://hkopuvwz.net/?hkopuvwz hkopuvwz >> http://bcdeghvx.ru/?bcdeghvx bcdeghvx >> http://bdfgruvy.com/?bdfgruvy bdfgruvy >> https://cmuvy.biz/?cmuvy cmuvy https://abjkmp.ua/?abjkmp >> abjkmp http://abcfirvz.ru/?abcfirvz abcfirvz >> http://clqruwxy.com/?clqruwxy clqruwxy >> https://fhikoq.net/?fhikoq fhikoq >> https://bfgnsvwz.biz/?bfgnsvwz bfgnsvwz >> http://fhjkor.biz/?fhjkor fhjkor >> https://dfgkouyz.com/?dfgkouyz dfgkouyz >> https://acfhmqtz.info/?acfhmqtz acfhmqtz >> https://ejmnpt.biz/?ejmnpt ejmnpt http://dfgnrtu.ru/?dfgnrtu >> dfgnrtu https://diknswx.biz/?diknswx >> ?diknswx https://hjnsz.info/?hjnsz hjnsz >> https://cdjlq.ua/?cdjlq cdjlq https://iowxy.ru/?iowxy iowxy >> https://fhjltux.net/?fhjltux fhjltux http://ablmv.com/?ablmv >> ablmv http://bdiorswz.ru/?bdiorswz bdiorswz >> http://cnopr.ru/?cnopr cnopr http://adgopz.ru/?adgopz adgopz >> http://bdefimsu.ru/?bdefimsu bdefimsu http://ghmqr.net/?ghmqr >> ghmqr https://cdetv.ru/?cdetv cdetv >> https://bdkortu.biz/?bdkortu bdkortu http://djopv.biz/?djopv >> djopv http://aiknorw.biz/?aiknorw aiknorw >> https://bfmqr.net/?bfmqr bfmqr http://klmox.com/?klmox klmox >> https://fimqsv.biz/?fimqsv fimqsv https://fkopq.ru/?fkopq >> fkopq http://aglpx.com/?aglpx aglpx http://dfhquv.biz/?dfhquv >> dfhquv https://bfgjoq.com/?bfgjoq bfgjoq >> https://cklnz.ru/?cklnz cklnz http://cfijqrv.net/?cfijqrv >> cfijqrv https://aekuvxz.net/?aekuvxz aekuvxz >> https://ghijkstx.info/?ghijkstx ghijkstx >> 
http://abilmsz.info/?abilmsz abilmsz >> https://bfhsvyz.ru/?bfhsvyz bfhsvyz http://eqsuv.ua/?eqsuv >> eqsuv http://ghinqsx.biz/?ghinqsx ghinqsx https://acgkmnpu.c >> ?om/?acgkmnpu acgkmnpu http://elrty.ru/?elrty elrty >> https://cdgkoz.net/?cdgkoz cdgkoz >> http://dimnosw.info/?dimnosw dimnosw >> http://abegnwx.biz/?abegnwx abegnwx >> http://abejnsx.net/?abejnsx abejnsx >> https://emnorxz.ru/?emnorxz emnorxz http://abfsz.ua/?abfsz >> abfsz https://imqrvy.net/?imqrvy imqrvy >> https://cdgjmqrv.com/?cdgjmqrv cdgjmqrv >> http://bdfostyz.net/?bdfostyz bdfostyz >> https://lnorxz.biz/?lnorxz lnorxz http://afgruvw.ru/?afgruvw >> afgruvw https://cefivyz.com/?cefivyz cefivyz >> https://gmqrt.com/?gmqrt gmqrt >> https://acdimqrw.info/?acdimqrw acdimqrw >> http://abchjm.ru/?abchjm abchjm http://alnqs.biz/?alnqs alnqs >> https://ijqwz.ua/?ijqwz ijqwz https://adfjstx.biz/?adfjstx >> adfjstx https://cghlqt.net/?cghlqt cghlqt >> http://ijrvyz.ua/?ijrvyz ijrvyz https://ahluv.net/?ahluv >> ahluv http://acdfgn.net/?acdfgn acdfgn >> https://hijnru.ua/?hijnru hijnru https://ghnuvy.ru/?ghnuvy >> ghnuvy http://aeiky.com/?aeiky aeiky >> https://bdilnov.info/?bdilnov bdilnov >> https://cjlnpqru.biz/?cjlnpqru cjlnpqru https: >> ?//cgjqtv.biz/?cgjqtv cgjqtv >> https://abcnsvz.biz/?abcnsvz abcnsvz >> http://abegopsu.biz/?abegopsu abegopsu >> https://bcmstuvy.net/?bcmstuvy bcmstuvy >> https://bhjknsuz.net/?bhjknsuz bhjknsuz >> http://aceklno.info/?aceklno aceklno >> http://diotvyz.ru/?diotvyz diotvyz >> https://aefkmruw.info/?aefkmruw aefkmruw >> https://cdghlmnz.info/?cdghlmnz cdghlmnz >> https://bcgijmow.net/?bcgijmow bcgijmow >> http://moruwx.ua/?moruwx moruwx https://bhjnrt.ru/?bhjnrt >> bhjnrt https://eimoptu.ua/?eimoptu eimoptu >> https://achptv.info/?achptv achptv https://boptux.biz/?boptux >> boptux http://acmovz.ua/?acmovz acmovz >> https://abfmnpw.info/?abfmnpw abfmnpw >> https://afiknvz.biz/?afiknvz afiknvz http://ehnps.ua/?ehnps >> ehnps https://dnpuw.biz/?dnpuw dnpuw >> https://abflmns.biz/?abflmns abflmns https://fmuyz.biz/?fmuyz >> fmuyz http://bejkvz.info/?bejkvz bejkvz >> https://dfkmnyz.ua/?dfkmnyz dfkmnyz >> http://cfgnsvx.biz/?cfgnsvx cfgnsvx http://abdjsw.ru/?abdjsw >> abdjsw https://abelrvx.biz/?abelrvx abelrvx >> https://abjmoy.ru/?abjmoy abjmo >> ?y http://hmnqz.net/?hmnqz hmnqz https://dhkntv.ru/?dhkntv >> dhkntv http://bgqsyz.biz/?bgqsyz bgqsyz >> http://eghkmsux.net/?eghkmsux eghkmsux >> https://acehlrsv.info/?acehlrsv acehlrsv >> https://lnpruvy.com/?lnpruvy lnpruvy http://cefmtu.ua/?cefmtu >> cefmtu https://hjnqrt.net/?hjnqrt hjnqrt >> http://ceiktvw.ru/?ceiktvw ceiktvw http://cfjmu.info/?cfjmu >> cfjmu https://cfhmn.com/?cfhmn cfhmn http://cdfqr.net/?cdfqr >> cdfqr http://adelmvz.ru/?adelmvz adelmvz >> http://abflnsvx.com/?abflnsvx abflnsvx >> http://bjpqv.biz/?bjpqv bjpqv https://fjpqux.biz/?fjpqux >> fjpqux http://fpqtvyz.net/?fpqtvyz fpqtvyz >> http://cdflstwz.net/?cdflstwz cdflstwz >> https://egjostux.ua/?egjostux egjostux >> https://cjlnpy.ua/?cjlnpy cjlnpy https://bcfgmz.com/?bcfgmz >> bcfgmz http://dfnouz.ua/?dfnouz dfnouz >> http://cfikqv.net/?cfikqv cfikqv https://dehkqw.net/?dehkqw >> dehkqw http://fhkqs.com/?fhkqs fhkqs http://dijkl.com/?dijkl >> dijkl http://aeglu.com/?aeglu aeglu >> http://dgikmpy.info/?dgikmpy dgikmpy >> https://brsuz.info/?brsuz brsuz http >> ?://acfglopu.ru/?acfglopu >> acfglopu https://cdjqx.net/?cdjqx cdjqx >> https://aefimst.biz/?aefimst aefimst >> http://bfikmuw.net/?bfikmuw bfikmuw >> http://abhioqvw.ua/?abhioqvw abhioqvw >> 
http://abdilmo.net/?abdilmo abdilmo >> https://bhlovx.info/?bhlovx bhlovx >> https://fgvwxz.info/?fgvwxz fgvwxz >> http://afhjpqsv.biz/?afhjpqsv afhjpqsv >> http://deilrsv.com/?deilrsv deilrsv >> http://cijlmnt.ru/?cijlmnt cijlmnt https://bfkmtw.ua/?bfkmtw >> bfkmtw https://adkxy.ru/?adkxy adkxy >> https://blpqxy.com/?blpqxy blpqxy >> https://efgilor.net/?efgilor efgilor >> https://abcdky.info/?abcdky abcdky >> https://befgikqu.ru/?befgikqu befgikqu >> https://ipsuvwy.com/?ipsuvwy ipsuvwy https://fjlsu.biz/?fjlsu >> fjlsu https://kpquw.com/?kpquw kpquw >> https://bfjkoqvy.net/?bfjkoqvy bfjkoqvy >> http://dijopsz.biz/?dijopsz dijopsz >> http://acfhrsz.ua/?acfhrsz acfhrsz >> https://bfhrstvy.ua/?bfhrstvy bfhrstvy >> http://afglw.info/?afglw afglw https://bcjqrvw.net/?bcjqrvw >> bcjqrvw https://hiotz.ru/?hiotz hiotz >> https://abefmswx.biz/?abefmswx abefmswx ht >> ?tp://acgkopyz.ua/?acgkopyz >> acgkopyz https://fijln.com/?fijln fijln >> http://befmquwz.info/?befmquwz befmquwz >> https://hijory.ua/?hijory hijory http://cdfhr.info/?cdfhr >> cdfhr https://ahiksx.ua/?ahiksx ahiksx >> https://aghmxy.ru/?aghmxy aghmxy https://hnopqt.net/?hnopqt >> hnopqt https://fiklnrsw.ru/?fiklnrsw fiklnrsw >> http://hknpv.ru/?hknpv hknpv https://abdfmotw.net/?abdfmotw >> abdfmotw http://bcfhpvx.biz/?bcfhpvx bcfhpvx >> http://beimrty.net/?beimrty beimrty >> http://dgnsvwy.ua/?dgnsvwy dgnsvwy >> https://aghjstvw.com/?aghjstvw aghjstvw >> http://gpqsuxy.ru/?gpqsuxy gpqsuxy https://finop.info/?finop >> finop http://bcdhsy.biz/?bcdhsy bcdhsy >> http://cfhmnvy.net/?cfhmnvy cfhmnvy >> https://hjopuw.net/?hjopuw hjopuw https://akmou.biz/?akmou >> akmou https://fgkntuw.biz/?fgkntuw fgkntuw >> https://adfjsw.com/?adfjsw adfjsw >> https://bdjmrsty.biz/?bdjmrsty bdjmrsty >> http://grtuz.ru/?grtuz grtuz https://cgiqw.info/?cgiqw cgiqw >> http://acenpvw.ru/?acenpvw acenpvw https://dmnrz.ru/?dmnrz >> dmnrz http://deglqtu.com/?deglq >> ?tu deglqtu http://ilpqrtuw.com/?ilpqrtuw ilpqrtuw >> https://gnoqvxz.com/?gnoqvxz gnoqvxz >> https://hijkmsu.ru/?hijkmsu hijkmsu >> http://cistxyz.biz/?cistxyz cistxyz >> https://adklmns.ru/?adklmns adklmns https://hmqrwy.ua/?hmqrwy >> hmqrwy https://crsuv.ru/?crsuv crsuv >> http://abejkmoz.ua/?abejkmoz abejkmoz >> http://abjpqu.ua/?abjpqu abjpqu >> https://ehjkopuz.info/?ehjkopuz ehjkopuz >> http://iovwz.ua/?iovwz iovwz https://emntvwx.net/?emntvwx >> emntvwx http://eghksx.info/?eghksx eghksx >> https://bmnps.com/?bmnps bmnps https://cinopuv.net/?cinopuv >> cinopuv http://efijnptz.biz/?efijnptz efijnptz >> http://cehilnp.net/?cehilnp cehilnp http://denprw.ru/?denprw >> denprw http://adglpwz.net/?adglpwz adglpwz >> http://dehimosu.biz/?dehimosu dehimosu >> https://abchprwy.com/?abchprwy abchprwy >> http://dlnory.info/?dlnory dlnory http://cehnwyz.com/?cehnwyz >> cehnwyz http://aghlopy.ua/?aghlopy aghlopy >> http://fhjksuvz.com/?fhjksuvz fhjksuvz >> http://bijmy.com/?bijmy bijmy https://adhjw.com/?adhjw adhjw >> https://emnprwyz.info/?em >> ?nprwyz emnprwyz http://dgmnw.ru/?dgmnw dgmnw >> https://cdgimruv.com/?cdgimruv cdgimruv >> http://bcdjqst.biz/?bcdjqst bcdjqst https://lmquv.ru/?lmquv >> lmquv http://degiptv.info/?degiptv degiptv >> http://hlnvy.ru/?hlnvy hlnvy http://gklmvx.info/?gklmvx >> gklmvx http://behlpstz.com/?behlpstz behlpstz >> https://efginz.com/?efginz efginz >> https://deilmnru.biz/?deilmnru deilmnru >> http://bdehoqux.ru/?bdehoqux bdehoqux >> http://doqvy.info/?doqvy doqvy http://jlorw.info/?jlorw jlorw >> https://cfmtvy.net/?cfmtvy cfmtvy >> http://agkmqvxy.biz/?agkmqvxy 
agkmqvxy >> https://ehnrux.ru/?ehnrux ehnrux https://ahlmpyz.biz/?ahlmpyz >> ahlmpyz https://dhilqxyz.com/?dhilqxyz dhilqxyz >> http://beosz.biz/?beosz beosz http://jkmtvx.ru/?jkmtvx jkmtvx >> http://aefiuw.ua/?aefiuw aefiuw http://bgimy.info/?bgimy >> bgimy https://acdfhimy.info/?acdfhimy acdfhimy >> http://bdfpv.info/?bdfpv bdfpv https://lnopux.info/?lnopux >> lnopux https://iknqstux.ua/?iknqstux iknqstux >> https://binxy.ru/?binxy binxy http://abflsv.net/?abflsv >> abflsv http://c >> ghkmtvz.info/?cghkmtvz >> cghkmtvz http://flpqrxz.biz/?flpqrxz flpqrxz >> https://cfgnvz.ua/?cfgnvz cfgnvz https://jtuvx.com/?jtuvx >> jtuvx https://fgikl.ru/?fgikl fgikl http://ehikrs.ru/?ehikrs >> ehikrs http://beghqx.ru/?beghqx beghqx >> http://dhijlrsw.com/?dhijlrsw dhijlrsw http://elqst.ru/?elqst >> elqst https://abfklmns.biz/?abfklmns abfklmns >> https://cilnorw.ua/?cilnorw cilnorw https://anoqu.ua/?anoqu >> anoqu http://adikquz.com/?adikquz adikquz >> http://abcpwyz.ua/?abcpwyz abcpwyz https://fiqwxz.ru/?fiqwxz >> fiqwxz http://bcivw.ru/?bcivw bcivw >> https://acdfnrz.ua/?acdfnrz acdfnrz >> https://jmopqtw.info/?jmopqtw jmopqtw >> https://abdeghtv.biz/?abdeghtv abdeghtv >> http://aceoprw.net/?aceoprw aceoprw http://forvy.ua/?forvy >> forvy https://abgjknpv.com/?abgjknpv abgjknpv >> http://bcdgimr.info/?bcdgimr bcdgimr >> https://flntvw.biz/?flntvw flntvw http://bcdnqvx.net/?bcdnqvx >> bcdnqvx http://defjnqvw.com/?defjnqvw defjnqvw >> http://ekmouvwy.biz/?ekmouvwy ekmouvwy >> http://cijpqsxz.biz/?cijpqsxz cijpqsxz http://dopsty >> z.ua/?dopstyz dopstyz >> http://cinow.com/?cinow cinow http://cdejmrs.info/?cdejmrs >> cdejmrs http://fgknprwz.info/?fgknprwz fgknprwz >> https://binpquxy.com/?binpquxy binpquxy >> https://afhqrtu.biz/?afhqrtu afhqrtu http://cemqtv.ru/?cemqtv >> cemqtv http://bgnqsz.info/?bgnqsz bgnqsz >> http://cdfjprsv.ua/?cdfjprsv cdfjprsv >> https://jmqstz.biz/?jmqstz jmqstz http://adhuw.info/?adhuw >> adhuw http://deswz.ua/?deswz deswz https://ehkmnw.com/?ehkmnw >> ehkmnw http://kmsxy.ua/?kmsxy kmsxy >> http://benotvw.info/?benotvw benotvw >> https://dhinops.biz/?dhinops dhinops >> https://hklnpqtv.ru/?hklnpqtv hklnpqtv >> https://gjnqrtwx.info/?gjnqrtwx gjnqrtwx >> https://dortvw.biz/?dortvw dortvw http://cdghjp.com/?cdghjp >> cdghjp http://abchnoqx.com/?abchnoqx abchnoqx >> http://abnsz.biz/?abnsz abnsz https://acdnqwy.info/?acdnqwy >> acdnqwy http://befilpv.com/?befilpv befilpv >> http://fijrux.biz/?fijrux fijrux >> https://aknqrtuz.ru/?aknqrtuz aknqrtuz >> http://abfoqwz.ru/?abfoqwz abfoqwz >> https://defikv.info/?defikv defikv http://abcoqtz.c >> ?om/?abcoqtz abcoqtz https://dfirxz.net/?dfirxz dfirxz >> http://bdfipqux.net/?bdfipqux bdfipqux >> http://bcgqx.net/?bcgqx bcgqx https://bdfhl.ua/?bdfhl bdfhl >> http://ahikmpqt.info/?ahikmpqt ahikmpqt >> https://aghqrt.com/?aghqrt aghqrt https://afhosxz.ru/?afhosxz >> afhosxz https://ehsvx.ru/?ehsvx ehsvx >> http://gknquvw.info/?gknquvw gknquvw >> https://gilmpqw.ru/?gilmpqw gilmpqw http://fhopy.biz/?fhopy >> fhopy https://bcmpux.net/?bcmpux bcmpux >> https://adfmrsuy.info/?adfmrsuy adfmrsuy >> https://fmtuvx.net/?fmtuvx fmtuvx >> https://defjmswy.net/?defjmswy defjmswy >> https://ahijmrsz.info/?ahijmrsz ahijmrsz >> https://cerwx.biz/?cerwx cerwx https://behqz.biz/?behqz behqz >> https://behioqyz.ru/?behioqyz behioqyz >> https://admrsvxz.com/?admrsvxz admrsvxz >> https://cefhipqw.com/?cefhipqw cefhipqw >> http://deopsx.ru/?deopsx deopsx http://acfgksz.info/?acfgksz >> acfgksz http://hjlmqst.info/?hjlmqst hjlmqst >> 
http://efgnrsux.net/?efgnrsux efgnrsux >> http://adflovz.info/?adflovz adflovz http://acopu.ua/?acopu >> acopu https://chi >> lnqrs.ru/?chilnqrs chilnqrs >> https://blosy.net/?blosy blosy https://gijnpuxy.com/?gijnpuxy >> gijnpuxy https://bcemuvwz.ua/?bcemuvwz bcemuvwz >> http://bdfqwx.net/?bdfqwx bdfqwx http://dfghilr.net/?dfghilr >> dfghilr http://bdlqstv.com/?bdlqstv bdlqstv >> http://adhjnrsz.com/?adhjnrsz adhjnrsz >> http://adiopqx.com/?adiopqx adiopqx >> http://ehnoqw.info/?ehnoqw ehnoqw https://dgpsx.net/?dgpsx >> dgpsx http://afgrt.ru/?afgrt afgrt >> http://bghipuvz.ua/?bghipuvz bghipuvz >> http://egjmvw.com/?egjmvw egjmvw >> https://eginoswz.com/?eginoswz eginoswz >> http://bekuv.info/?bekuv bekuv http://bglno.info/?bglno bglno >> http://hkoptxz.com/?hkoptxz hkoptxz https://aikqz.biz/?aikqz >> aikqz https://oqvxy.net/?oqvxy oqvxy http://kruwy.net/?kruwy >> kruwy http://fhlqty.net/?fhlqty fhlqty >> http://chpuvyz.ru/?chpuvyz chpuvyz >> https://aciloqt.ua/?aciloqt aciloqt http://cklsw.info/?cklsw >> cklsw https://fghkmvy.info/?fghkmvy fghkmvy >> http://belmpvz.biz/?belmpvz belmpvz http://eimpxz.ru/?eimpxz >> eimpxz http://cdjstuz.info/?cdjstuz cdjstu >> ?z https://adivy.com/?adivy adivy https://alnrty.ru/?alnrty >> alnrty https://hkmntz.info/?hkmntz hkmntz >> https://dkpqr.com/?dkpqr dkpqr https://bfntv.ru/?bfntv bfntv >> https://inuvwyz.ru/?inuvwyz inuvwyz >> https://jkmrvyz.ua/?jkmrvyz jkmrvyz >> http://bfipruw.biz/?bfipruw bfipruw http://cjklos.ua/?cjklos >> cjklos http://afglru.ru/?afglru afglru http://defrt.ua/?defrt >> defrt https://afghsvy.ua/?afghsvy afghsvy >> http://cehiou.info/?cehiou cehiou https://mnrvz.com/?mnrvz >> mnrvz http://fgtuxz.biz/?fgtuxz fgtuxz >> http://egiqtuy.ua/?egiqtuy egiqtuy https://dilnv.ru/?dilnv >> dilnv http://abeny.biz/?abeny abeny http://aijvxz.biz/?aijvxz >> aijvxz https://abdhrsx.net/?abdhrsx abdhrsx >> http://bfgmrxy.ru/?bfgmrxy bfgmrxy https://fjkow.info/?fjkow >> fjkow https://fkruvx.biz/?fkruvx fkruvx >> http://efhjsw.biz/?efhjsw efhjsw http://djkmops.info/?djkmops >> djkmops https://cequyz.ru/?cequyz cequyz >> https://hmswz.com/?hmswz hmswz http://lnotu.info/?lnotu lnotu >> http://bdjqu.info/?bdjqu bdjqu http://adejrxy.ua/?adejrxy ade >> ?jrxy http://cdgikmps.ru/?cdgikmps cdgikmps >> http://dhmostwy.net/?dhmostwy dhmostwy >> https://cdhlquz.ua/?cdhlquz cdhlquz https://bilnor.ua/?bilnor >> bilnor https://fnqxz.com/?fnqxz fnqxz >> https://bijnuwz.com/?bijnuwz bijnuwz >> http://abcgikwy.ua/?abcgikwy abcgikwy >> https://begimrv.ua/?begimrv begimrv https://jlprv.net/?jlprv >> jlprv http://efhsz.ua/?efhsz efhsz https://dgiqs.ua/?dgiqs >> dgiqs http://bgiprx.net/?bgiprx bgiprx >> https://cdklmqv.biz/?cdklmqv cdklmqv >> https://adgnowz.ua/?adgnowz adgnowz >> http://cfjnorty.ru/?cfjnorty cfjnorty >> https://bcfhpqvy.ru/?bcfhpqvy bcfhpqvy >> http://beiqv.biz/?beiqv beiqv https://bjmnu.com/?bjmnu bjmnu >> https://fgjoquvz.ru/?fgjoquvz fgjoquvz >> http://fkpwx.info/?fkpwx fkpwx http://dfghty.com/?dfghty >> dfghty https://dhlmtvxy.biz/?dhlmtvxy dhlmtvxy >> https://bejuy.com/?bejuy bejuy http://cdikry.com/?cdikry >> cdikry https://bfhkrty.info/?bfhkrty bfhkrty >> https://cekpst.com/?cekpst cekpst https://ceknpx.ru/?ceknpx >> ceknpx https://gjpwy.com/?gjpwy gjpwy https://ceouwxyz.ru >> ?/?ceouwxyz ceouwxyz https://ijkopstu.ua/?ijkopstu ijkopstu >> https://acegjlmo.net/?acegjlmo acegjlmo >> http://bflnpv.info/?bflnpv bflnpv http://fijkotz.ua/?fijkotz >> fijkotz http://ajkptwyz.com/?ajkptwyz ajkptwyz >> http://adjpvw.net/?adjpvw adjpvw 
https://dpqrsv.biz/?dpqrsv >> dpqrsv https://acehklps.ua/?acehklps acehklps >> http://aghsuvyz.biz/?aghsuvyz aghsuvyz >> https://akotu.ru/?akotu akotu https://dkqrt.biz/?dkqrt dkqrt >> http://dlrtuxz.info/?dlrtuxz dlrtuxz http://fstvx.ua/?fstvx >> fstvx http://ejmsv.ua/?ejmsv ejmsv http://abfmyz.com/?abfmyz >> abfmyz http://cdiqrsvx.info/?cdiqrsvx cdiqrsvx >> https://fqsuz.info/?fqsuz fqsuz https://cilsu.info/?cilsu >> cilsu https://bjklnsxz.net/?bjklnsxz bjklnsxz >> https://demnpst.biz/?demnpst demnpst >> https://bghikms.biz/?bghikms bghikms >> http://achjtu.biz/?achjtu achjtu http://npqrwz.net/?npqrwz >> npqrwz http://adehilpq.net/?adehilpq adehilpq >> http://bnortwy.com/?bnortwy bnortwy >> http://cegpsuz.ru/?cegpsuz cegpsuz http://aksyz.ua/?aksyz >> aksyz http://egkmosux.biz/?egkm >> ?osux egkmosux https://dimoruy.com/?dimoruy dimoruy >> https://ehopyz.net/?ehopyz ehopyz >> https://ehijptuy.com/?ehijptuy ehijptuy >> http://jklmprtv.ua/?jklmprtv jklmprtv >> http://abmnrw.ru/?abmnrw abmnrw http://abcfqu.ru/?abcfqu >> abcfqu https://degijyz.ru/?degijyz degijyz >> https://cikortuy.net/?cikortuy cikortuy >> http://abgnrw.ua/?abgnrw abgnrw http://bfgknpqt.com/?bfgknpqt >> bfgknpqt https://afmqtv.ru/?afmqtv afmqtv >> https://bertuvwz.info/?bertuvwz bertuvwz >> https://eipuvwxy.com/?eipuvwxy eipuvwxy >> http://ahjory.info/?ahjory ahjory http://acefvx.com/?acefvx >> acefvx http://acdlo.com/?acdlo acdlo http://cdjqs.biz/?cdjqs >> cdjqs http://bdexy.net/?bdexy bdexy http://bktxz.info/?bktxz >> bktxz http://adegmovy.biz/?adegmovy adegmovy >> http://fkxyz.ua/?fkxyz fkxyz http://adehinxy.ru/?adehinxy >> adehinxy http://ahmnwy.biz/?ahmnwy ahmnwy >> http://morst.ru/?morst morst http://bfils.ua/?bfils bfils >> http://abcmnvw.biz/?abcmnvw abcmnvw >> https://bchlqsux.net/?bchlqsux bchlqsux >> http://fhikpqvy.net/?fhikpqvy fhikpqvy h >> ?ttps://bejkqu.info/?bejkqu >> bejkqu https://cdehnsxy.ru/?cdehnsxy cdehnsxy >> http://bcdhnopx.info/?bcdhnopx bcdhnopx >> https://elpuz.ua/?elpuz elpuz https://bdeftvw.info/?bdeftvw >> bdeftvw http://dgmnou.ru/?dgmnou dgmnou >> https://aiklnvx.com/?aiklnvx aiklnvx >> http://ijoqrsvw.com/?ijoqrsvw ijoqrsvw >> http://cdehmquy.biz/?cdehmquy cdehmquy >> https://gjoqx.net/?gjoqx gjoqx https://bhlnpqvw.net/?bhlnpqvw >> bhlnpqvw http://bdfiuwyz.net/?bdfiuwyz bdfiuwyz >> https://abesty.info/?abesty abesty >> https://gjlmtuv.net/?gjlmtuv gjlmtuv http://ivxyz.info/?ivxyz >> ivxyz http://aeops.info/?aeops aeops http://acdlv.ru/?acdlv >> acdlv http://fgimqw.net/?fgimqw fgimqw >> https://fgkrxz.info/?fgkrxz fgkrxz >> http://abfhnvxy.ua/?abfhnvxy abfhnvxy >> https://bcdeipsy.ru/?bcdeipsy bcdeipsy >> http://bdimvwy.ru/?bdimvwy bdimvwy http://fnpqry.info/?fnpqry >> fnpqry https://aklstwx.ru/?aklstwx aklstwx >> http://bjloq.com/?bjloq bjloq http://dkmsuv.info/?dkmsuv >> dkmsuv http://afinptuw.ru/?afinptuw afinptuw >> https://bijnqvx.biz/?bijnqvx bijnqvx >> https://behlxz.info/?behlxz behlxz >> http://bgjkquyz.biz/?bgjkquyz bgjkquyz >> http://aehnpqw.net/?aehnpqw aehnpqw >> http://cghjmpqu.com/?cghjmpqu cghjmpqu >> http://aboruy.com/?aboruy aboruy http://fjryz.ru/?fjryz fjryz >> https://cdnqy.info/?cdnqy cdnqy http://bfgjmost.net/?bfgjmost >> bfgjmost https://bhnou.ua/?bhnou bhnou >> https://dhilm.com/?dhilm dhilm http://abijmrtx.net/?abijmrtx >> abijmrtx http://acefnv.com/?acefnv acefnv >> http://gilmtu.com/?gilmtu gilmtu https://egjltuz.biz/?egjltuz >> egjltuz http://afghpsv.net/?afghpsv afghpsv >> http://chkltvx.ua/?chkltvx chkltvx http://befkyz.com/?befkyz >> befkyz 
https://bcfhmprw.com/?bcfhmprw bcfhmprw >> https://bjklpqs.info/?bjklpqs bjklpqs http://cegpt.ua/?cegpt >> cegpt https://klntvw.biz/?klntvw klntvw >> https://ejkmq.ua/?ejkmq ejkmq https://bgqvx.info/?bgqvx bgqvx >> https://dqrsuvw.com/?dqrsuvw dqrsuvw >> http://bcekrsu.net/?bcekrsu bcekrsu >> https://dlmqwy.com/?dlmqwy dlmqwy http://abhnsvz.ru/?abhnsvz >> abhnsvz https://bdfhjoqr.info/?bdfhjoqr bdfhjoqr http://ceijlp >> rw.net/?ceijlprw ceijlprw >> https://ijlqsw.net/?ijlqsw ijlqsw >> https://einrvxz.net/?einrvxz einrvxz >> https://ikmtxz.net/?ikmtxz ikmtxz http://adkmqsz.biz/?adkmqsz >> adkmqsz http://bchnox.com/?bchnox bchnox >> https://ckptvw.com/?ckptvw ckptvw http://dgops.com/?dgops >> dgops https://bfnrtxy.info/?bfnrtxy bfnrtxy >> http://bdnpstu.info/?bdnpstu bdnpstu >> http://dlnprsx.ua/?dlnprsx dlnprsx https://gjnrx.com/?gjnrx >> gjnrx http://abcsvz.com/?abcsvz abcsvz >> http://gknrsv.ru/?gknrsv gknrsv https://fkmouxz.com/?fkmouxz >> fkmouxz http://bcpsx.com/?bcpsx bcpsx >> http://bckoqx.info/?bckoqx bckoqx https://bcdepq.ru/?bcdepq >> bcdepq http://kqswy.net/?kqswy kqswy >> https://bdgjnuy.ru/?bdgjnuy bdgjnuy >> https://cgiklsvz.info/?cgiklsvz cgiklsvz >> https://bfgnprwy.net/?bfgnprwy bfgnprwy >> https://bejmnqrw.com/?bejmnqrw bejmnqrw >> http://gilopqwy.ua/?gilopqwy gilopqwy >> https://fjopry.ru/?fjopry fjopry http://kmrvwy.biz/?kmrvwy >> kmrvwy https://eglotuv.net/?eglotuv eglotuv >> http://cdhnsvwx.com/?cdhnsvwx cdhnsvwx http://ahmqv.com/? >> ?ahmqv ahmqv https://fkqxy.ua/?fkqxy fkqxy >> https://gjoquvwx.net/?gjoquvwx gjoquvwx >> https://abcfouw.biz/?abcfouw abcfouw >> https://abcekn.ru/?abcekn abcekn http://ijnruvz.com/?ijnruvz >> ijnruvz http://mqruv.ua/?mqruv mqruv >> https://afklmu.biz/?afklmu afklmu >> https://abcgptyz.biz/?abcgptyz abcgptyz >> https://bfjsuz.ua/?bfjsuz bfjsuz http://ahmpsx.ua/?ahmpsx >> ahmpsx https://iklns.com/?iklns iklns https://hlnvx.ua/?hlnvx >> hlnvx http://adiklmp.com/?adiklmp adiklmp >> https://fghprs.biz/?fghprs fghprs http://acmopvw.ru/?acmopvw >> acmopvw http://cdmuw.ua/?cdmuw cdmuw >> http://bfkpwxz.biz/?bfkpwxz bfkpwxz http://lqrtwy.ru/?lqrtwy >> lqrtwy https://aclmruvw.biz/?aclmruvw aclmruvw >> http://bcfhmsz.net/?bcfhmsz bcfhmsz >> http://bcgknpry.biz/?bcgknpry bcgknpry >> https://bflmnoq.info/?bflmnoq bflmnoq >> https://bepsy.info/?bepsy bepsy http://gknosuw.net/?gknosuw >> gknosuw http://ajoruvz.com/?ajoruvz ajoruvz >> http://cotxz.ru/?cotxz cotxz https://deintwx.net/?deintwx >> deintwx https://eghijkuy.com/?eghijkuy eghijkuy http:// >> dginwx.ua/?dginwx dginwx >> http://aenoxy.net/?aenoxy aenoxy >> https://dghlpuvx.ru/?dghlpuvx dghlpuvx >> http://cefor.info/?cefor cefor https://afioz.biz/?afioz afioz >> https://deklmuvw.com/?deklmuvw deklmuvw >> http://cfgimox.biz/?cfgimox cfgimox >> http://cinquvz.ua/?cinquvz cinquvz https://acilsy.ua/?acilsy >> acilsy http://bfhlnop.net/?bfhlnop bfhlnop >> https://fijluz.net/?fijluz fijluz >> https://dfjnostv.ua/?dfjnostv dfjnostv >> https://aginsty.net/?aginsty aginsty >> http://iknxyz.biz/?iknxyz iknxyz http://agopu.net/?agopu >> agopu http://bdkopst.ua/?bdkopst bdkopst >> https://aghlox.net/?aghlox aghlox https://cdgpu.com/?cdgpu >> cdgpu https://jkmpuwyz.net/?jkmpuwyz jkmpuwyz >> http://ijlmuvw.com/?ijlmuvw ijlmuvw >> https://gkmrtvwy.biz/?gkmrtvwy gkmrtvwy >> http://aefhty.info/?aefhty aefhty >> http://afjklsux.net/?afjklsux afjklsux >> http://mprvyz.biz/?mprvyz mprvyz https://bdjkpx.ua/?bdjkpx >> bdjkpx http://aciknovw.net/?aciknovw aciknovw >> https://aftuxy.ua/?aftuxy aftuxy >> 
https://ajnpvxyz.ua/?ajnpvxyz ajnpvxyz http://fijq >> rv.com/?fijqrv fijqrv >> https://ahmquvw.info/?ahmquvw ahmquvw >> https://ghklmqvy.com/?ghklmqvy ghklmqvy >> http://aehkty.com/?aehkty aehkty https://aclpsuz.ru/?aclpsuz >> aclpsuz https://ghoqstvy.com/?ghoqstvy ghoqstvy >> https://dghnqrty.ru/?dghnqrty dghnqrty >> https://agorx.com/?agorx agorx https://adejrv.ru/?adejrv >> adejrv http://hkopuvwz.net/?hkopuvwz hkopuvwz >> http://bcdeghvx.ru/?bcdeghvx bcdeghvx >> http://bdfgruvy.com/?bdfgruvy bdfgruvy >> https://cmuvy.biz/?cmuvy cmuvy https://abjkmp.ua/?abjkmp >> abjkmp http://abcfirvz.ru/?abcfirvz abcfirvz >> http://clqruwxy.com/?clqruwxy clqruwxy >> https://fhikoq.net/?fhikoq fhikoq >> https://bfgnsvwz.biz/?bfgnsvwz bfgnsvwz >> http://fhjkor.biz/?fhjkor fhjkor >> https://dfgkouyz.com/?dfgkouyz dfgkouyz >> https://acfhmqtz.info/?acfhmqtz acfhmqtz >> https://ejmnpt.biz/?ejmnpt ejmnpt http://dfgnrtu.ru/?dfgnrtu >> dfgnrtu https://diknswx.biz/?diknswx diknswx >> https://hjnsz.info/?hjnsz hjnsz https://cdjlq.ua/?cdjlq cdjlq >> https://iowxy.ru/?iowxy iowxy https://fhjltux.net/?fhjltux fhjlt >> ?ux http://ablmv.com/?ablmv ablmv >> http://bdiorswz.ru/?bdiorswz bdiorswz http://cnopr.ru/?cnopr >> cnopr http://adgopz.ru/?adgopz adgopz >> http://bdefimsu.ru/?bdefimsu bdefimsu http://ghmqr.net/?ghmqr >> ghmqr https://cdetv.ru/?cdetv cdetv >> https://bdkortu.biz/?bdkortu bdkortu http://djopv.biz/?djopv >> djopv http://aiknorw.biz/?aiknorw aiknorw >> https://bfmqr.net/?bfmqr bfmqr http://klmox.com/?klmox klmox >> https://fimqsv.biz/?fimqsv fimqsv https://fkopq.ru/?fkopq >> fkopq http://aglpx.com/?aglpx aglpx http://dfhquv.biz/?dfhquv >> dfhquv https://bfgjoq.com/?bfgjoq bfgjoq >> https://cklnz.ru/?cklnz cklnz http://cfijqrv.net/?cfijqrv >> cfijqrv https://aekuvxz.net/?aekuvxz aekuvxz >> https://ghijkstx.info/?ghijkstx ghijkstx >> http://abilmsz.info/?abilmsz abilmsz >> https://bfhsvyz.ru/?bfhsvyz bfhsvyz http://eqsuv.ua/?eqsuv >> eqsuv http://ghinqsx.biz/?ghinqsx ghinqsx >> https://acgkmnpu.com/?acgkmnpu acgkmnpu >> http://elrty.ru/?elrty elrty https://cdgkoz.net/?cdgkoz >> cdgkoz http://dimnosw.info/?dimnosw dimnosw http://abegn >> wx.biz/?abegnwx abegnwx >> http://abejnsx.net/?abejnsx abejnsx >> https://emnorxz.ru/?emnorxz emnorxz http://abfsz.ua/?abfsz >> abfsz https://imqrvy.net/?imqrvy imqrvy >> https://cdgjmqrv.com/?cdgjmqrv cdgjmqrv >> http://bdfostyz.net/?bdfostyz bdfostyz >> https://lnorxz.biz/?lnorxz lnorxz http://afgruvw.ru/?afgruvw >> afgruvw https://cefivyz.com/?cefivyz cefivyz >> https://gmqrt.com/?gmqrt gmqrt >> https://acdimqrw.info/?acdimqrw acdimqrw >> http://abchjm.ru/?abchjm abchjm http://alnqs.biz/?alnqs alnqs >> https://ijqwz.ua/?ijqwz ijqwz https://adfjstx.biz/?adfjstx >> adfjstx https://cghlqt.net/?cghlqt cghlqt >> http://ijrvyz.ua/?ijrvyz ijrvyz https://ahluv.net/?ahluv >> ahluv http://acdfgn.net/?acdfgn acdfgn >> https://hijnru.ua/?hijnru hijnru https://ghnuvy.ru/?ghnuvy >> ghnuvy http://aeiky.com/?aeiky aeiky >> https://bdilnov.info/?bdilnov bdilnov >> https://cjlnpqru.biz/?cjlnpqru cjlnpqru >> https://cgjqtv.biz/?cgjqtv cgjqtv >> https://abcnsvz.biz/?abcnsvz abcnsvz >> http://abegopsu.biz/?abegopsu abegopsu >> https://bcmstuvy.net/?bcmstuvy >> ? 
bcmstuvy https://bhjknsuz.net/?bhjknsuz bhjknsuz >> http://aceklno.info/?aceklno aceklno >> http://diotvyz.ru/?diotvyz diotvyz >> https://aefkmruw.info/?aefkmruw aefkmruw >> https://cdghlmnz.info/?cdghlmnz cdghlmnz >> https://bcgijmow.net/?bcgijmow bcgijmow >> http://moruwx.ua/?moruwx moruwx https://bhjnrt.ru/?bhjnrt >> bhjnrt https://eimoptu.ua/?eimoptu eimoptu >> https://achptv.info/?achptv achptv https://boptux.biz/?boptux >> boptux http://acmovz.ua/?acmovz acmovz >> https://abfmnpw.info/?abfmnpw abfmnpw >> https://afiknvz.biz/?afiknvz afiknvz http://ehnps.ua/?ehnps >> ehnps https://dnpuw.biz/?dnpuw dnpuw >> https://abflmns.biz/?abflmns abflmns https://fmuyz.biz/?fmuyz >> fmuyz http://bejkvz.info/?bejkvz bejkvz >> https://dfkmnyz.ua/?dfkmnyz dfkmnyz >> http://cfgnsvx.biz/?cfgnsvx cfgnsvx http://abdjsw.ru/?abdjsw >> abdjsw https://abelrvx.biz/?abelrvx abelrvx >> https://abjmoy.ru/?abjmoy abjmoy http://hmnqz.net/?hmnqz >> hmnqz https://dhkntv.ru/?dhkntv dhkntv >> http://bgqsyz.biz/?bgqsyz bgqsyz >> http://eghkmsux.net/?eghkmsux eghkms >> ?ux https://acehlrsv.info/?acehlrsv acehlrsv >> https://lnpruvy.com/?lnpruvy lnpruvy http://cefmtu.ua/?cefmtu >> cefmtu https://hjnqrt.net/?hjnqrt hjnqrt >> http://ceiktvw.ru/?ceiktvw ceiktvw http://cfjmu.info/?cfjmu >> cfjmu https://cfhmn.com/?cfhmn cfhmn http://cdfqr.net/?cdfqr >> cdfqr http://adelmvz.ru/?adelmvz adelmvz >> http://abflnsvx.com/?abflnsvx abflnsvx >> http://bjpqv.biz/?bjpqv bjpqv https://fjpqux.biz/?fjpqux >> fjpqux http://fpqtvyz.net/?fpqtvyz fpqtvyz >> http://cdflstwz.net/?cdflstwz cdflstwz >> https://egjostux.ua/?egjostux egjostux >> https://cjlnpy.ua/?cjlnpy cjlnpy https://bcfgmz.com/?bcfgmz >> bcfgmz http://dfnouz.ua/?dfnouz dfnouz >> http://cfikqv.net/?cfikqv cfikqv https://dehkqw.net/?dehkqw >> dehkqw http://fhkqs.com/?fhkqs fhkqs http://dijkl.com/?dijkl >> dijkl http://aeglu.com/?aeglu aeglu >> http://dgikmpy.info/?dgikmpy dgikmpy >> https://brsuz.info/?brsuz brsuz http://acfglopu.ru/?acfglopu >> acfglopu https://cdjqx.net/?cdjqx cdjqx >> https://aefimst.biz/?aefimst aefimst >> http://bfikmuw.net/?bfikmuw bfik >> ?muw http://abhioqvw.ua/?abhioqvw abhioqvw >> http://abdilmo.net/?abdilmo abdilmo >> https://bhlovx.info/?bhlovx bhlovx >> https://fgvwxz.info/?fgvwxz fgvwxz >> http://afhjpqsv.biz/?afhjpqsv afhjpqsv >> http://deilrsv.com/?deilrsv deilrsv >> http://cijlmnt.ru/?cijlmnt cijlmnt https://bfkmtw.ua/?bfkmtw >> bfkmtw https://adkxy.ru/?adkxy adkxy >> https://blpqxy.com/?blpqxy blpqxy >> https://efgilor.net/?efgilor efgilor >> https://abcdky.info/?abcdky abcdky >> https://befgikqu.ru/?befgikqu befgikqu >> https://ipsuvwy.com/?ipsuvwy ipsuvwy https://fjlsu.biz/?fjlsu >> fjlsu https://kpquw.com/?kpquw kpquw >> https://bfjkoqvy.net/?bfjkoqvy bfjkoqvy >> http://dijopsz.biz/?dijopsz dijopsz >> http://acfhrsz.ua/?acfhrsz acfhrsz >> https://bfhrstvy.ua/?bfhrstvy bfhrstvy >> http://afglw.info/?afglw afglw https://bcjqrvw.net/?bcjqrvw >> bcjqrvw https://hiotz.ru/?hiotz hiotz >> https://abefmswx.biz/?abefmswx abefmswx >> http://acgkopyz.ua/?acgkopyz acgkopyz >> https://fijln.com/?fijln fijln http://befmquwz.info/?befmquwz >> befmquwz https://hijory.ua/?hijory h >> ?ijory http://cdfhr.info/?cdfhr cdfhr >> https://ahiksx.ua/?ahiksx ahiksx https://aghmxy.ru/?aghmxy >> aghmxy https://hnopqt.net/?hnopqt hnopqt >> https://fiklnrsw.ru/?fiklnrsw fiklnrsw http://hknpv.ru/?hknpv >> hknpv https://abdfmotw.net/?abdfmotw abdfmotw >> http://bcfhpvx.biz/?bcfhpvx bcfhpvx >> http://beimrty.net/?beimrty beimrty >> http://dgnsvwy.ua/?dgnsvwy dgnsvwy >> 
https://aghjstvw.com/?aghjstvw aghjstvw >> http://gpqsuxy.ru/?gpqsuxy gpqsuxy https://finop.info/?finop >> finop http://bcdhsy.biz/?bcdhsy bcdhsy >> http://cfhmnvy.net/?cfhmnvy cfhmnvy >> https://hjopuw.net/?hjopuw hjopuw https://akmou.biz/?akmou >> akmou https://fgkntuw.biz/?fgkntuw fgkntuw >> https://adfjsw.com/?adfjsw adfjsw >> https://bdjmrsty.biz/?bdjmrsty bdjmrsty >> http://grtuz.ru/?grtuz grtuz https://cgiqw.info/?cgiqw cgiqw >> http://acenpvw.ru/?acenpvw acenpvw https://dmnrz.ru/?dmnrz >> dmnrz http://deglqtu.com/?deglqtu deglqtu >> http://ilpqrtuw.com/?ilpqrtuw ilpqrtuw >> https://gnoqvxz.com/?gnoqvxz gnoqvxz >> https://hijkmsu.ru/?hijkmsu hijkmsu http://cist >> xyz.biz/?cistxyz cistxyz >> https://adklmns.ru/?adklmns adklmns https://hmqrwy.ua/?hmqrwy >> hmqrwy https://crsuv.ru/?crsuv crsuv >> http://abejkmoz.ua/?abejkmoz abejkmoz >> http://abjpqu.ua/?abjpqu abjpqu >> https://ehjkopuz.info/?ehjkopuz ehjkopuz >> http://iovwz.ua/?iovwz iovwz https://emntvwx.net/?emntvwx >> emntvwx http://eghksx.info/?eghksx eghksx >> https://bmnps.com/?bmnps bmnps https://cinopuv.net/?cinopuv >> cinopuv http://efijnptz.biz/?efijnptz efijnptz >> http://cehilnp.net/?cehilnp cehilnp http://denprw.ru/?denprw >> denprw http://adglpwz.net/?adglpwz adglpwz >> http://dehimosu.biz/?dehimosu dehimosu >> https://abchprwy.com/?abchprwy abchprwy >> http://dlnory.info/?dlnory dlnory http://cehnwyz.com/?cehnwyz >> cehnwyz http://aghlopy.ua/?aghlopy aghlopy >> http://fhjksuvz.com/?fhjksuvz fhjksuvz >> http://bijmy.com/?bijmy bijmy https://adhjw.com/?adhjw adhjw >> https://emnprwyz.info/?emnprwyz emnprwyz >> http://dgmnw.ru/?dgmnw dgmnw https://cdgimruv.com/?cdgimruv >> cdgimruv http://bcdjqst.biz/?bcdjqst bcdjqst https://lmquv >> ?.ru/?lmquv lmquv http://degiptv.info/?degiptv degiptv >> http://hlnvy.ru/?hlnvy hlnvy http://gklmvx.info/?gklmvx >> gklmvx http://behlpstz.com/?behlpstz behlpstz >> https://efginz.com/?efginz efginz >> https://deilmnru.biz/?deilmnru deilmnru >> http://bdehoqux.ru/?bdehoqux bdehoqux >> http://doqvy.info/?doqvy doqvy http://jlorw.info/?jlorw jlorw >> https://cfmtvy.net/?cfmtvy cfmtvy >> http://agkmqvxy.biz/?agkmqvxy agkmqvxy >> https://ehnrux.ru/?ehnrux ehnrux https://ahlmpyz.biz/?ahlmpyz >> ahlmpyz https://dhilqxyz.com/?dhilqxyz dhilqxyz >> http://beosz.biz/?beosz beosz http://jkmtvx.ru/?jkmtvx jkmtvx >> http://aefiuw.ua/?aefiuw aefiuw http://bgimy.info/?bgimy >> bgimy https://acdfhimy.info/?acdfhimy acdfhimy >> http://bdfpv.info/?bdfpv bdfpv https://lnopux.info/?lnopux >> lnopux https://iknqstux.ua/?iknqstux iknqstux >> https://binxy.ru/?binxy binxy http://abflsv.net/?abflsv >> abflsv http://cghkmtvz.info/?cghkmtvz cghkmtvz >> http://flpqrxz.biz/?flpqrxz flpqrxz https://cfgnvz.ua/?cfgnvz >> cfgnvz https://jtuvx.com/?jtuvx jtuvx ht >> ?tps://fgikl.ru/?fgikl fgikl >> http://ehikrs.ru/?ehikrs ehikrs http://beghqx.ru/?beghqx >> beghqx http://dhijlrsw.com/?dhijlrsw dhijlrsw >> http://elqst.ru/?elqst elqst https://abfklmns.biz/?abfklmns >> abfklmns https://cilnorw.ua/?cilnorw cilnorw >> https://anoqu.ua/?anoqu anoqu http://adikquz.com/?adikquz >> adikquz http://abcpwyz.ua/?abcpwyz abcpwyz >> https://fiqwxz.ru/?fiqwxz fiqwxz http://bcivw.ru/?bcivw bcivw >> https://acdfnrz.ua/?acdfnrz acdfnrz >> https://jmopqtw.info/?jmopqtw jmopqtw >> https://abdeghtv.biz/?abdeghtv abdeghtv >> http://aceoprw.net/?aceoprw aceoprw http://forvy.ua/?forvy >> forvy https://abgjknpv.com/?abgjknpv abgjknpv >> http://bcdgimr.info/?bcdgimr bcdgimr >> https://flntvw.biz/?flntvw flntvw http://bcdnqvx.net/?bcdnqvx >> bcdnqvx 
>> ------------------------------
>>
>> Subject: Digest Footer
>>
>> _______________________________________________
>> petsc-users mailing list
>> petsc-users at mcs.anl.gov
>> https://lists.mcs.anl.gov/mailman/listinfo/petsc-users
>>
>> ------------------------------
>>
>> End of petsc-users Digest, Vol 119, Issue 89
>> ********************************************
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
-------------- next part --------------
Build unstructured mesh, done.
vector view - petsc order
Vec Object: 4 MPI processes
  type: mpi
Process [0]
1. 2. 3. 4. 5. 6. 7. 8. 9. 10. 11. 12.
13. 14. 15. 16. 17. 18. 19. 20. 21. 22. 23. 24.
25. 26. 27. 28. 29. 30. 31. 32. 33. 34. 35. 36.
37. 38. 39. 40. 41. 42. 43. 44. 45. 46. 47. 48.
Process [1]
49. 50. 51. 52. 53. 54. 55. 56. 57. 58. 59. 60.
61. 62. 63. 64. 65. 66. 67. 68. 69. 70. 71. 72.
73. 74. 75. 76. 77. 78. 79. 80. 81. 82. 83. 84.
85. 86. 87. 88. 89. 90. 91. 92. 93. 94. 95. 96.
97. 98. 99. 100. 101. 102. 103. 104.
Process [2]
105. 106. 107. 108. 109. 110. 111. 112. 113. 114. 115. 116.
117. 118. 119. 120. 121. 122. 123. 124. 125. 126. 127. 128.
129. 130. 131. 132. 133. 134. 135. 136. 137. 138. 139. 140.
141. 142. 143. 144. 145. 146. 147. 148. 149. 150. 151. 152.
153. 154. 155. 156. 157. 158. 159.
Process [3]
160. 161. 162. 163. 164. 165. 166. 167. 168. 169. 170. 171.
172. 173. 174. 175. 176. 177. 178. 179. 180. 181. 182. 183.
184. 185. 186. 187. 188. 189. 190. 191. 192. 193. 194. 195.
196. 197. 198.
199. 200. 201. 202. 203. 204. 205. 206. 207. 208. 209. 210. 211. 212. 213. 214. 215. 216. 217. 218. 219. 220. 221. 222. 223. check migration SF PetscSF Object: 4 MPI processes type: basic sort=rank-order [0] Number of roots=617, leaves=208, remote ranks=1 [0] 0 <- (0,0) [0] 1 <- (0,4) [0] 2 <- (0,12) [0] 3 <- (0,14) [0] 4 <- (0,18) [0] 5 <- (0,21) [0] 6 <- (0,31) [0] 7 <- (0,33) [0] 8 <- (0,34) [0] 9 <- (0,37) [0] 10 <- (0,39) [0] 11 <- (0,41) [0] 12 <- (0,61) [0] 13 <- (0,74) [0] 14 <- (0,81) [0] 15 <- (0,83) [0] 16 <- (0,85) [0] 17 <- (0,86) [0] 18 <- (0,87) [0] 19 <- (0,93) [0] 20 <- (0,94) [0] 21 <- (0,95) [0] 22 <- (0,96) [0] 23 <- (0,98) [0] 24 <- (0,100) [0] 25 <- (0,103) [0] 26 <- (0,108) [0] 27 <- (0,111) [0] 28 <- (0,113) [0] 29 <- (0,118) [0] 30 <- (0,151) [0] 31 <- (0,152) [0] 32 <- (0,154) [0] 33 <- (0,160) [0] 34 <- (0,161) [0] 35 <- (0,166) [0] 36 <- (0,176) [0] 37 <- (0,178) [0] 38 <- (0,204) [0] 39 <- (0,207) [0] 40 <- (0,213) [0] 41 <- (0,216) [0] 42 <- (0,220) [0] 43 <- (0,223) [0] 44 <- (0,224) [0] 45 <- (0,225) [0] 46 <- (0,227) [0] 47 <- (0,231) [0] 48 <- (0,233) [0] 49 <- (0,240) [0] 50 <- (0,241) [0] 51 <- (0,250) [0] 52 <- (0,267) [0] 53 <- (0,269) [0] 54 <- (0,273) [0] 55 <- (0,275) [0] 56 <- (0,280) [0] 57 <- (0,281) [0] 58 <- (0,291) [0] 59 <- (0,296) [0] 60 <- (0,298) [0] 61 <- (0,303) [0] 62 <- (0,308) [0] 63 <- (0,313) [0] 64 <- (0,318) [0] 65 <- (0,323) [0] 66 <- (0,324) [0] 67 <- (0,338) [0] 68 <- (0,339) [0] 69 <- (0,345) [0] 70 <- (0,346) [0] 71 <- (0,347) [0] 72 <- (0,350) [0] 73 <- (0,351) [0] 74 <- (0,353) [0] 75 <- (0,354) [0] 76 <- (0,355) [0] 77 <- (0,356) [0] 78 <- (0,360) [0] 79 <- (0,362) [0] 80 <- (0,364) [0] 81 <- (0,365) [0] 82 <- (0,366) [0] 83 <- (0,368) [0] 84 <- (0,369) [0] 85 <- (0,370) [0] 86 <- (0,373) [0] 87 <- (0,375) [0] 88 <- (0,377) [0] 89 <- (0,378) [0] 90 <- (0,379) [0] 91 <- (0,380) [0] 92 <- (0,381) [0] 93 <- (0,384) [0] 94 <- (0,385) [0] 95 <- (0,386) [0] 96 <- (0,387) [0] 97 <- (0,388) [0] 98 <- (0,390) [0] 99 <- (0,391) [0] 100 <- (0,11) [0] 101 <- (0,27) [0] 102 <- (0,48) [0] 103 <- (0,56) [0] 104 <- (0,70) [0] 105 <- (0,104) [0] 106 <- (0,124) [0] 107 <- (0,125) [0] 108 <- (0,128) [0] 109 <- (0,130) [0] 110 <- (0,260) [0] 111 <- (0,278) [0] 112 <- (0,292) [0] 113 <- (0,315) [0] 114 <- (0,316) [0] 115 <- (0,321) [0] 116 <- (0,340) [0] 117 <- (0,341) [0] 118 <- (0,17) [0] 119 <- (0,22) [0] 120 <- (0,50) [0] 121 <- (0,66) [0] 122 <- (0,69) [0] 123 <- (0,150) [0] 124 <- (0,157) [0] 125 <- (0,169) [0] 126 <- (0,257) [0] 127 <- (0,270) [0] 128 <- (0,322) [0] 129 <- (0,397) [0] 130 <- (0,405) [0] 131 <- (0,420) [0] 132 <- (0,421) [0] 133 <- (0,422) [0] 134 <- (0,423) [0] 135 <- (0,437) [0] 136 <- (0,438) [0] 137 <- (0,439) [0] 138 <- (0,440) [0] 139 <- (0,441) [0] 140 <- (0,442) [0] 141 <- (0,460) [0] 142 <- (0,461) [0] 143 <- (0,469) [0] 144 <- (0,471) [0] 145 <- (0,477) [0] 146 <- (0,485) [0] 147 <- (0,487) [0] 148 <- (0,492) [0] 149 <- (0,493) [0] 150 <- (0,494) [0] 151 <- (0,498) [0] 152 <- (0,506) [0] 153 <- (0,514) [0] 154 <- (0,522) [0] 155 <- (0,525) [0] 156 <- (0,532) [0] 157 <- (0,533) [0] 158 <- (0,538) [0] 159 <- (0,542) [0] 160 <- (0,545) [0] 161 <- (0,551) [0] 162 <- (0,564) [0] 163 <- (0,571) [0] 164 <- (0,573) [0] 165 <- (0,584) [0] 166 <- (0,585) [0] 167 <- (0,591) [0] 168 <- (0,593) [0] 169 <- (0,594) [0] 170 <- (0,600) [0] 171 <- (0,603) [0] 172 <- (0,605) [0] 173 <- (0,606) [0] 174 <- (0,607) [0] 175 <- (0,613) [0] 176 <- (0,616) [0] 177 <- (0,398) [0] 178 <- (0,399) [0] 179 <- (0,400) [0] 180 <- (0,401) [0] 
181 <- (0,404) [0] 182 <- (0,424) [0] 183 <- (0,425) [0] 184 <- (0,455) [0] 185 <- (0,456) [0] 186 <- (0,457) [0] 187 <- (0,501) [0] 188 <- (0,512) [0] 189 <- (0,520) [0] 190 <- (0,544) [0] 191 <- (0,549) [0] 192 <- (0,576) [0] 193 <- (0,596) [0] 194 <- (0,612) [0] 195 <- (0,435) [0] 196 <- (0,436) [0] 197 <- (0,459) [0] 198 <- (0,470) [0] 199 <- (0,473) [0] 200 <- (0,495) [0] 201 <- (0,521) [0] 202 <- (0,531) [0] 203 <- (0,536) [0] 204 <- (0,539) [0] 205 <- (0,550) [0] 206 <- (0,566) [0] 207 <- (0,604) [1] Number of roots=0, leaves=217, remote ranks=1 [1] 0 <- (0,1) [1] 1 <- (0,5) [1] 2 <- (0,6) [1] 3 <- (0,11) [1] 4 <- (0,19) [1] 5 <- (0,20) [1] 6 <- (0,25) [1] 7 <- (0,26) [1] 8 <- (0,27) [1] 9 <- (0,28) [1] 10 <- (0,29) [1] 11 <- (0,32) [1] 12 <- (0,40) [1] 13 <- (0,42) [1] 14 <- (0,43) [1] 15 <- (0,44) [1] 16 <- (0,47) [1] 17 <- (0,48) [1] 18 <- (0,56) [1] 19 <- (0,57) [1] 20 <- (0,59) [1] 21 <- (0,60) [1] 22 <- (0,62) [1] 23 <- (0,63) [1] 24 <- (0,70) [1] 25 <- (0,71) [1] 26 <- (0,75) [1] 27 <- (0,76) [1] 28 <- (0,78) [1] 29 <- (0,88) [1] 30 <- (0,90) [1] 31 <- (0,104) [1] 32 <- (0,121) [1] 33 <- (0,124) [1] 34 <- (0,125) [1] 35 <- (0,128) [1] 36 <- (0,129) [1] 37 <- (0,130) [1] 38 <- (0,132) [1] 39 <- (0,133) [1] 40 <- (0,136) [1] 41 <- (0,142) [1] 42 <- (0,145) [1] 43 <- (0,147) [1] 44 <- (0,163) [1] 45 <- (0,165) [1] 46 <- (0,180) [1] 47 <- (0,181) [1] 48 <- (0,186) [1] 49 <- (0,187) [1] 50 <- (0,190) [1] 51 <- (0,198) [1] 52 <- (0,208) [1] 53 <- (0,210) [1] 54 <- (0,211) [1] 55 <- (0,212) [1] 56 <- (0,218) [1] 57 <- (0,228) [1] 58 <- (0,243) [1] 59 <- (0,244) [1] 60 <- (0,245) [1] 61 <- (0,248) [1] 62 <- (0,249) [1] 63 <- (0,253) [1] 64 <- (0,256) [1] 65 <- (0,258) [1] 66 <- (0,260) [1] 67 <- (0,261) [1] 68 <- (0,274) [1] 69 <- (0,278) [1] 70 <- (0,282) [1] 71 <- (0,283) [1] 72 <- (0,284) [1] 73 <- (0,285) [1] 74 <- (0,287) [1] 75 <- (0,292) [1] 76 <- (0,295) [1] 77 <- (0,299) [1] 78 <- (0,309) [1] 79 <- (0,315) [1] 80 <- (0,316) [1] 81 <- (0,321) [1] 82 <- (0,327) [1] 83 <- (0,331) [1] 84 <- (0,332) [1] 85 <- (0,335) [1] 86 <- (0,336) [1] 87 <- (0,337) [1] 88 <- (0,340) [1] 89 <- (0,341) [1] 90 <- (0,342) [1] 91 <- (0,343) [1] 92 <- (0,344) [1] 93 <- (0,359) [1] 94 <- (0,361) [1] 95 <- (0,382) [1] 96 <- (0,392) [1] 97 <- (0,93) [1] 98 <- (0,154) [1] 99 <- (0,178) [1] 100 <- (0,204) [1] 101 <- (0,281) [1] 102 <- (0,291) [1] 103 <- (0,296) [1] 104 <- (0,298) [1] 105 <- (0,303) [1] 106 <- (0,308) [1] 107 <- (0,318) [1] 108 <- (0,339) [1] 109 <- (0,350) [1] 110 <- (0,351) [1] 111 <- (0,353) [1] 112 <- (0,355) [1] 113 <- (0,370) [1] 114 <- (0,379) [1] 115 <- (0,380) [1] 116 <- (0,388) [1] 117 <- (0,49) [1] 118 <- (0,50) [1] 119 <- (0,169) [1] 120 <- (0,246) [1] 121 <- (0,247) [1] 122 <- (0,9) [1] 123 <- (0,30) [1] 124 <- (0,135) [1] 125 <- (0,148) [1] 126 <- (0,184) [1] 127 <- (0,189) [1] 128 <- (0,229) [1] 129 <- (0,277) [1] 130 <- (0,320) [1] 131 <- (0,352) [1] 132 <- (0,393) [1] 133 <- (0,394) [1] 134 <- (0,398) [1] 135 <- (0,399) [1] 136 <- (0,400) [1] 137 <- (0,401) [1] 138 <- (0,403) [1] 139 <- (0,404) [1] 140 <- (0,406) [1] 141 <- (0,407) [1] 142 <- (0,408) [1] 143 <- (0,409) [1] 144 <- (0,410) [1] 145 <- (0,411) [1] 146 <- (0,424) [1] 147 <- (0,425) [1] 148 <- (0,426) [1] 149 <- (0,427) [1] 150 <- (0,428) [1] 151 <- (0,455) [1] 152 <- (0,456) [1] 153 <- (0,457) [1] 154 <- (0,464) [1] 155 <- (0,466) [1] 156 <- (0,475) [1] 157 <- (0,484) [1] 158 <- (0,490) [1] 159 <- (0,496) [1] 160 <- (0,497) [1] 161 <- (0,501) [1] 162 <- (0,509) [1] 163 <- (0,510) [1] 164 <- (0,512) [1] 165 <- 
From huq2090 at gmail.com  Thu Nov 29 13:37:41 2018
From: huq2090 at gmail.com (Fazlul Huq)
Date: Thu, 29 Nov 2018 13:37:41 -0600
Subject: [petsc-users] Example 23 of ksp problems
In-Reply-To: <3EB7510E-0E3A-4413-8E36-C23855E140B1@anl.gov>
References: <3EB7510E-0E3A-4413-8E36-C23855E140B1@anl.gov>
Message-ID:

Thanks!

Sincerely,
Huq

On Tue, Nov 27, 2018 at 10:57 AM Smith, Barry F. wrote:

>    This is an ad hoc way of checking that the error from the linear solve
> is "reasonable". Note that PETSC_MACHINE_EPSILON depends on the precision
> of the floating point used, so for more precise floating point (in going
> from half precision to quad precision) we expect the error to be smaller
> (depending on the machine epsilon). The 1000 is an arbitrary number; it
> could be 100 or 10,000, and anything in between could be reasonable. Note
> that using 1.0 is probably not reasonable, because it is not reasonable
> to expect the error to be only a machine epsilon away from the "exact"
> answer.
>
>    Barry
>
>    If you are not familiar with the term machine epsilon you can google it.
>
> > On Nov 26, 2018, at 9:30 PM, Fazlul Huq via petsc-users <
> petsc-users at mcs.anl.gov> wrote:
> >
> > Hello PETSc developers,
> >
> > I went through the ex23.c of the ksp section, attached herewith, but I
> don't understand the following part:
> > ***********************************************
> > tol=1000.*PETSC_MACHINE_EPSILON
> > ************************************************
> > and,
> > ************************************************
> > ierr = VecAXPY(x,-1.0,u);CHKERRQ(ierr);
> > ierr = VecNorm(x,NORM_2,&norm);CHKERRQ(ierr);
> > ierr = KSPGetIterationNumber(ksp,&its);CHKERRQ(ierr);
> > if (norm > tol) {
> > ierr = PetscPrintf(PETSC_COMM_WORLD,"Norm of error %g, Iterations
> %D\n", (double)norm,its);CHKERRQ(ierr);
> > }
> > ************************************************
> > I don't understand what "tol" is here, or what "PETSC_MACHINE_EPSILON"
> is. The if condition is also not clear to me.
> >
> > Thanks.
> > Sincerely,
> > Huq
> > --
> > Fazlul Huq
> > Graduate Research Assistant
> > Department of Nuclear, Plasma & Radiological Engineering (NPRE)
> > University of Illinois at Urbana-Champaign (UIUC)
> > E-mail: huq2090 at gmail.com

--
Fazlul Huq
Graduate Research Assistant
Department of Nuclear, Plasma & Radiological Engineering (NPRE)
University of Illinois at Urbana-Champaign (UIUC)
E-mail: huq2090 at gmail.com
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
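As a concrete illustration of the check discussed above (a standalone
sketch, not part of ex23.c): PETSC_MACHINE_EPSILON is about 2.22e-16 in
double precision (about 1.19e-7 in single), so tol = 1000.*PETSC_MACHINE_EPSILON
accepts any error norm below roughly 2.2e-13 in the default double-precision
build.

#include <petsc.h>
int main(int argc, char **argv)
{
  PetscReal      tol;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  /* 1000 is just an arbitrary safety factor on the machine epsilon */
  tol  = 1000. * PETSC_MACHINE_EPSILON;
  ierr = PetscPrintf(PETSC_COMM_WORLD, "machine epsilon %g  tol %g\n",
                     (double)PETSC_MACHINE_EPSILON, (double)tol);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}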
From huq2090 at gmail.com  Thu Nov 29 13:46:19 2018
From: huq2090 at gmail.com (Fazlul Huq)
Date: Thu, 29 Nov 2018 13:46:19 -0600
Subject: [petsc-users] Problem with large grid size
Message-ID:

Hello PETSc Developers,

I am trying to run the code (attached herewith) with the following command,
and it works until the size of the matrix is 99999x99999. But when I try to
run with 999999x999999 I get a weird result.

The command is:
./poisson_m -n 999999 -pc_type hypre -pc_hypre_type boomeramg -ksp_view_solution

Any suggestion is appreciated.

Thanks.
Sincerely,
Huq

--
Fazlul Huq
Graduate Research Assistant
Department of Nuclear, Plasma & Radiological Engineering (NPRE)
University of Illinois at Urbana-Champaign (UIUC)
E-mail: huq2090 at gmail.com
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
-------------- next part --------------
A non-text attachment was scrubbed...
Name: poisson_m.c
Type: text/x-csrc
Size: 9820 bytes
Desc: not available
URL:

From bsmith at mcs.anl.gov  Thu Nov 29 13:51:30 2018
From: bsmith at mcs.anl.gov (Smith, Barry F.)
Date: Thu, 29 Nov 2018 19:51:30 +0000
Subject: [petsc-users] Problem with large grid size
In-Reply-To:
References:
Message-ID:

> On Nov 29, 2018, at 1:46 PM, Fazlul Huq via petsc-users wrote:
>
> Hello PETSc Developers,
>
> I am trying to run the code (attached herewith) with the following
> command and it works until the size of the matrix is 99999x99999. But
> when I try to run with 999999x999999 I get a weird result.

   What is that "weird result"?

   My guess is that for problems that large you need to ./configure PETSc
with the additional option --with-64-bit-indices, since for such large
problems 32-bit integers are not large enough to contain the values needed
for storing and accessing the sparse matrix.

   Barry

>
> The command is:
> ./poisson_m -n 999999 -pc_type hypre -pc_hypre_type boomeramg -ksp_view_solution
>
> Any suggestion is appreciated.
>
> Thanks.
> Sincerely,
> Huq
>
> --
> Fazlul Huq
> Graduate Research Assistant
> Department of Nuclear, Plasma & Radiological Engineering (NPRE)
> University of Illinois at Urbana-Champaign (UIUC)
> E-mail: huq2090 at gmail.com
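To make the point about index sizes concrete: a 32-bit signed PetscInt tops
out at 2^31 - 1, roughly 2.1e9, while anything indexed over the full
999999 x 999999 problem (about 1e12 entries if counted globally) is far past
that. Whether an overflow actually occurs depends on what poisson_m.c
allocates, which is not shown here; the sketch below just prints the sizes
involved.

#include <petsc.h>
int main(int argc, char **argv)
{
  PetscInt       n = 999999;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  /* With the default 32-bit PetscInt, PETSC_MAX_INT is ~2.1e9, so any
     index over an n*n-sized object overflows unless PETSc was configured
     --with-64-bit-indices. Compute n*n in double to avoid the overflow. */
  ierr = PetscPrintf(PETSC_COMM_WORLD,
                     "sizeof(PetscInt) = %d bytes, PETSC_MAX_INT = %D\n",
                     (int)sizeof(PetscInt), PETSC_MAX_INT);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD, "n = %D, n*n would need %g entries\n",
                     n, (double)n*(double)n);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}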
From sajidsyed2021 at u.northwestern.edu  Thu Nov 29 16:50:34 2018
From: sajidsyed2021 at u.northwestern.edu (Sajid Ali)
Date: Thu, 29 Nov 2018 16:50:34 -0600
Subject: [petsc-users] Some clarifications about TS ex3.c
In-Reply-To: <37F3B68A-21BC-4CED-9394-4CD6FC8D4844@anl.gov>
References: <37F3B68A-21BC-4CED-9394-4CD6FC8D4844@anl.gov>
Message-ID:

Another question I want to ask is about dttol, which is used in the Monitor
function as follows:

417:   dttol = .0001;
418:   PetscOptionsGetReal(NULL,NULL,"-dttol",&dttol,NULL);
419:   if (dt < dttol) {
420:     dt *= .999;
421:     TSSetTimeStep(ts,dt);
422:   }

Is this decreasing dt if it's less than the tolerance?

When I comment out the norm calculation (lines 404-415), the time is stuck
at 0 (as per the output of -ts_monitor). But if I comment out the above
lines (417-422) as well, everything works fine. What explains this
behaviour?

Thank You!
Sajid Ali
Applied Physics
Northwestern University
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From bsmith at mcs.anl.gov  Thu Nov 29 17:41:53 2018
From: bsmith at mcs.anl.gov (Smith, Barry F.)
Date: Thu, 29 Nov 2018 23:41:53 +0000
Subject: [petsc-users] Some clarifications about TS ex3.c
In-Reply-To:
References: <37F3B68A-21BC-4CED-9394-4CD6FC8D4844@anl.gov>
Message-ID:

> On Nov 29, 2018, at 4:50 PM, Sajid Ali wrote:
>
> Another question I want to ask is about dttol, which is used in the
> Monitor function as follows:
>
> 417:   dttol = .0001;
> 418:   PetscOptionsGetReal(NULL,NULL,"-dttol",&dttol,NULL);
> 419:   if (dt < dttol) {
> 420:     dt *= .999;
> 421:     TSSetTimeStep(ts,dt);
> 422:   }
>
> Is this decreasing dt if it's less than the tolerance?

   Yes. Note this is arbitrary; there would never be a reason to change dt
by .999. It is just there to test that TSSetTimeStep() works.

> When I comment out the norm calculation (lines 404-415), the time is
> stuck at 0 (as per the output of -ts_monitor).

   This is because dt is never set when you comment out those lines, and
since it is never set it is likely zero, hence less than dttol, hence
TSSetTimeStep() is called with a value of 0.0, which is why it gets "stuck"
at 0.0.

> But if I comment out the above lines (417-422) as well, everything works
> fine. What explains this behaviour?
>
> Thank You!
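A minimal sketch of the failure mode described above -- a hypothetical
monitor in the style of ex3.c, not the example's actual code. If dt was
never given a nonzero value, the dt < dttol branch keeps resetting the step
to 0.999*0 = 0 and time never advances; guarding the branch avoids that.

#include <petscts.h>

/* Hypothetical monitor: shrink the step only when it is positive but below
   dttol. Without the dt > 0.0 guard, a dt that was never set (hence 0.0)
   is "reduced" to 0.999*0 = 0 and the integration never moves forward. */
static PetscErrorCode Monitor(TS ts, PetscInt step, PetscReal time, Vec u, void *ctx)
{
  PetscReal      dt, dttol = .0001;
  PetscErrorCode ierr;

  ierr = TSGetTimeStep(ts, &dt);CHKERRQ(ierr);
  ierr = PetscOptionsGetReal(NULL, NULL, "-dttol", &dttol, NULL);CHKERRQ(ierr);
  if (dt > 0.0 && dt < dttol) {
    dt  *= .999;
    ierr = TSSetTimeStep(ts, dt);CHKERRQ(ierr);
  }
  return 0;
}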
> Sajid Ali
> Applied Physics
> Northwestern University

From danyang.su at gmail.com  Thu Nov 29 18:39:23 2018
From: danyang.su at gmail.com (Danyang Su)
Date: Thu, 29 Nov 2018 16:39:23 -0800
Subject: [petsc-users] Fwd: DMPlex global to natural problem using
 DmPlexGetVertexNumbering or DMPlexGlobalToNatural
In-Reply-To: <16c44759-2ab2-e9ed-2c03-98f39738caa2@gmail.com>
References: <16c44759-2ab2-e9ed-2c03-98f39738caa2@gmail.com>
Message-ID: <9dda31a1-8ec6-f571-1ceb-ed4a23426bac@gmail.com>

Dear PETSc developers & users,

Sorry to bother you again. I have just encountered some difficulties with
DMPlex global-to-natural ordering. This is not a strictly necessary
function in my code, but it is best to have it in case someone wants to
feed the code with initial conditions or parameters from an external file
using natural ordering.

First I tried to use DMPlexGetVertexNumbering, which seems pretty
straightforward, but I always get an error saying "You need a ISO C
conforming compiler to use the glibc headers". I use Gfortran on Linux.

Then I switched to using a Label to save the global-natural order before
distributing the mesh, and everything works fine. The only problem is that
it takes a very long time during mesh distribution when the mesh is very
big.

Finally I tried to use GlobalToNatural ordering: set natural order to TRUE
before calling DMPlexDistribute and then call DMPlexSetMigrationSF.
However, the function DMPlexGlobalToNaturalEnd seems to do nothing, as the
returned vector is always unchanged. I got some help from Josh, who has
done similar work before, but I still cannot figure out what is wrong in
the code.

Attached is the code section with the related functions included, together
with a mesh and the output of the global vector and natural vector. If
something is wrong in the code, it should be in these four functions. The
related functions are called in the following order:

call solver_dd_create_dmplex
call solver_dd_mapping_set_dmplex
call solver_dd_DMDACreate_flow
!call solver_dd_DMDACreate_react    ! Not used in current testing
call solver_dd_mapping_global_natural

I really appreciate your help.

Regards,

Danyang
-------------- next part --------------
A non-text attachment was scrubbed...
Name: solver_ddmethod.F90
Type: text/x-fortran
Size: 69352 bytes
Desc: not available
URL:
-------------- next part --------------
Build local unstructured mesh, please wait ...
Build unstructured mesh, done.

vector view - petsc order
Vec Object: 4 MPI processes
  type: mpi
Process [0]
1. 2. 3. 4. 5. 6. 7. 8. 9. 10. 11. 12. 13. 14. 15. 16. 17. 18. 19. 20. 21.
22. 23. 24. 25. 26. 27. 28. 29. 30. 31. 32. 33. 34. 35. 36. 37. 38. 39. 40.
41. 42. 43. 44. 45. 46. 47. 48.
Process [1]
49. 50. 51. 52. 53. 54. 55. 56. 57. 58. 59. 60. 61. 62. 63. 64. 65. 66. 67.
68. 69. 70. 71. 72. 73. 74. 75. 76. 77. 78. 79. 80. 81. 82. 83. 84. 85. 86.
87. 88. 89. 90. 91. 92. 93. 94. 95. 96. 97. 98. 99. 100. 101. 102. 103. 104.
Process [2]
105. 106. 107. 108. 109. 110. 111. 112. 113. 114. 115. 116. 117. 118. 119.
120. 121. 122. 123. 124. 125. 126. 127. 128. 129. 130. 131. 132. 133. 134.
135. 136. 137. 138. 139. 140. 141. 142. 143. 144. 145. 146. 147. 148. 149.
150. 151. 152. 153. 154. 155. 156. 157. 158. 159.
Process [3]
160. 161. 162. 163. 164. 165. 166. 167. 168. 169. 170. 171. 172. 173. 174.
175. 176. 177. 178. 179. 180. 181. 182. 183. 184. 185. 186. 187. 188. 189.
190. 191. 192. 193. 194. 195. 196. 197. 198. 199. 200. 201. 202. 203. 204.
205. 206. 207. 208. 209. 210. 211. 212. 213. 214. 215. 216. 217. 218. 219.
220. 221. 222. 223.
check migration SF PetscSF Object: 4 MPI processes type: basic sort=rank-order [0] Number of roots=617, leaves=208, remote ranks=1 [0] 0 <- (0,0) [0] 1 <- (0,4) [0] 2 <- (0,12) [0] 3 <- (0,14) [0] 4 <- (0,18) [0] 5 <- (0,21) [0] 6 <- (0,31) [0] 7 <- (0,33) [0] 8 <- (0,34) [0] 9 <- (0,37) [0] 10 <- (0,39) [0] 11 <- (0,41) [0] 12 <- (0,61) [0] 13 <- (0,74) [0] 14 <- (0,81) [0] 15 <- (0,83) [0] 16 <- (0,85) [0] 17 <- (0,86) [0] 18 <- (0,87) [0] 19 <- (0,93) [0] 20 <- (0,94) [0] 21 <- (0,95) [0] 22 <- (0,96) [0] 23 <- (0,98) [0] 24 <- (0,100) [0] 25 <- (0,103) [0] 26 <- (0,108) [0] 27 <- (0,111) [0] 28 <- (0,113) [0] 29 <- (0,118) [0] 30 <- (0,151) [0] 31 <- (0,152) [0] 32 <- (0,154) [0] 33 <- (0,160) [0] 34 <- (0,161) [0] 35 <- (0,166) [0] 36 <- (0,176) [0] 37 <- (0,178) [0] 38 <- (0,204) [0] 39 <- (0,207) [0] 40 <- (0,213) [0] 41 <- (0,216) [0] 42 <- (0,220) [0] 43 <- (0,223) [0] 44 <- (0,224) [0] 45 <- (0,225) [0] 46 <- (0,227) [0] 47 <- (0,231) [0] 48 <- (0,233) [0] 49 <- (0,240) [0] 50 <- (0,241) [0] 51 <- (0,250) [0] 52 <- (0,267) [0] 53 <- (0,269) [0] 54 <- (0,273) [0] 55 <- (0,275) [0] 56 <- (0,280) [0] 57 <- (0,281) [0] 58 <- (0,291) [0] 59 <- (0,296) [0] 60 <- (0,298) [0] 61 <- (0,303) [0] 62 <- (0,308) [0] 63 <- (0,313) [0] 64 <- (0,318) [0] 65 <- (0,323) [0] 66 <- (0,324) [0] 67 <- (0,338) [0] 68 <- (0,339) [0] 69 <- (0,345) [0] 70 <- (0,346) [0] 71 <- (0,347) [0] 72 <- (0,350) [0] 73 <- (0,351) [0] 74 <- (0,353) [0] 75 <- (0,354) [0] 76 <- (0,355) [0] 77 <- (0,356) [0] 78 <- (0,360) [0] 79 <- (0,362) [0] 80 <- (0,364) [0] 81 <- (0,365) [0] 82 <- (0,366) [0] 83 <- (0,368) [0] 84 <- (0,369) [0] 85 <- (0,370) [0] 86 <- (0,373) [0] 87 <- (0,375) [0] 88 <- (0,377) [0] 89 <- (0,378) [0] 90 <- (0,379) [0] 91 <- (0,380) [0] 92 <- (0,381) [0] 93 <- (0,384) [0] 94 <- (0,385) [0] 95 <- (0,386) [0] 96 <- (0,387) [0] 97 <- (0,388) [0] 98 <- (0,390) [0] 99 <- (0,391) [0] 100 <- (0,11) [0] 101 <- (0,27) [0] 102 <- (0,48) [0] 103 <- (0,56) [0] 104 <- (0,70) [0] 105 <- (0,104) [0] 106 <- (0,124) [0] 107 <- (0,125) [0] 108 <- (0,128) [0] 109 <- (0,130) [0] 110 <- (0,260) [0] 111 <- (0,278) [0] 112 <- (0,292) [0] 113 <- (0,315) [0] 114 <- (0,316) [0] 115 <- (0,321) [0] 116 <- (0,340) [0] 117 <- (0,341) [0] 118 <- (0,17) [0] 119 <- (0,22) [0] 120 <- (0,50) [0] 121 <- (0,66) [0] 122 <- (0,69) [0] 123 <- (0,150) [0] 124 <- (0,157) [0] 125 <- (0,169) [0] 126 <- (0,257) [0] 127 <- (0,270) [0] 128 <- (0,322) [0] 129 <- (0,397) [0] 130 <- (0,405) [0] 131 <- (0,420) [0] 132 <- (0,421) [0] 133 <- (0,422) [0] 134 <- (0,423) [0] 135 <- (0,437) [0] 136 <- (0,438) [0] 137 <- (0,439) [0] 138 <- (0,440) [0] 139 <- (0,441) [0] 140 <- (0,442) [0] 141 <- (0,460) [0] 142 <- (0,461) [0] 143 <- (0,469) [0] 144 <- (0,471) [0] 145 <- (0,477) [0] 146 <- (0,485) [0] 147 <- (0,487) [0] 148 <- (0,492) [0] 149 <- (0,493) [0] 150 <- (0,494) [0] 151 <- (0,498) [0] 152 <- (0,506) [0] 153 <- (0,514) [0] 154 <- (0,522) [0] 155 <- (0,525) [0] 156 <- (0,532) [0] 157 <- (0,533) [0] 158 <- (0,538) [0] 159 <- (0,542) [0] 160 <- (0,545) [0] 161 <- (0,551) [0] 162 <- (0,564) [0] 163 <- (0,571) [0] 164 <- (0,573) [0] 165 <- (0,584) [0] 166 <- (0,585) [0] 167 <- (0,591) [0] 168 <- (0,593) [0] 169 <- (0,594) [0] 170 <- (0,600) [0] 171 <- (0,603) [0] 172 <- (0,605) [0] 173 <- (0,606) [0] 174 <- (0,607) [0] 175 <- (0,613) [0] 176 <- (0,616) [0] 177 <- (0,398) [0] 178 <- (0,399) [0] 179 <- (0,400) [0] 180 <- (0,401) [0] 181 <- (0,404) [0] 182 <- (0,424) [0] 183 <- (0,425) [0] 184 <- (0,455) [0] 185 <- (0,456) [0] 186 <- (0,457) [0] 187 <- 
(0,501) [0] 188 <- (0,512) [0] 189 <- (0,520) [0] 190 <- (0,544) [0] 191 <- (0,549) [0] 192 <- (0,576) [0] 193 <- (0,596) [0] 194 <- (0,612) [0] 195 <- (0,435) [0] 196 <- (0,436) [0] 197 <- (0,459) [0] 198 <- (0,470) [0] 199 <- (0,473) [0] 200 <- (0,495) [0] 201 <- (0,521) [0] 202 <- (0,531) [0] 203 <- (0,536) [0] 204 <- (0,539) [0] 205 <- (0,550) [0] 206 <- (0,566) [0] 207 <- (0,604) [1] Number of roots=0, leaves=217, remote ranks=1 [1] 0 <- (0,1) [1] 1 <- (0,5) [1] 2 <- (0,6) [1] 3 <- (0,11) [1] 4 <- (0,19) [1] 5 <- (0,20) [1] 6 <- (0,25) [1] 7 <- (0,26) [1] 8 <- (0,27) [1] 9 <- (0,28) [1] 10 <- (0,29) [1] 11 <- (0,32) [1] 12 <- (0,40) [1] 13 <- (0,42) [1] 14 <- (0,43) [1] 15 <- (0,44) [1] 16 <- (0,47) [1] 17 <- (0,48) [1] 18 <- (0,56) [1] 19 <- (0,57) [1] 20 <- (0,59) [1] 21 <- (0,60) [1] 22 <- (0,62) [1] 23 <- (0,63) [1] 24 <- (0,70) [1] 25 <- (0,71) [1] 26 <- (0,75) [1] 27 <- (0,76) [1] 28 <- (0,78) [1] 29 <- (0,88) [1] 30 <- (0,90) [1] 31 <- (0,104) [1] 32 <- (0,121) [1] 33 <- (0,124) [1] 34 <- (0,125) [1] 35 <- (0,128) [1] 36 <- (0,129) [1] 37 <- (0,130) [1] 38 <- (0,132) [1] 39 <- (0,133) [1] 40 <- (0,136) [1] 41 <- (0,142) [1] 42 <- (0,145) [1] 43 <- (0,147) [1] 44 <- (0,163) [1] 45 <- (0,165) [1] 46 <- (0,180) [1] 47 <- (0,181) [1] 48 <- (0,186) [1] 49 <- (0,187) [1] 50 <- (0,190) [1] 51 <- (0,198) [1] 52 <- (0,208) [1] 53 <- (0,210) [1] 54 <- (0,211) [1] 55 <- (0,212) [1] 56 <- (0,218) [1] 57 <- (0,228) [1] 58 <- (0,243) [1] 59 <- (0,244) [1] 60 <- (0,245) [1] 61 <- (0,248) [1] 62 <- (0,249) [1] 63 <- (0,253) [1] 64 <- (0,256) [1] 65 <- (0,258) [1] 66 <- (0,260) [1] 67 <- (0,261) [1] 68 <- (0,274) [1] 69 <- (0,278) [1] 70 <- (0,282) [1] 71 <- (0,283) [1] 72 <- (0,284) [1] 73 <- (0,285) [1] 74 <- (0,287) [1] 75 <- (0,292) [1] 76 <- (0,295) [1] 77 <- (0,299) [1] 78 <- (0,309) [1] 79 <- (0,315) [1] 80 <- (0,316) [1] 81 <- (0,321) [1] 82 <- (0,327) [1] 83 <- (0,331) [1] 84 <- (0,332) [1] 85 <- (0,335) [1] 86 <- (0,336) [1] 87 <- (0,337) [1] 88 <- (0,340) [1] 89 <- (0,341) [1] 90 <- (0,342) [1] 91 <- (0,343) [1] 92 <- (0,344) [1] 93 <- (0,359) [1] 94 <- (0,361) [1] 95 <- (0,382) [1] 96 <- (0,392) [1] 97 <- (0,93) [1] 98 <- (0,154) [1] 99 <- (0,178) [1] 100 <- (0,204) [1] 101 <- (0,281) [1] 102 <- (0,291) [1] 103 <- (0,296) [1] 104 <- (0,298) [1] 105 <- (0,303) [1] 106 <- (0,308) [1] 107 <- (0,318) [1] 108 <- (0,339) [1] 109 <- (0,350) [1] 110 <- (0,351) [1] 111 <- (0,353) [1] 112 <- (0,355) [1] 113 <- (0,370) [1] 114 <- (0,379) [1] 115 <- (0,380) [1] 116 <- (0,388) [1] 117 <- (0,49) [1] 118 <- (0,50) [1] 119 <- (0,169) [1] 120 <- (0,246) [1] 121 <- (0,247) [1] 122 <- (0,9) [1] 123 <- (0,30) [1] 124 <- (0,135) [1] 125 <- (0,148) [1] 126 <- (0,184) [1] 127 <- (0,189) [1] 128 <- (0,229) [1] 129 <- (0,277) [1] 130 <- (0,320) [1] 131 <- (0,352) [1] 132 <- (0,393) [1] 133 <- (0,394) [1] 134 <- (0,398) [1] 135 <- (0,399) [1] 136 <- (0,400) [1] 137 <- (0,401) [1] 138 <- (0,403) [1] 139 <- (0,404) [1] 140 <- (0,406) [1] 141 <- (0,407) [1] 142 <- (0,408) [1] 143 <- (0,409) [1] 144 <- (0,410) [1] 145 <- (0,411) [1] 146 <- (0,424) [1] 147 <- (0,425) [1] 148 <- (0,426) [1] 149 <- (0,427) [1] 150 <- (0,428) [1] 151 <- (0,455) [1] 152 <- (0,456) [1] 153 <- (0,457) [1] 154 <- (0,464) [1] 155 <- (0,466) [1] 156 <- (0,475) [1] 157 <- (0,484) [1] 158 <- (0,490) [1] 159 <- (0,496) [1] 160 <- (0,497) [1] 161 <- (0,501) [1] 162 <- (0,509) [1] 163 <- (0,510) [1] 164 <- (0,512) [1] 165 <- (0,515) [1] 166 <- (0,520) [1] 167 <- (0,534) [1] 168 <- (0,537) [1] 169 <- (0,544) [1] 170 <- (0,548) [1] 171 <- 
(0,549) [1] 172 <- (0,555) [1] 173 <- (0,556) [1] 174 <- (0,560) [1] 175 <- (0,562) [1] 176 <- (0,567) [1] 177 <- (0,570) [1] 178 <- (0,576) [1] 179 <- (0,577) [1] 180 <- (0,579) [1] 181 <- (0,586) [1] 182 <- (0,587) [1] 183 <- (0,588) [1] 184 <- (0,590) [1] 185 <- (0,596) [1] 186 <- (0,609) [1] 187 <- (0,612) [1] 188 <- (0,615) [1] 189 <- (0,405) [1] 190 <- (0,423) [1] 191 <- (0,471) [1] 192 <- (0,477) [1] 193 <- (0,522) [1] 194 <- (0,533) [1] 195 <- (0,600) [1] 196 <- (0,603) [1] 197 <- (0,613) [1] 198 <- (0,616) [1] 199 <- (0,470) [1] 200 <- (0,479) [1] 201 <- (0,521) [1] 202 <- (0,531) [1] 203 <- (0,604) [1] 204 <- (0,412) [1] 205 <- (0,413) [1] 206 <- (0,452) [1] 207 <- (0,453) [1] 208 <- (0,478) [1] 209 <- (0,488) [1] 210 <- (0,502) [1] 211 <- (0,517) [1] 212 <- (0,565) [1] 213 <- (0,568) [1] 214 <- (0,574) [1] 215 <- (0,598) [1] 216 <- (0,608) [2] Number of roots=0, leaves=215, remote ranks=1 [2] 0 <- (0,17) [2] 1 <- (0,22) [2] 2 <- (0,35) [2] 3 <- (0,38) [2] 4 <- (0,49) [2] 5 <- (0,50) [2] 6 <- (0,52) [2] 7 <- (0,54) [2] 8 <- (0,55) [2] 9 <- (0,58) [2] 10 <- (0,64) [2] 11 <- (0,66) [2] 12 <- (0,69) [2] 13 <- (0,80) [2] 14 <- (0,82) [2] 15 <- (0,89) [2] 16 <- (0,92) [2] 17 <- (0,97) [2] 18 <- (0,99) [2] 19 <- (0,105) [2] 20 <- (0,106) [2] 21 <- (0,107) [2] 22 <- (0,109) [2] 23 <- (0,110) [2] 24 <- (0,117) [2] 25 <- (0,119) [2] 26 <- (0,126) [2] 27 <- (0,131) [2] 28 <- (0,139) [2] 29 <- (0,140) [2] 30 <- (0,146) [2] 31 <- (0,149) [2] 32 <- (0,150) [2] 33 <- (0,153) [2] 34 <- (0,155) [2] 35 <- (0,156) [2] 36 <- (0,157) [2] 37 <- (0,162) [2] 38 <- (0,169) [2] 39 <- (0,171) [2] 40 <- (0,177) [2] 41 <- (0,182) [2] 42 <- (0,188) [2] 43 <- (0,193) [2] 44 <- (0,194) [2] 45 <- (0,195) [2] 46 <- (0,196) [2] 47 <- (0,197) [2] 48 <- (0,199) [2] 49 <- (0,200) [2] 50 <- (0,201) [2] 51 <- (0,202) [2] 52 <- (0,205) [2] 53 <- (0,206) [2] 54 <- (0,214) [2] 55 <- (0,215) [2] 56 <- (0,217) [2] 57 <- (0,219) [2] 58 <- (0,230) [2] 59 <- (0,232) [2] 60 <- (0,234) [2] 61 <- (0,237) [2] 62 <- (0,238) [2] 63 <- (0,239) [2] 64 <- (0,242) [2] 65 <- (0,246) [2] 66 <- (0,247) [2] 67 <- (0,255) [2] 68 <- (0,257) [2] 69 <- (0,262) [2] 70 <- (0,263) [2] 71 <- (0,264) [2] 72 <- (0,270) [2] 73 <- (0,279) [2] 74 <- (0,286) [2] 75 <- (0,288) [2] 76 <- (0,289) [2] 77 <- (0,290) [2] 78 <- (0,300) [2] 79 <- (0,304) [2] 80 <- (0,305) [2] 81 <- (0,306) [2] 82 <- (0,307) [2] 83 <- (0,310) [2] 84 <- (0,311) [2] 85 <- (0,312) [2] 86 <- (0,314) [2] 87 <- (0,319) [2] 88 <- (0,322) [2] 89 <- (0,328) [2] 90 <- (0,348) [2] 91 <- (0,349) [2] 92 <- (0,358) [2] 93 <- (0,363) [2] 94 <- (0,367) [2] 95 <- (0,371) [2] 96 <- (0,374) [2] 97 <- (0,376) [2] 98 <- (0,74) [2] 99 <- (0,86) [2] 100 <- (0,95) [2] 101 <- (0,160) [2] 102 <- (0,213) [2] 103 <- (0,225) [2] 104 <- (0,231) [2] 105 <- (0,298) [2] 106 <- (0,345) [2] 107 <- (0,355) [2] 108 <- (0,356) [2] 109 <- (0,386) [2] 110 <- (0,70) [2] 111 <- (0,104) [2] 112 <- (0,136) [2] 113 <- (0,198) [2] 114 <- (0,8) [2] 115 <- (0,13) [2] 116 <- (0,91) [2] 117 <- (0,115) [2] 118 <- (0,122) [2] 119 <- (0,134) [2] 120 <- (0,164) [2] 121 <- (0,172) [2] 122 <- (0,301) [2] 123 <- (0,320) [2] 124 <- (0,325) [2] 125 <- (0,326) [2] 126 <- (0,329) [2] 127 <- (0,330) [2] 128 <- (0,334) [2] 129 <- (0,352) [2] 130 <- (0,357) [2] 131 <- (0,393) [2] 132 <- (0,396) [2] 133 <- (0,429) [2] 134 <- (0,430) [2] 135 <- (0,431) [2] 136 <- (0,432) [2] 137 <- (0,433) [2] 138 <- (0,434) [2] 139 <- (0,435) [2] 140 <- (0,436) [2] 141 <- (0,448) [2] 142 <- (0,449) [2] 143 <- (0,450) [2] 144 <- (0,451) [2] 145 <- (0,458) [2] 
146 <- (0,459) [2] 147 <- (0,467) [2] 148 <- (0,468) [2] 149 <- (0,470) [2] 150 <- (0,472) [2] 151 <- (0,473) [2] 152 <- (0,476) [2] 153 <- (0,479) [2] 154 <- (0,481) [2] 155 <- (0,482) [2] 156 <- (0,495) [2] 157 <- (0,503) [2] 158 <- (0,505) [2] 159 <- (0,511) [2] 160 <- (0,516) [2] 161 <- (0,519) [2] 162 <- (0,521) [2] 163 <- (0,523) [2] 164 <- (0,524) [2] 165 <- (0,528) [2] 166 <- (0,530) [2] 167 <- (0,531) [2] 168 <- (0,536) [2] 169 <- (0,539) [2] 170 <- (0,540) [2] 171 <- (0,541) [2] 172 <- (0,546) [2] 173 <- (0,547) [2] 174 <- (0,550) [2] 175 <- (0,554) [2] 176 <- (0,559) [2] 177 <- (0,561) [2] 178 <- (0,566) [2] 179 <- (0,578) [2] 180 <- (0,580) [2] 181 <- (0,582) [2] 182 <- (0,592) [2] 183 <- (0,597) [2] 184 <- (0,601) [2] 185 <- (0,604) [2] 186 <- (0,614) [2] 187 <- (0,437) [2] 188 <- (0,460) [2] 189 <- (0,506) [2] 190 <- (0,514) [2] 191 <- (0,600) [2] 192 <- (0,606) [2] 193 <- (0,401) [2] 194 <- (0,520) [2] 195 <- (0,402) [2] 196 <- (0,446) [2] 197 <- (0,447) [2] 198 <- (0,452) [2] 199 <- (0,453) [2] 200 <- (0,454) [2] 201 <- (0,463) [2] 202 <- (0,483) [2] 203 <- (0,504) [2] 204 <- (0,507) [2] 205 <- (0,526) [2] 206 <- (0,527) [2] 207 <- (0,529) [2] 208 <- (0,552) [2] 209 <- (0,568) [2] 210 <- (0,574) [2] 211 <- (0,599) [2] 212 <- (0,608) [2] 213 <- (0,610) [2] 214 <- (0,611) [3] Number of roots=0, leaves=206, remote ranks=1 [3] 0 <- (0,2) [3] 1 <- (0,3) [3] 2 <- (0,7) [3] 3 <- (0,8) [3] 4 <- (0,9) [3] 5 <- (0,10) [3] 6 <- (0,13) [3] 7 <- (0,15) [3] 8 <- (0,16) [3] 9 <- (0,23) [3] 10 <- (0,24) [3] 11 <- (0,30) [3] 12 <- (0,36) [3] 13 <- (0,45) [3] 14 <- (0,46) [3] 15 <- (0,51) [3] 16 <- (0,53) [3] 17 <- (0,65) [3] 18 <- (0,67) [3] 19 <- (0,68) [3] 20 <- (0,72) [3] 21 <- (0,73) [3] 22 <- (0,77) [3] 23 <- (0,79) [3] 24 <- (0,84) [3] 25 <- (0,91) [3] 26 <- (0,101) [3] 27 <- (0,102) [3] 28 <- (0,112) [3] 29 <- (0,114) [3] 30 <- (0,115) [3] 31 <- (0,116) [3] 32 <- (0,120) [3] 33 <- (0,122) [3] 34 <- (0,123) [3] 35 <- (0,127) [3] 36 <- (0,134) [3] 37 <- (0,135) [3] 38 <- (0,137) [3] 39 <- (0,138) [3] 40 <- (0,141) [3] 41 <- (0,143) [3] 42 <- (0,144) [3] 43 <- (0,148) [3] 44 <- (0,158) [3] 45 <- (0,159) [3] 46 <- (0,164) [3] 47 <- (0,167) [3] 48 <- (0,168) [3] 49 <- (0,170) [3] 50 <- (0,172) [3] 51 <- (0,173) [3] 52 <- (0,174) [3] 53 <- (0,175) [3] 54 <- (0,179) [3] 55 <- (0,183) [3] 56 <- (0,184) [3] 57 <- (0,185) [3] 58 <- (0,189) [3] 59 <- (0,191) [3] 60 <- (0,192) [3] 61 <- (0,203) [3] 62 <- (0,209) [3] 63 <- (0,221) [3] 64 <- (0,222) [3] 65 <- (0,226) [3] 66 <- (0,229) [3] 67 <- (0,235) [3] 68 <- (0,236) [3] 69 <- (0,251) [3] 70 <- (0,252) [3] 71 <- (0,254) [3] 72 <- (0,259) [3] 73 <- (0,265) [3] 74 <- (0,266) [3] 75 <- (0,268) [3] 76 <- (0,271) [3] 77 <- (0,272) [3] 78 <- (0,276) [3] 79 <- (0,277) [3] 80 <- (0,293) [3] 81 <- (0,294) [3] 82 <- (0,297) [3] 83 <- (0,301) [3] 84 <- (0,302) [3] 85 <- (0,317) [3] 86 <- (0,320) [3] 87 <- (0,325) [3] 88 <- (0,326) [3] 89 <- (0,329) [3] 90 <- (0,330) [3] 91 <- (0,333) [3] 92 <- (0,334) [3] 93 <- (0,352) [3] 94 <- (0,357) [3] 95 <- (0,372) [3] 96 <- (0,383) [3] 97 <- (0,389) [3] 98 <- (0,393) [3] 99 <- (0,59) [3] 100 <- (0,121) [3] 101 <- (0,136) [3] 102 <- (0,163) [3] 103 <- (0,181) [3] 104 <- (0,186) [3] 105 <- (0,187) [3] 106 <- (0,198) [3] 107 <- (0,228) [3] 108 <- (0,299) [3] 109 <- (0,64) [3] 110 <- (0,80) [3] 111 <- (0,82) [3] 112 <- (0,105) [3] 113 <- (0,109) [3] 114 <- (0,110) [3] 115 <- (0,117) [3] 116 <- (0,139) [3] 117 <- (0,153) [3] 118 <- (0,193) [3] 119 <- (0,202) [3] 120 <- (0,206) [3] 121 <- (0,219) [3] 122 <- (0,246) [3] 
123 <- (0,247) [3] 124 <- (0,304) [3] 125 <- (0,305) [3] 126 <- (0,367) [3] 127 <- (0,395) [3] 128 <- (0,402) [3] 129 <- (0,412) [3] 130 <- (0,413) [3] 131 <- (0,414) [3] 132 <- (0,415) [3] 133 <- (0,416) [3] 134 <- (0,417) [3] 135 <- (0,418) [3] 136 <- (0,419) [3] 137 <- (0,443) [3] 138 <- (0,444) [3] 139 <- (0,445) [3] 140 <- (0,446) [3] 141 <- (0,447) [3] 142 <- (0,452) [3] 143 <- (0,453) [3] 144 <- (0,454) [3] 145 <- (0,462) [3] 146 <- (0,463) [3] 147 <- (0,465) [3] 148 <- (0,474) [3] 149 <- (0,478) [3] 150 <- (0,480) [3] 151 <- (0,483) [3] 152 <- (0,486) [3] 153 <- (0,488) [3] 154 <- (0,489) [3] 155 <- (0,491) [3] 156 <- (0,499) [3] 157 <- (0,500) [3] 158 <- (0,502) [3] 159 <- (0,504) [3] 160 <- (0,507) [3] 161 <- (0,508) [3] 162 <- (0,513) [3] 163 <- (0,517) [3] 164 <- (0,518) [3] 165 <- (0,526) [3] 166 <- (0,527) [3] 167 <- (0,529) [3] 168 <- (0,535) [3] 169 <- (0,543) [3] 170 <- (0,552) [3] 171 <- (0,553) [3] 172 <- (0,557) [3] 173 <- (0,558) [3] 174 <- (0,563) [3] 175 <- (0,565) [3] 176 <- (0,568) [3] 177 <- (0,569) [3] 178 <- (0,572) [3] 179 <- (0,574) [3] 180 <- (0,575) [3] 181 <- (0,581) [3] 182 <- (0,583) [3] 183 <- (0,589) [3] 184 <- (0,595) [3] 185 <- (0,598) [3] 186 <- (0,599) [3] 187 <- (0,602) [3] 188 <- (0,608) [3] 189 <- (0,610) [3] 190 <- (0,611) [3] 191 <- (0,401) [3] 192 <- (0,411) [3] 193 <- (0,464) [3] 194 <- (0,556) [3] 195 <- (0,567) [3] 196 <- (0,448) [3] 197 <- (0,468) [3] 198 <- (0,476) [3] 199 <- (0,479) [3] 200 <- (0,519) [3] 201 <- (0,521) [3] 202 <- (0,530) [3] 203 <- (0,540) [3] 204 <- (0,561) [3] 205 <- (0,582) check natural SF PetscSF Object: 4 MPI processes type: basic sort=rank-order [0] Number of roots=0, leaves=0, remote ranks=0 [1] Number of roots=0, leaves=0, remote ranks=0 [2] Number of roots=0, leaves=0, remote ranks=0 [3] Number of roots=0, leaves=0, remote ranks=0 vector view - natural order Vec Object: 4 MPI processes type: mpi Process [0] 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. Process [1] 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. Process [2] 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. Process [3] 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 
Set DMPlex local to global mapping, done Force to stop -------------- next part -------------- # vtk DataFile Version 2.0 unstructured mesh, Created by Gmsh ASCII DATASET UNSTRUCTURED_GRID POINTS 223 double 0 0 0 3 0 0 3 0 2 0 0 2 0.2 0 0.8 0.4 0 0.8 0.8 0 0.8 1.4 0 0.8 2.2 0 0.8 0.8 0 0.6 0.8 0 1 0.8 0 1.2 0.1999999999991203 0 0 0.3999999999983593 0 0 0.5999999999974546 0 0 0.799999999996742 0 0 0.9999999999960294 0 0 1.199999999995317 0 0 1.399999999994604 0 0 1.599999999994632 0 0 1.799999999995399 0 0 1.999999999996166 0 0 2.199999999996932 0 0 2.3999999999977 0 0 2.599999999998466 0 0 2.799999999999233 0 0 0 0 1.799999999999167 0 0 1.6 0 0 1.400000000001387 0 0 1.200000000002774 0 0 1.000000000004117 0 0 0.8000000000033287 0 0 0.6000000000024965 0 0 0.4000000000016644 0 0 0.200000000000832 2.800000000002174 0 2 2.600000000002023 0 2 2.400000000003 0 2 2.200000000002438 0 2 2.000000000001934 0 2 1.800000000002824 0 2 1.600000000003743 0 2 1.400000000003922 0 2 1.200000000003361 0 2 1.000000000002801 0 2 0.8000000000022411 0 2 0.600000000001681 0 2 0.4000000000011208 0 2 0.2000000000005602 0 2 3 0 0.1999999999996293 3 0 0.3999999999991157 3 0 0.5999999999985328 3 0 0.7999999999979498 3 0 0.9999999999973885 3 0 1.199999999997894 3 0 1.39999999999842 3 0 1.599999999998947 3 0 1.799999999999474 1.599999999999545 0 0.8 1.799999999998915 0 0.8 1.999999999999414 0 0.8 0.999999999999776 0 0.8 1.199999999999581 0 0.8 0.5999999999994576 0 0.8 2.379214504531294 0 1.405373076248138 1.704991835910612 0 1.402778606292781 1.23408069164269 0 1.458669309885507 0.4685665670601734 0 1.508019509283279 2.497738191218041 0 0.4349897042973965 2.610303185213958 0 0.9878017997043026 1.296692364559492 0 0.3986179228025756 1.909153318464776 0 0.4063140589542637 0.4623896075858526 0 0.3881084953993027 2.05931870173157 0 1.584337375247877 2.038721804512762 0 1.194360902254602 0.837837785556555 0 1.66744985915262 1.397714914184593 0 1.148692746366346 0.3742908223262418 0 1.174082908003071 2.665804146696016 0 1.663931138500056 1.497313596493072 0 1.673656798244939 2.227235772357356 0 0.3329268292685604 0.9938576986066314 0 0.3080338817095 2.697653501978651 0 1.292297570997389 1.096142514170803 0 1.1 1.605753072669749 0 0.491762523868872 1.69999999999923 0 1.091666666666887 2.71251200913381 0 0.7161239977234372 2.351344489130117 0 1.699121968393499 1.782173742909407 0 1.712990171238928 2.280062762525843 0 1.050264328024126 0.705769497964517 0 0.2623692964269909 0.2854025199473142 0 1.729673118535918 2.715992888350279 0 0.259203247264699 1.065208746302501 0 1.729902382621085 1.691666666666294 0 0.2437499999987736 2.440212763582161 0 0.7106750276238525 1.100121885885791 0 0.5682290676564412 2.10438472624376 0 0.5771286151995686 0.9442269078216747 0 1.355912991197506 0.6163286784502398 0 1.742953710918845 0.2780048027011408 0 1.402161836324629 1.478220357618803 0 1.470719894825082 0.2471457398099939 0 0.2524810787590218 0.2487655548760474 0 0.5238310750572215 0.6264666351387723 0 1.347553521562679 2.490235690235405 0 0.2265993265990937 2.803830918769202 0 0.526638964852175 0.5594040508164777 0 1.036741482837054 1.457982520923784 0 0.2269956302319235 1.937737252573877 0 1.391730183630887 2.796228535856523 0 1.102723737365208 2.159147416270142 0 1.403608979873389 1.281872426241454 0 1.800902035832503 1.8837415199635 0 1.003577707586214 2.03680937116415 0 0.2433849188490117 0.5795454545454293 0 0.5795454545453738 1.170008074655726 0 0.2233372172980658 2.801404909550425 0 1.492049198199022 0.2177437641725037 0 
1.017743764172138 2.323894267418511 0 0.5113033981177416 1.172046878761573 0 1.270534731754538 0.5485564404307722 0 0.1918013452245567 2.083695878040242 0 1.807197232140322 1.416074624101432 0 0.5774396173238155 1.785479734433667 0 0.5752957350000555 2.501649089227999 0 1.271931709598009 1.284479965824385 0 0.9903690252861073 1.523327777985623 0 1.000647820252583 0.5364474308195638 0 1.217014743936919 2.789881440617124 0 1.789881440617261 2.507494844449481 0 1.809810027645477 0.9231700938715783 0 1.851873008429856 2.111938913072513 0 0.9830530884882849 2.793154307858761 0 0.8768075521287002 2.587000975789033 0 1.453818519623165 2.394803322025814 0 0.8926720437521438 1.834919049360319 0 1.223481514480617 1.573104508099891 0 1.238404636402246 0.4475743170520548 0 1.838276232017855 0.1815881432411806 0 1.217598300523445 0.8672650147563506 0 0.1762453176299749 2.306587108970027 0 0.1705150816205509 1.625410855356525 0 1.780289239734096 0.8233833894553626 0 0.3832672040340161 0.1696953830832089 0 1.570979889137452 1.65389327768683 0 1.587851501742204 2.261020253229125 0 1.250190685669392 1.892425939574817 0 1.84614964303553 1.024101021611565 0 1.523932646626128 1.865603535956367 0 0.205257814827333 0.3933113884489506 0 0.9690646432006123 0.4507987812721571 0 1.338822405420034 2.274894458806367 0 1.544832506227184 1.838053030422587 0 1.554838056794376 0.3686859183416712 0 0.61430085724029 0.9738015786914841 0 0.9738015786915135 1.332511029850933 0 1.615932245360217 0.9894053075619311 0 1.229077937292363 2.577637560506477 0 0.8159297007077034 2.375571518833167 0 0.3445824808775385 2.476279702749816 0 1.597328424196379 0.1460062754235328 0 0.3970029733771729 1.265344920154252 0 0.6329578502716723 1.935835647926898 0 0.6344934670553749 2.069281594968028 0 0.3940954484269742 2.29506823171605 0 1.851926700449395 0.6361007615483169 0 0.4081024174896069 2.863891877583948 0 1.299999999998157 1.121879385586257 0 0.3968719495845081 2.866792480267094 0 0.6999999999982413 0.640716822374706 0 1.564448591963968 1.465065170338345 0 0.3920979873954564 1.461565556697408 0 1.846878377665404 1.313202543621524 0 0.1520409031873539 1.700610872801345 0 0.9324817812131065 1.771531265638171 0 0.3844760265298595 0.9316739727448209 0 0.6501584876562788 0.1553841991341328 0 1.844615800865806 2.83194205855847 0 0.1548176119178121 0.7554673115764591 0 1.852455315700264 1.543297705630606 0 0.6692863615262314 2.573954635025233 0 0.5925173061601432 1.116582444085043 0 0.9359264074853757 1.025953898186142 0 0.1576523269573968 2.847994284561111 0 1.642990843504484 0.9600149914842826 0 0.4894980205592772 2.205556830158146 0 1.69321441508239 2.627328673572279 0 0.1531224420159892 2.642687904572294 0 1.141659092120952 2.834577319798835 0 0.3225122389814655 0.2983336950301236 0 1.882071991275122 0.8122248620838787 0 1.44321626841715 0.6996996928159201 0 0.1181368787597825 0.1444936133652446 0 0.1476295673217159 0.1529060387700887 0 0.638313084543821 2.293874040543968 0 0.6803984596738974 0.3457826007705083 0 0.1617330481724948 0.1220964204329312 0 1.709053761707669 2.657398203846927 0 1.861014030866323 0.3295446886489339 0 1.593181052285239 1.098772336102051 0 1.885940614691867 2.652721607232833 0 0.3937434646924954 0.6818782899869689 0 0.9475655802543401 1.931167058145805 0 1.69635186532083 1.581219843306312 0 0.1461740902663257 2.447705348719867 0 1.095025095160731 1.257592110345052 0 1.147316202898446 1.757518958348691 0 1.860358589513101 2.155116155877665 0 0.1392252118453306 0.6673660677936304 0 1.124812554765165 
1.35385707705373 0 1.313046155723573 0.4646889322555703 0 1.662758702500851 1.187554783129829 0 1.625867724065088 1.077557767903738 0 1.371623591692298 1.680167363715666 0 0.6787096877578693 0.6812821612349281 0 0.6812821612351486 2.246701249406042 0 0.9314973650661384 2.169432754969319 0 1.119127586148288 0.1164067881991524 0 0.916406788200873 0.1014459250697099 0 1.100000000003446 2.890346327029697 0 1.890346327028477 0.3102787716159667 0 0.3755545051061105 0.9318698800848435 0 1.100575903196775 CELLS 469 1789 1 0 1 1 1 2 1 3 1 4 1 5 1 6 1 7 1 8 1 9 1 10 1 11 2 0 12 2 12 13 2 13 14 2 14 15 2 15 16 2 16 17 2 17 18 2 18 19 2 19 20 2 20 21 2 21 22 2 22 23 2 23 24 2 24 25 2 25 1 2 3 26 2 26 27 2 27 28 2 28 29 2 29 30 2 30 31 2 31 32 2 32 33 2 33 34 2 34 0 2 2 35 2 35 36 2 36 37 2 37 38 2 38 39 2 39 40 2 40 41 2 41 42 2 42 43 2 43 44 2 44 45 2 45 46 2 46 47 2 47 48 2 48 3 2 1 49 2 49 50 2 50 51 2 51 52 2 52 53 2 53 54 2 54 55 2 55 56 2 56 57 2 57 2 2 7 58 2 58 59 2 59 60 2 60 8 2 6 61 2 61 62 2 62 7 2 5 63 2 63 6 2 4 5 2 6 9 2 10 6 2 11 10 3 209 104 128 3 103 154 194 3 164 80 119 3 97 164 119 3 11 104 209 3 194 154 4 3 61 96 62 3 8 60 97 3 69 158 133 3 84 171 94 3 187 23 24 3 63 107 150 3 28 139 100 3 69 133 110 3 67 151 104 3 20 94 204 3 105 23 187 3 79 101 145 3 11 98 191 3 103 194 161 3 102 161 193 3 75 93 131 3 65 145 101 3 95 119 181 3 181 119 68 3 194 32 161 3 193 161 34 3 5 63 150 3 4 154 5 3 121 72 196 3 94 171 108 3 75 148 93 3 72 221 196 3 11 191 104 3 46 138 99 3 65 137 136 3 84 94 175 3 75 179 99 3 74 111 109 3 28 100 144 3 61 176 96 3 104 151 128 3 103 221 154 3 70 162 96 3 62 96 162 3 20 149 94 3 71 97 163 3 221 72 154 3 4 150 118 3 85 137 127 3 76 127 137 3 60 163 97 3 85 136 137 3 19 20 204 3 64 125 134 3 82 134 125 3 4 5 150 3 70 96 168 3 73 109 111 3 70 173 108 3 63 154 115 3 28 29 139 3 32 33 161 3 33 34 161 3 125 146 205 3 71 175 149 3 65 101 137 3 71 164 97 3 50 51 106 3 79 156 101 3 7 127 126 3 9 115 166 3 92 201 105 3 68 105 201 3 66 101 156 3 70 116 173 3 9 166 143 3 94 149 175 3 5 154 63 3 8 97 195 3 53 54 110 3 75 99 170 3 89 205 146 3 11 157 98 3 86 133 158 3 46 99 179 3 66 210 101 3 75 131 179 3 90 166 121 3 73 111 152 3 72 121 166 3 59 60 113 3 74 146 111 3 77 107 128 3 77 100 139 3 42 43 112 3 77 151 100 3 65 136 109 3 67 199 100 3 64 146 125 3 67 100 151 3 8 195 135 3 86 181 106 3 83 120 157 3 76 126 127 3 74 113 132 3 78 134 117 3 82 117 134 3 27 28 144 3 85 113 136 3 74 136 113 3 46 47 138 3 92 105 187 3 67 104 170 3 23 105 141 3 53 110 133 3 106 181 201 3 54 167 110 3 120 213 157 3 55 56 117 3 71 163 124 3 70 123 162 3 60 132 113 3 95 135 195 3 6 61 155 3 6 155 10 3 73 153 109 3 71 149 114 3 7 126 62 3 102 196 221 3 63 202 107 3 73 152 186 3 90 140 143 3 81 143 140 3 59 113 174 3 18 204 108 3 7 58 127 3 51 169 106 3 86 106 169 3 82 110 167 3 36 37 130 3 21 114 149 3 72 115 154 3 50 106 189 3 68 159 105 3 17 116 183 3 74 109 136 3 13 14 121 3 94 108 204 3 38 39 122 3 101 210 137 3 43 200 112 3 75 170 191 3 85 174 113 3 77 150 107 3 37 165 130 3 88 142 145 3 79 145 142 3 25 178 187 3 106 201 189 3 66 120 210 3 93 112 200 3 64 160 152 3 7 162 123 3 8 132 60 3 16 17 183 3 44 45 131 3 59 124 163 3 92 189 201 3 76 137 210 3 84 124 214 3 65 109 153 3 52 53 133 3 68 201 181 3 8 135 216 3 105 159 141 3 104 191 170 3 64 134 160 3 77 139 118 3 97 119 195 3 15 16 140 3 7 123 180 3 87 130 165 3 20 21 149 3 18 19 204 3 24 25 187 3 18 108 173 3 70 108 171 3 64 111 146 3 84 214 180 3 7 62 162 3 80 141 159 3 59 163 60 3 69 110 188 3 87 160 130 3 78 130 160 3 64 152 111 3 57 
129 184 3 7 180 58 3 82 167 117 3 55 117 167 3 38 122 165 3 82 188 110 3 21 208 114 3 77 118 150 3 78 129 198 3 74 217 146 3 75 191 148 3 90 121 192 3 80 114 208 3 12 196 193 3 6 176 61 3 6 9 176 3 79 112 156 3 78 117 184 3 78 198 130 3 44 131 200 3 122 186 165 3 14 192 121 3 74 132 217 3 3 26 177 3 1 49 178 3 1 178 25 3 3 177 48 3 93 212 112 3 79 172 112 3 68 119 159 3 93 200 131 3 70 171 123 3 84 123 171 3 56 57 184 3 42 112 172 3 73 122 203 3 77 128 151 3 73 203 153 3 71 114 164 3 80 164 114 3 39 147 122 3 39 40 147 3 56 184 117 3 98 148 191 3 91 211 138 3 78 184 129 3 70 168 116 3 81 116 168 3 17 173 116 3 85 127 174 3 58 174 127 3 63 115 215 3 9 215 115 3 91 199 211 3 84 175 124 3 71 124 175 3 72 166 115 3 80 159 119 3 40 41 207 3 81 183 116 3 41 42 172 3 12 13 196 3 22 23 141 3 4 118 218 3 16 183 140 3 35 198 129 3 87 165 186 3 41 142 207 3 52 133 169 3 86 169 133 3 45 46 179 3 21 22 208 3 98 157 213 3 41 172 142 3 95 195 119 3 92 187 178 3 66 213 120 3 102 193 196 3 100 199 144 3 59 214 124 3 84 180 123 3 61 62 182 3 82 125 188 3 83 206 120 3 10 209 202 3 31 32 194 3 4 31 194 3 0 193 34 3 0 12 193 3 35 36 198 3 13 121 196 3 73 186 122 3 122 147 203 3 87 152 160 3 83 182 126 3 62 126 182 3 86 158 181 3 95 181 158 3 90 143 166 3 83 126 206 3 80 208 141 3 76 206 126 3 17 18 173 3 65 153 145 3 69 135 158 3 95 158 135 3 107 209 128 3 69 188 205 3 125 205 188 3 35 129 220 3 57 220 129 3 10 11 209 3 9 143 185 3 37 38 165 3 78 160 134 3 54 55 167 3 45 179 131 3 36 130 198 3 6 202 63 3 6 10 202 3 51 52 169 3 107 202 209 3 88 145 153 3 58 59 174 3 61 182 155 3 79 142 172 3 66 148 213 3 98 213 148 3 89 132 216 3 8 216 132 3 9 185 176 3 40 207 147 3 69 205 135 3 89 135 205 3 6 63 215 3 6 215 9 3 49 50 189 3 89 217 132 3 81 140 183 3 81 168 185 3 96 185 168 3 47 48 190 3 118 139 219 3 30 31 218 3 4 218 31 3 14 15 192 3 15 140 192 3 90 192 140 3 112 212 156 3 26 27 197 3 43 44 200 3 2 220 57 3 2 35 220 3 83 222 155 3 10 222 11 3 58 214 59 3 29 30 219 3 83 157 222 3 76 210 206 3 120 206 210 3 89 216 135 3 87 186 152 3 103 161 221 3 91 138 190 3 81 185 143 3 47 190 138 3 88 153 203 3 66 212 148 3 93 148 212 3 91 197 144 3 89 146 217 3 99 138 211 3 29 219 139 3 83 155 182 3 88 207 142 3 22 141 208 3 27 144 197 3 88 147 207 3 91 144 199 3 88 203 147 3 67 170 211 3 99 211 170 3 30 218 219 3 118 219 218 3 67 211 199 3 96 176 185 3 49 189 178 3 91 177 197 3 48 177 190 3 66 156 212 3 91 190 177 3 10 155 222 3 92 178 189 3 11 222 157 3 26 197 177 3 102 221 161 3 58 180 214 CELL_TYPES 469 1 1 1 1 1 1 1 1 1 1 1 1 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 From knepley at gmail.com Thu Nov 29 20:13:56 2018 From: knepley at gmail.com (Matthew Knepley) 
From knepley at gmail.com  Thu Nov 29 20:13:56 2018
From: knepley at gmail.com (Matthew Knepley)
Date: Thu, 29 Nov 2018 21:13:56 -0500
Subject: [petsc-users] [petsc-maint] Fwd: DMPlex global to natural problem
	using DmPlexGetVertexNumbering or DMPlexGlobalToNatural
In-Reply-To: <9dda31a1-8ec6-f571-1ceb-ed4a23426bac@gmail.com>
References: <16c44759-2ab2-e9ed-2c03-98f39738caa2@gmail.com>
	<9dda31a1-8ec6-f571-1ceb-ed4a23426bac@gmail.com>
Message-ID:

On Thu, Nov 29, 2018 at 7:40 PM Danyang Su via petsc-maint <petsc-maint at mcs.anl.gov> wrote:

> Dear PETSc developers & users,
>
> Sorry to bother you again. I just encountered some difficulties in DMPlex
> global to natural ordering. This is not a strictly necessary function in
> my code, but it is good to have in case someone wants to feed the code
> initial conditions or parameters from an external file using natural
> ordering.
>
> First I tried to use DMPlexGetVertexNumbering, which seems pretty
> straightforward, but I always get an error saying "You need a ISO C
> conforming compiler to use the glibc headers". I use gfortran on Linux.
>
> Then I switched to using a Label to save the global-natural order before
> distributing the mesh, and everything works fine. The only problem is
> that it takes a very long time during mesh distribution when the mesh is
> very big.
>
> Finally I tried the GlobalToNatural ordering: I set natural ordering to
> TRUE before calling DMPlexDistribute and then call DMPlexSetMigrationSF.
> However, the function DMPlexGlobalToNaturalEnd seems to do nothing, as
> the returned vector is always unchanged. I got some help from Josh, who
> has done similar work before, but I still cannot figure out what is
> wrong in the code.

Can you run the example that Blaise recommended? Sometimes it's easiest to
start with an example and change it into your code.

  Thanks,

     Matt

> Attached is the code section with the related functions included,
> together with a mesh and the output of the global and natural vectors.
> If something is wrong in the code, it should be in these four functions.
> The related functions are called in the following order.
>
> call solver_dd_create_dmplex
>
> call solver_dd_mapping_set_dmplex
>
> call solver_dd_DMDACreate_flow
>
> !call solver_dd_DMDACreate_react    (not used in the current testing)
>
> call solver_dd_mapping_global_natural
>
> I really appreciate your help.
>
> Regards,
>
> Danyang

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
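[For reference, the sequence Danyang describes maps onto PETSc calls
roughly as below. This is a minimal sketch in C (the Fortran interface
mirrors it), not Danyang's code, and it makes one assumption worth
checking: that a PetscSection is attached to the DM by the time the
natural SF is built, since DMPlexGlobalToNaturalBegin/End rely on an SF
derived from the section and the migration SF. Cleanup calls are omitted.]

    DM      dmDist = NULL;
    PetscSF sfMigration;
    Vec     gvec, nvec;

    ierr = DMSetUseNatural(dm, PETSC_TRUE);CHKERRQ(ierr);   /* before distribution */
    ierr = DMPlexDistribute(dm, 0, &sfMigration, &dmDist);CHKERRQ(ierr);
    if (dmDist) {
      ierr = DMPlexSetMigrationSF(dmDist, sfMigration);CHKERRQ(ierr);
      ierr = DMDestroy(&dm);CHKERRQ(ierr);
      dm   = dmDist;
    }
    ierr = DMCreateGlobalVector(dm, &gvec);CHKERRQ(ierr);
    ierr = VecDuplicate(gvec, &nvec);CHKERRQ(ierr);
    /* ... fill gvec in PETSc's global (distributed) ordering ... */
    ierr = DMPlexGlobalToNaturalBegin(dm, gvec, nvec);CHKERRQ(ierr);
    ierr = DMPlexGlobalToNaturalEnd(dm, gvec, nvec);CHKERRQ(ierr);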
From huq2090 at gmail.com  Thu Nov 29 20:26:28 2018
From: huq2090 at gmail.com (Fazlul Huq)
Date: Thu, 29 Nov 2018 20:26:28 -0600
Subject: [petsc-users] Problem with large grid size
In-Reply-To:
References:
Message-ID:

Thanks.

I have configured with 64-bit indices, and then when I run, I got the
following error message:
***********************************************************************************************************************
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Out of memory. This could be due to allocating
[0]PETSC ERROR: too large an object or bleeding by not properly
[0]PETSC ERROR: destroying unneeded objects.
[0]PETSC ERROR: Memory allocated 41632 Memory used by process 13361152
[0]PETSC ERROR: Try running with -malloc_dump or -malloc_log for info.
[0]PETSC ERROR: Memory requested 1614907707076
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.10.2, Oct, 09, 2018
[0]PETSC ERROR: ./poisson_m on a arch-linux2-c-debug named huq2090-XPS-15-9570 by huq2090 Thu Nov 29 20:24:13 2018
[0]PETSC ERROR: Configure options --download-hypre --download-mpich --with-64-bit-indices
[0]PETSC ERROR: #1 VecCreate_Seq() line 35 in /home/huq2090/petsc-3.10.2/src/vec/vec/impls/seq/bvec3.c
[0]PETSC ERROR: #2 PetscTrMallocDefault() line 183 in /home/huq2090/petsc-3.10.2/src/sys/memory/mtr.c
[0]PETSC ERROR: #3 PetscMallocA() line 397 in /home/huq2090/petsc-3.10.2/src/sys/memory/mal.c
[0]PETSC ERROR: #4 VecCreate_Seq() line 35 in /home/huq2090/petsc-3.10.2/src/vec/vec/impls/seq/bvec3.c
[0]PETSC ERROR: #5 VecSetType() line 51 in /home/huq2090/petsc-3.10.2/src/vec/vec/interface/vecreg.c
[0]PETSC ERROR: #6 VecSetTypeFromOptions_Private() line 1250 in /home/huq2090/petsc-3.10.2/src/vec/vec/interface/vector.c
[0]PETSC ERROR: #7 VecSetFromOptions() line 1284 in /home/huq2090/petsc-3.10.2/src/vec/vec/interface/vector.c
[0]PETSC ERROR: #8 main() line 57 in /home/huq2090/petsc-3.10.2/problems/ksp/poisson_m.c
[0]PETSC ERROR: No PETSc Option Table entries
[0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
application called MPI_Abort(MPI_COMM_WORLD, 55) - process 0
[unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=55
:
system msg for write_line failure : Bad file descriptor
************************************************************************************************************************************

Thanks.
Sincerely,
Huq

On Thu, Nov 29, 2018 at 1:51 PM Smith, Barry F. <bsmith at mcs.anl.gov> wrote:

> > On Nov 29, 2018, at 1:46 PM, Fazlul Huq via petsc-users <petsc-users at mcs.anl.gov> wrote:
> >
> > Hello PETSc Developers,
> >
> > I am trying to run the code (attached herewith) with the following
> > command and it works until the size of the matrix is 99999X99999. But
> > when I try to run with 999999X999999 then I got a weird result.
>
>    What is that "weird result"?
>
>    My guess is for problems that large you need to ./configure PETSc with
> the additional option --with-64-bit-indices since for such large problems
> 32 bit integers are not large enough to contain values needed for storing
> and accessing the sparse matrix.
>
>    Barry
>
> > The command is:
> > ./poisson_m -n 999999 -pc_type hypre -pc_hypre_type boomeramg -ksp_view_solution
> > Any suggestions are appreciated.
> >
> > Thanks.
> > Sincerely,
> > Huq

--
Fazlul Huq
Graduate Research Assistant
Department of Nuclear, Plasma & Radiological Engineering (NPRE)
University of Illinois at Urbana-Champaign (UIUC)
E-mail: huq2090 at gmail.com
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
From bsmith at mcs.anl.gov  Thu Nov 29 20:45:53 2018
From: bsmith at mcs.anl.gov (Smith, Barry F.)
Date: Fri, 30 Nov 2018 02:45:53 +0000
Subject: [petsc-users] Problem with large grid size
In-Reply-To:
References:
Message-ID: <4F8C4EDE-FA4E-4981-9079-A32C7325F212@mcs.anl.gov>

   You need to run it on more processors; this one processor doesn't have
enough memory to fit the vectors (which, by the way, are huge: the failed
request was for 1,614,907,707,076 bytes).

   Barry

> On Nov 29, 2018, at 8:26 PM, Fazlul Huq <huq2090 at gmail.com> wrote:
>
> [previous message quoted in full; elided]
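[An illustrative command line, not from the thread: since the stack trace
above shows the vectors coming from VecSetFromOptions on one process,
running the same binary under the MPICH that --download-mpich installed
lets the vectors and matrix distribute across ranks, e.g.

    mpiexec -n 4 ./poisson_m -n 999999 -pc_type hypre -pc_hypre_type boomeramg

so each of the four processes then only has to hold roughly a quarter of
the data.]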
From huq2090 at gmail.com  Thu Nov 29 20:54:27 2018
From: huq2090 at gmail.com (Fazlul Huq)
Date: Thu, 29 Nov 2018 20:54:27 -0600
Subject: [petsc-users] Problem with large grid size
In-Reply-To:
References:
Message-ID:

Sorry! I made a mistake in running the code.

It actually works now up to a matrix size of 9999999!

But when I try to extend one order more, it gives me this error message:
**************************************************************************************************************************
Out of memory trying to allocate 799999992 bytes
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: --------------------- Stack Frames ------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR:       INSTEAD the line number of the start of the function
[0]PETSC ERROR:       is given.
[0]PETSC ERROR: [0] HYPRE_SetupXXX line 322 /home/huq2090/petsc-3.10.2/src/ksp/pc/impls/hypre/hypre.c
[0]PETSC ERROR: [0] PCSetUp_HYPRE line 138 /home/huq2090/petsc-3.10.2/src/ksp/pc/impls/hypre/hypre.c
[0]PETSC ERROR: [0] PCSetUp line 894 /home/huq2090/petsc-3.10.2/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: [0] KSPSetUp line 304 /home/huq2090/petsc-3.10.2/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: [0] KSPSolve line 678 /home/huq2090/petsc-3.10.2/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Signal received
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.10.2, Oct, 09, 2018
[0]PETSC ERROR: ./poisson_m on a arch-linux2-c-debug named huq2090-XPS-15-9570 by huq2090 Thu Nov 29 20:47:50 2018
[0]PETSC ERROR: Configure options --download-hypre --download-mpich --with-64-bit-indices
[0]PETSC ERROR: #1 User provided function() line 0 in unknown file
application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
[unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=59
:
system msg for write_line failure : Bad file descriptor
**************************************************************************************************************************************

I am using a Dell XPS laptop with 32 GB RAM and an Intel 8th generation
Core i7 processor. Since the RAM is 32 GB, is there any way to fit an even
higher order problem on my machine?

Thanks.
Sincerely,
Huq

On Thu, Nov 29, 2018 at 8:26 PM Fazlul Huq <huq2090 at gmail.com> wrote:

> [previous messages in this thread quoted in full; elided]

--
Fazlul Huq
Graduate Research Assistant
Department of Nuclear, Plasma & Radiological Engineering (NPRE)
University of Illinois at Urbana-Champaign (UIUC)
E-mail: huq2090 at gmail.com
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
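[A back-of-the-envelope check on the numbers above, added for scale: the
failed request of 799999992 bytes is exactly 8 bytes x 99999999, i.e. a
single double-precision work vector of length n = 99999999, about 0.8 GB.
BoomerAMG's setup builds a hierarchy of progressively coarser matrices
plus several work vectors of this size, so the total footprint during
PCSetUp is a sizable multiple of any one such allocation.]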
From huq2090 at gmail.com  Thu Nov 29 20:55:35 2018
From: huq2090 at gmail.com (Fazlul Huq)
Date: Thu, 29 Nov 2018 20:55:35 -0600
Subject: [petsc-users] Problem with large grid size
In-Reply-To:
References:
Message-ID:

So, if I try to run the same code on a supercomputer, can I run with an
even larger matrix?

Thanks.
Sincerely,
Huq

On Thu, Nov 29, 2018 at 8:54 PM Fazlul Huq <huq2090 at gmail.com> wrote:

> [previous messages in this thread quoted in full; elided]

--
Fazlul Huq
Graduate Research Assistant
Department of Nuclear, Plasma & Radiological Engineering (NPRE)
University of Illinois at Urbana-Champaign (UIUC)
E-mail: huq2090 at gmail.com
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From bsmith at mcs.anl.gov  Thu Nov 29 21:00:14 2018
From: bsmith at mcs.anl.gov (Smith, Barry F.)
Date: Fri, 30 Nov 2018 03:00:14 +0000
Subject: [petsc-users] Problem with large grid size
In-Reply-To:
References:
Message-ID:

> On Nov 29, 2018, at 8:55 PM, Fazlul Huq <huq2090 at gmail.com> wrote:
>
> So, if I try to run the same code on a supercomputer, can I run with an
> even larger matrix?

   Within reason. But every time you increase n by 10 it will require more
than 100 times as much memory, so even on the biggest machine you are
limited in how large a problem you can run.

   Barry

> [remainder of the quoted thread elided]
From bsmith at mcs.anl.gov  Thu Nov 29 21:00:39 2018
From: bsmith at mcs.anl.gov (Smith, Barry F.)
Date: Fri, 30 Nov 2018 03:00:39 +0000
Subject: [petsc-users] Problem with large grid size
In-Reply-To:
References:
Message-ID: <05F3B4E9-335E-4F15-ABA3-1BF4481D2FB5@mcs.anl.gov>

   No

> On Nov 29, 2018, at 8:54 PM, Fazlul Huq <huq2090 at gmail.com> wrote:
>
> [Huq's question about fitting an even larger problem on the 32 GB
> laptop, quoted in full above; elided]

From huq2090 at gmail.com  Thu Nov 29 21:06:44 2018
From: huq2090 at gmail.com (Fazlul Huq)
Date: Thu, 29 Nov 2018 21:06:44 -0600
Subject: [petsc-users] Problem with large grid size
In-Reply-To: <05F3B4E9-335E-4F15-ABA3-1BF4481D2FB5@mcs.anl.gov>
References: <05F3B4E9-335E-4F15-ABA3-1BF4481D2FB5@mcs.anl.gov>
Message-ID:

Thanks a lot.

Sincerely,
Huq

On Thu, Nov 29, 2018 at 9:00 PM Smith, Barry F. <bsmith at mcs.anl.gov> wrote:

> [previous messages in this thread quoted in full; elided]

--
Fazlul Huq
Graduate Research Assistant
Department of Nuclear, Plasma & Radiological Engineering (NPRE)
University of Illinois at Urbana-Champaign (UIUC)
E-mail: huq2090 at gmail.com
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
I got some help from Josh > who > has done similar work before. But I still cannot figure out what is > wrong in the code. > > > Can you run the example that Blaise recommended? Sometimes its easiest > to start > with an example and change it into your code. I cannot compile ex26 due to missing exodusII.h file. Error information is make ex26 /home/dsu/Soft/PETSc/petsc-3.10.2/linux-gnu-dbg/bin/mpicc -o ex26.o -c -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector -fvisibility=hidden -g3 -I/home/dsu/Soft/PETSc/petsc-3.10.2/include -I/home/dsu/Soft/PETSc/petsc-3.10.2/linux-gnu-dbg/include `pwd`/ex26.c /home/dsu/Soft/PETSc/petsc-3.10.2/src/dm/impls/plex/examples/tests/ex26.c:10:22: fatal error: exodusII.h: No such file or directory compilation terminated. /home/dsu/Soft/PETSc/petsc-3.10.2/lib/petsc/conf/rules:359: recipe for target 'ex26.o' failed Anyway, I will look at this example to get familiar with GlobalToNatural related functions. Thanks, Danyang > > Thanks, > > Matt > > Attached is the code section with related functions included, > together > with a mesh and output of global vector and natural vector. If > something > is wrong in the code, it should be in these four functions. The > related > functions are called in the following order. > > call solver_dd_create_dmplex > > call solver_dd_mapping_set_dmplex > > call solver_dd_DMDACreate_flow > > !call solver_dd_DMDACreate_react Not used in current testing > > call solver_dd_mapping_global_natural > > I really appreciate your help. > > Regards, > > Danyang > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which > their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > -------------- next part -------------- An HTML attachment was scrubbed... URL: From barrydog505 at gmail.com Fri Nov 30 06:04:14 2018 From: barrydog505 at gmail.com (Tsung-Hsing Chen) Date: Fri, 30 Nov 2018 20:04:14 +0800 Subject: [petsc-users] How to get cell number from node number? Message-ID: Hi, Is there any function that can get cell number from the given node number exist already? Thank you, Barry -------------- next part -------------- An HTML attachment was scrubbed... URL: From stefano.zampini at gmail.com Fri Nov 30 06:33:32 2018 From: stefano.zampini at gmail.com (Stefano Zampini) Date: Fri, 30 Nov 2018 15:33:32 +0300 Subject: [petsc-users] How to get cell number from node number? In-Reply-To: References: Message-ID: Il giorno Ven 30 Nov 2018, 15:05 Tsung-Hsing Chen via petsc-users < petsc-users at mcs.anl.gov> ha scritto: > Hi, > Is there any function that can get cell number from the given node number > exist already? > Cell number of what? DMPLEX? And what is "node number"? > Thank you, > Barry > -------------- next part -------------- An HTML attachment was scrubbed... URL: From barrydog505 at gmail.com Fri Nov 30 07:18:04 2018 From: barrydog505 at gmail.com (barry) Date: Fri, 30 Nov 2018 21:18:04 +0800 Subject: [petsc-users] How to get cell number from node number? In-Reply-To: References: Message-ID: Sorry for the imprecisely description. I have an unstructured grid (DMPLEX) with triangle mesh. cell number (2 dim)={0};??? 
line number (1 dim)={1, 2, 3}; node number (0 dim)={4, 5, 6} Thank you, Barry On 11/30/18 8:33 PM, Stefano Zampini wrote: > > > Il giorno Ven 30 Nov 2018, 15:05 Tsung-Hsing Chen via petsc-users > > ha scritto: > > Hi, > Is there any function that can get cell number from the given node > number exist already? > > > Cell number of what? DMPLEX? And what is "node number"? > > > Thank you, > Barry > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Fri Nov 30 07:20:38 2018 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 30 Nov 2018 08:20:38 -0500 Subject: [petsc-users] RAW binary write In-Reply-To: References: Message-ID: On Fri, Nov 30, 2018 at 4:03 AM Sal Am wrote: > Hi Matthew, > > by raw I mean something the equivalent of pure C++ like > > std::fstream fout("Vector_b.bin",std::ios::out | std::ios::binary); > fout.write((char*)&b[i],sizeof(std::complex)); > fout.close(); //std::vector< std::complex > b > > i.e. without the PETSc format information, so I can use the resulting data > in other software. > That format would not let the user know how big the vector was. Moreover, other people would want other slightly different formats. The PETSc binary viewer is built from lower level PETSc functions which you could use to build whatever output you want. Thanks, Matt > Thanks > > > > On Thu, Nov 29, 2018 at 4:39 PM Matthew Knepley wrote: > >> On Thu, Nov 29, 2018 at 10:50 AM Sal Am via petsc-users < >> petsc-users at mcs.anl.gov> wrote: >> >>> Is there a way to write the solution from the system Ax=b in raw binary >>> instead of PETSc binary format? >>> >> >> What is "raw binary". You have to have some format. >> >> Matt >> >> >>> Currently I am doing: >>> ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, >>> "../../python/petscpy/Vector_x_petsc.dat", FILE_MODE_WRITE, >>> &viewer);CHKERRQ(ierr); >>> ierr = VecView(x,viewer);CHKERRQ(ierr);CHKERRQ(ierr); >>> ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr); >>> >>> And then use PetscBinaryIO to read it back and save it using >>> write('newformat', 'wb') to get to raw... however this approach is not good >>> it seems as there are some troubles with little/big endian when using the >>> resulting converted file on other systems for post-processing. >>> >>> Thanks, >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Fri Nov 30 07:28:07 2018 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 30 Nov 2018 08:28:07 -0500 Subject: [petsc-users] How to get cell number from node number? In-Reply-To: References: Message-ID: On Fri, Nov 30, 2018 at 8:19 AM barry via petsc-users < petsc-users at mcs.anl.gov> wrote: > Sorry for the imprecisely description. > > I have an unstructured grid (DMPLEX) with triangle mesh. > > cell number (2 dim)={0}; line number (1 dim)={1, 2, 3}; node number > (0 dim)={4, 5, 6} > > So, you have a 2D interpolated simplex mesh in DMPlex, which means it has vertices, edges, and cells. 
If you want the vertices on a given cell, you would get the closure and then filter out any points which are not vertices. For example, PetscInt *closure = NULL; PetscInt clSize, cl; ierr = DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd);CHKERRQ(ierr); ierr = DMPlexGetTransitiveClosure(dm, cell, PETSC_TRUE, &clSize, &closure);CHKERRQ(ierr); for (cl = 0; cl < clSize*2; cl += 2) { const PetscInt point = closure[cl]; if (point >= vStart && point < vEnd) { } } ierr = DMPlexRestoreTransitiveClosure(dm, cell, PETSC_TRUE, &clSize, &closure);CHKERRQ(ierr); If you do not use edges, then its much easier. Just call DMPlexGetCone(). If you do use edges, but also get the vertices all the time, you can make an index for this query, which will speed it up greatly at the cost of some memory. Thanks, Matt > Thank you, > > Barry > On 11/30/18 8:33 PM, Stefano Zampini wrote: > > > > Il giorno Ven 30 Nov 2018, 15:05 Tsung-Hsing Chen via petsc-users < > petsc-users at mcs.anl.gov> ha scritto: > >> Hi, >> Is there any function that can get cell number from the given node number >> exist already? >> > > Cell number of what? DMPLEX? And what is "node number"? > > >> Thank you, >> Barry >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From stefano.zampini at gmail.com Fri Nov 30 08:40:11 2018 From: stefano.zampini at gmail.com (Stefano Zampini) Date: Fri, 30 Nov 2018 17:40:11 +0300 Subject: [petsc-users] How to get cell number from node number? In-Reply-To: References: Message-ID: <3C550DA5-0797-4DF1-AE6F-9C0769219B86@gmail.com> Just to add to what Matt wrote: it seems your question is to get the cell numbers given a node (=vertex) numbering In general, there are many cells that share the same vertex. If you have a vertex id (vertex), you can traverse the cells that share that vertex by still using DMPlexGetTransitiveClosure, with a PETSC_FALSE argument to get the so-called ?star? of the point ierr = DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);CHKERRQ(ierr); ierr = DMPlexGetTransitiveClosure(dm, vertex, PETSC_FALSE, &clSize, &closure);CHKERRQ(ierr); for (cl = 0; cl < clSize*2; cl += 2) { const PetscInt point = closure[cl]; if (point >= cStart && point < cEnd) { } } ierr = DMPlexRestoreTransitiveClosure(dm, vertex, PETSC_FALSE, &clSize, &closure);CHKERRQ(ierr); > On Nov 30, 2018, at 4:28 PM, Matthew Knepley wrote: > > On Fri, Nov 30, 2018 at 8:19 AM barry via petsc-users > wrote: > Sorry for the imprecisely description. > I have an unstructured grid (DMPLEX) with triangle mesh. > > cell number (2 dim)={0}; line number (1 dim)={1, 2, 3}; node number (0 dim)={4, 5, 6} > > > So, you have a 2D interpolated simplex mesh in DMPlex, which means it has vertices, edges, and cells. > If you want the vertices on a given cell, you would get the closure and then filter out any points which are > not vertices. 
Thanks, Matt > Thank you, > Barry > On 11/30/18 8:33 PM, Stefano Zampini wrote: > > > > On Fri, Nov 30, 2018 at 15:05, Tsung-Hsing Chen via petsc-users < petsc-users at mcs.anl.gov> wrote: > >> Hi, >> Is there any function that can get the cell number from a given node number? >> > > Cell number of what? DMPLEX? And what is "node number"? > > >> Thank you, >> Barry >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From stefano.zampini at gmail.com Fri Nov 30 08:40:11 2018 From: stefano.zampini at gmail.com (Stefano Zampini) Date: Fri, 30 Nov 2018 17:40:11 +0300 Subject: [petsc-users] How to get cell number from node number? In-Reply-To: References: Message-ID: <3C550DA5-0797-4DF1-AE6F-9C0769219B86@gmail.com> Just to add to what Matt wrote: it seems your question is how to get the cell numbers given a node (=vertex) numbering. In general, there are many cells that share the same vertex. If you have a vertex id (vertex), you can traverse the cells that share that vertex by still using DMPlexGetTransitiveClosure, with a PETSC_FALSE argument to get the so-called "star" of the point:

ierr = DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);CHKERRQ(ierr);
ierr = DMPlexGetTransitiveClosure(dm, vertex, PETSC_FALSE, &clSize, &closure);CHKERRQ(ierr);
for (cl = 0; cl < clSize*2; cl += 2) {
  const PetscInt point = closure[cl];

  if (point >= cStart && point < cEnd) {
    /* point is a cell containing 'vertex' */
  }
}
ierr = DMPlexRestoreTransitiveClosure(dm, vertex, PETSC_FALSE, &clSize, &closure);CHKERRQ(ierr);

> On Nov 30, 2018, at 4:28 PM, Matthew Knepley wrote: > > On Fri, Nov 30, 2018 at 8:19 AM barry via petsc-users > wrote: > Sorry for the imprecise description. > I have an unstructured grid (DMPLEX) with a triangle mesh. > > cell number (2 dim)={0}; line number (1 dim)={1, 2, 3}; node number (0 dim)={4, 5, 6} > > So, you have a 2D interpolated simplex mesh in DMPlex, which means it has vertices, edges, and cells. > If you want the vertices on a given cell, you would get the closure and then filter out any points which are > not vertices. For example, > > PetscInt *closure = NULL; > PetscInt clSize, cl; > > ierr = DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd);CHKERRQ(ierr); > ierr = DMPlexGetTransitiveClosure(dm, cell, PETSC_TRUE, &clSize, &closure);CHKERRQ(ierr); > for (cl = 0; cl < clSize*2; cl += 2) { > const PetscInt point = closure[cl]; > > if (point >= vStart && point < vEnd) { > > } > } > ierr = DMPlexRestoreTransitiveClosure(dm, cell, PETSC_TRUE, &clSize, &closure);CHKERRQ(ierr); > > If you do not use edges, then it's much easier. Just call DMPlexGetCone(). If you do use edges, but also > get the vertices all the time, you can make an index for this query, which will speed it up greatly at the > cost of some memory. > > Thanks, > > Matt > Thank you, > > Barry > > On 11/30/18 8:33 PM, Stefano Zampini wrote: >> >> >> On Fri, Nov 30, 2018 at 15:05, Tsung-Hsing Chen via petsc-users > wrote: >> Hi, >> Is there any function that can get the cell number from a given node number? >> >> Cell number of what? DMPLEX? And what is "node number"? >> >> >> Thank you, >> Barry > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From barrydog505 at gmail.com Fri Nov 30 08:43:19 2018 From: barrydog505 at gmail.com (barry) Date: Fri, 30 Nov 2018 22:43:19 +0800 Subject: [petsc-users] How to get cell number from node number? In-Reply-To: References: Message-ID: <0937a998-af15-0fc1-bfe5-0171b343bd12@gmail.com> Thanks! On 11/30/18 9:28 PM, Matthew Knepley wrote: > On Fri, Nov 30, 2018 at 8:19 AM barry via petsc-users > > wrote: > > Sorry for the imprecise description. > > I have an unstructured grid (DMPLEX) with a triangle mesh. > > cell number (2 dim)={0}; line number (1 dim)={1, 2, 3}; node > number (0 dim)={4, 5, 6} > > So, you have a 2D interpolated simplex mesh in DMPlex, which means it > has vertices, edges, and cells. > If you want the vertices on a given cell, you would get the closure > and then filter out any points which are > not vertices. For example, > > PetscInt *closure = NULL; > PetscInt clSize, cl; > > ierr = DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd);CHKERRQ(ierr); > ierr = DMPlexGetTransitiveClosure(dm, cell, PETSC_TRUE, &clSize, > &closure);CHKERRQ(ierr); > for (cl = 0; cl < clSize*2; cl += 2) { > const PetscInt point = closure[cl]; > > if (point >= vStart && point < vEnd) { > > } > } > ierr = DMPlexRestoreTransitiveClosure(dm, cell, PETSC_TRUE, &clSize, > &closure);CHKERRQ(ierr); > > If you do not use edges, then it's much easier. Just call > DMPlexGetCone(). If you do use edges, but also > get the vertices all the time, you can make an index for this query, > which will speed it up greatly at the > cost of some memory. > > Thanks, > > Matt > > Thank you, > > Barry > > On 11/30/18 8:33 PM, Stefano Zampini wrote: >> >> >> On Fri, Nov 30, 2018 at 15:05, Tsung-Hsing Chen via petsc-users >> > wrote: >> >> Hi, >> Is there any function that can get the cell number from a given >> node number? >> >> >> Cell number of what? DMPLEX? And what is "node number"? >> >> >> Thank you, >> Barry >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which > their experiments lead.
> -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > -------------- next part -------------- An HTML attachment was scrubbed... URL: From barrydog505 at gmail.com Fri Nov 30 08:43:56 2018 From: barrydog505 at gmail.com (barry) Date: Fri, 30 Nov 2018 22:43:56 +0800 Subject: [petsc-users] How to get cell number from node number? In-Reply-To: <3C550DA5-0797-4DF1-AE6F-9C0769219B86@gmail.com> References: <3C550DA5-0797-4DF1-AE6F-9C0769219B86@gmail.com> Message-ID: <172664ce-0b91-d6eb-eeae-f90564a0f0c8@gmail.com> Thanks, I'll try it. Barry On 11/30/18 10:40 PM, Stefano Zampini wrote: > Just to add to what Matt wrote: it seems your question is how to get the > cell numbers given a node (=vertex) numbering. > In general, there are many cells that share the same vertex. If you > have a vertex id (vertex), you can traverse the cells that share that > vertex by still using DMPlexGetTransitiveClosure, with a PETSC_FALSE > argument to get the so-called "star" of the point: > > ierr = DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);CHKERRQ(ierr); > ierr = DMPlexGetTransitiveClosure(dm, vertex, PETSC_FALSE, &clSize, > &closure);CHKERRQ(ierr); > for (cl = 0; cl < clSize*2; cl += 2) { > const PetscInt point = closure[cl]; > > if (point >= cStart && point < cEnd) { > > } > } > ierr = DMPlexRestoreTransitiveClosure(dm, vertex, PETSC_FALSE, > &clSize, &closure);CHKERRQ(ierr); > > >> On Nov 30, 2018, at 4:28 PM, Matthew Knepley > > wrote: >> >> On Fri, Nov 30, 2018 at 8:19 AM barry via petsc-users >> > wrote: >> >> Sorry for the imprecise description. >> >> I have an unstructured grid (DMPLEX) with a triangle mesh. >> >> cell number (2 dim)={0}; line number (1 dim)={1, 2, 3}; >> node number (0 dim)={4, 5, 6} >> >> >> So, you have a 2D interpolated simplex mesh in DMPlex, which means it >> has vertices, edges, and cells. >> If you want the vertices on a given cell, you would get the closure >> and then filter out any points which are >> not vertices. For example, >> >> PetscInt *closure = NULL; >> PetscInt clSize, cl; >> >> ierr = DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd);CHKERRQ(ierr); >> ierr = DMPlexGetTransitiveClosure(dm, cell, PETSC_TRUE, &clSize, >> &closure);CHKERRQ(ierr); >> for (cl = 0; cl < clSize*2; cl += 2) { >> const PetscInt point = closure[cl]; >> >> if (point >= vStart && point < vEnd) { >> >> } >> } >> ierr = DMPlexRestoreTransitiveClosure(dm, cell, PETSC_TRUE, &clSize, >> &closure);CHKERRQ(ierr); >> >> If you do not use edges, then it's much easier. Just call >> DMPlexGetCone(). If you do use edges, but also >> get the vertices all the time, you can make an index for this query, >> which will speed it up greatly at the >> cost of some memory. >> >> Thanks, >> >> Matt >> >> Thank you, >> >> Barry >> >> On 11/30/18 8:33 PM, Stefano Zampini wrote: >>> >>> >>> On Fri, Nov 30, 2018 at 15:05, Tsung-Hsing Chen via >>> petsc-users >> > wrote: >>> >>> Hi, >>> Is there any function that can get the cell number from a >>> given node number? >>> >>> >>> Cell number of what? DMPLEX? And what is "node number"? >>> >>> >>> Thank you, >>> Barry >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which >> their experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> > -------------- next part -------------- An HTML attachment was scrubbed...
URL: From alice.raeli at polito.it Fri Nov 30 08:50:48 2018 From: alice.raeli at polito.it (RAELI ALICE) Date: Fri, 30 Nov 2018 15:50:48 +0100 Subject: [petsc-users] PETSC address vector c++ access Message-ID: An HTML attachment was scrubbed... URL: From benoit.nennig at supmeca.fr Fri Nov 30 09:18:50 2018 From: benoit.nennig at supmeca.fr (NENNIG Benoit) Date: Fri, 30 Nov 2018 15:18:50 +0000 Subject: [petsc-users] create a block matrix from existing petsc matrices and/or vectors Message-ID: <1543591130109.22345@supmeca.fr> Dear petsc users, I have a parallel matrix A (mpiaij) and I would like to create a matrix B like B = [A v; wT 0] where wT and v are vectors. A is involved in an eigenvalue computation (slepc4py) and B will be used by a direct solver. Currently, the matrix A is built from a sparse scipy matrix thanks to A.createAIJ, but I would like to avoid creating B from scipy. I am looking for a more _petscic_ way to solve this problem. I have read that it seems not possible to change a matrix structure after its assembly. However, - Is there a way to build this kind of block matrix B directly in petsc, reusing A? - Is there a way to convert the petsc matrix A into csr form and to create B as a new petsc matrix? - Is there a way to copy a matrix into another one with a different structure? - ... My second question is similar: suppose I have three matrices A1, A2, C resulting from coupling 2 FEM problems, what is a good way to create the coupled global system B = [A1 C; CT A2] to use it with a direct solver. Thanks a lot, Benoit From tempohoper at gmail.com Fri Nov 30 03:03:42 2018 From: tempohoper at gmail.com (Sal Am) Date: Fri, 30 Nov 2018 09:03:42 +0000 Subject: [petsc-users] RAW binary write In-Reply-To: References: Message-ID: Hi Matthew, by raw I mean the equivalent of pure C++, like std::fstream fout("Vector_b.bin",std::ios::out | std::ios::binary); fout.write((char*)&b[i],sizeof(std::complex)); fout.close(); //std::vector< std::complex > b i.e. without the PETSc format information, so I can use the resulting data in other software. Thanks On Thu, Nov 29, 2018 at 4:39 PM Matthew Knepley wrote: > On Thu, Nov 29, 2018 at 10:50 AM Sal Am via petsc-users < > petsc-users at mcs.anl.gov> wrote: > >> Is there a way to write the solution from the system Ax=b in raw binary >> instead of PETSc binary format? >> > > What is "raw binary"? You have to have some format. > > Matt > > >> Currently I am doing: >> ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, >> "../../python/petscpy/Vector_x_petsc.dat", FILE_MODE_WRITE, >> &viewer);CHKERRQ(ierr); >> ierr = VecView(x,viewer);CHKERRQ(ierr); >> ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr); >> >> And then use PetscBinaryIO to read it back and save it using >> write('newformat', 'wb') to get to raw... however this approach does not seem good, >> as there is some trouble with little/big endian when using the >> resulting converted file on other systems for post-processing. >> >> Thanks, >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed...
URL: From jed at jedbrown.org Fri Nov 30 10:46:21 2018 From: jed at jedbrown.org (Jed Brown) Date: Fri, 30 Nov 2018 11:46:21 -0500 Subject: [petsc-users] Problem with large grid size In-Reply-To: <4F8C4EDE-FA4E-4981-9079-A32C7325F212@mcs.anl.gov> References: <4F8C4EDE-FA4E-4981-9079-A32C7325F212@mcs.anl.gov> Message-ID: <87k1kuikde.fsf@jedbrown.org> "Smith, Barry F. via petsc-users" writes: > You need to run it on more processors, this one processor doesn't have enough memory to fit the vectors (which by the way are huge 1,614,907,707,076) This is just a tridiagonal problem; I don't know why the vectors would be huge when the problem dimension is only a million. Fazlul, is this your target problem? The tridiagonal problem can be solved trivially using Cholesky (-pc_type cholesky; or LU) -- the n=10^7 case takes perhaps a second in serial (and this can be made faster). The convergence is slow with local preconditioners because information travels only one element at a time. Note that I don't get any memory-related error messages with a 32-bit build of PETSc, but perhaps I'm not running the same way as you are. The errors reported by your code are wrong because your stated exact solution and RHS are not actually compatible. > Barry > > >> On Nov 29, 2018, at 8:26 PM, Fazlul Huq wrote: >> >> Thanks. >> >> I have configured with 64-bit and then when I run, I got the following error message: >> *********************************************************************************************************************** >> [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- >> [0]PETSC ERROR: Out of memory. This could be due to allocating >> [0]PETSC ERROR: too large an object or bleeding by not properly >> [0]PETSC ERROR: destroying unneeded objects. >> [0]PETSC ERROR: Memory allocated 41632 Memory used by process 13361152 >> [0]PETSC ERROR: Try running with -malloc_dump or -malloc_log for info. >> [0]PETSC ERROR: Memory requested 1614907707076 >> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. 
>> [0]PETSC ERROR: Petsc Release Version 3.10.2, Oct, 09, 2018 >> [0]PETSC ERROR: ./poisson_m on a arch-linux2-c-debug named huq2090-XPS-15-9570 by huq2090 Thu Nov 29 20:24:13 2018 >> [0]PETSC ERROR: Configure options --download-hypre --download-mpich --with-64-bit-indices >> [0]PETSC ERROR: #1 VecCreate_Seq() line 35 in /home/huq2090/petsc-3.10.2/src/vec/vec/impls/seq/bvec3.c >> [0]PETSC ERROR: #2 PetscTrMallocDefault() line 183 in /home/huq2090/petsc-3.10.2/src/sys/memory/mtr.c >> [0]PETSC ERROR: #3 PetscMallocA() line 397 in /home/huq2090/petsc-3.10.2/src/sys/memory/mal.c >> [0]PETSC ERROR: #4 VecCreate_Seq() line 35 in /home/huq2090/petsc-3.10.2/src/vec/vec/impls/seq/bvec3.c >> [0]PETSC ERROR: #5 VecSetType() line 51 in /home/huq2090/petsc-3.10.2/src/vec/vec/interface/vecreg.c >> [0]PETSC ERROR: #6 VecSetTypeFromOptions_Private() line 1250 in /home/huq2090/petsc-3.10.2/src/vec/vec/interface/vector.c >> [0]PETSC ERROR: #7 VecSetFromOptions() line 1284 in /home/huq2090/petsc-3.10.2/src/vec/vec/interface/vector.c >> [0]PETSC ERROR: #8 main() line 57 in /home/huq2090/petsc-3.10.2/problems/ksp/poisson_m.c >> [0]PETSC ERROR: No PETSc Option Table entries >> [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- >> application called MPI_Abort(MPI_COMM_WORLD, 55) - process 0 >> [unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=55 >> : >> system msg for write_line failure : Bad file descriptor >> ************************************************************************************************************************************ >> >> Thanks. >> Sincerely, >> Huq >> >> On Thu, Nov 29, 2018 at 1:51 PM Smith, Barry F. wrote: >> >> >> > On Nov 29, 2018, at 1:46 PM, Fazlul Huq via petsc-users wrote: >> > >> > Hello PETSc Developers, >> > >> > I am trying to run the code (attached herewith) with the following command and it works until the size of the matrix is 99999X99999. But when I try to run with 999999X999999 then I got weird result. >> >> What is that "weird result"? >> >> My guess is for problems that large you need to ./configure PETSc with the additional option --with-64-bit-indices since for such large problems 32 bit integers are not large enough to contain values needed for storing and accessing the sparse matrix. >> >> Barry >> >> > >> > The command is: >> > ./poisson_m -n 999999 -pc_type hypre -pc_hypre_type boomeramg -ksp_view_solution >> > Any suggestions is appreciated. >> > >> > Thanks. >> > Sincerely, >> > Huq >> > >> > >> > -- >> > >> > Fazlul Huq >> > Graduate Research Assistant >> > Department of Nuclear, Plasma & Radiological Engineering (NPRE) >> > University of Illinois at Urbana-Champaign (UIUC) >> > E-mail: huq2090 at gmail.com >> > >> >> >> >> -- >> >> Fazlul Huq >> Graduate Research Assistant >> Department of Nuclear, Plasma & Radiological Engineering (NPRE) >> University of Illinois at Urbana-Champaign (UIUC) >> E-mail: huq2090 at gmail.com From knepley at gmail.com Fri Nov 30 10:53:57 2018 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 30 Nov 2018 11:53:57 -0500 Subject: [petsc-users] PETSC address vector c++ access In-Reply-To: References: Message-ID: On Fri, Nov 30, 2018 at 9:50 AM RAELI ALICE via petsc-users < petsc-users at mcs.anl.gov> wrote: > Hi All, > My team is working on a PETSC version of an existent code. > In order to convert the main part of this work retaining the c++ levels of > abstraction, > we would access to c++ vector data structures easily. 
> > We would like to use the memory area allocated using c++ as a Petsc Vector. > > A supposed pseudocode could be: > >
> > int data[] = { 1,2,3,4,5,6,7,8,9 };
> > unsigned int sizeVectorPetsC = data.size();
> > Vec v(data, sizeVectorPetsC);   (can it exist with easy-access petsc routines?)
> > cout << "The mapped vector v is: [";
> > PetscScalar *vecArray;
> > PetscGetArray(v, &vecArray);
> > for (unsigned int i = 0; i < sizeVectorPetsC; i++) {
> >   (the information of v is the information of data:
> >    the same memory is read by both data structures)
> > }
> > PetscRestoreArray(v, &vecArray);
> > cout << "]" << endl;
> >
> > Is this kind of duality between c++ and Petsc objects provided, without > re-copying the concerned information? > Yes, I think you want https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Vec/VecCreateSeqWithArray.html https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Vec/VecCreateMPIWithArray.html#VecCreateMPIWithArray and you can even do advanced things with the storage https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Vec/VecPlaceArray.html#VecPlaceArray but you need to be more careful.
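For the sequential case that might look like this (an untested sketch; note the array must be of type PetscScalar, and the Vec neither copies nor frees it):

  PetscScalar data[9] = {1, 2, 3, 4, 5, 6, 7, 8, 9};
  Vec         v;

  ierr = VecCreateSeqWithArray(PETSC_COMM_SELF, 1, 9, data, &v);CHKERRQ(ierr);
  /* v now reads and writes 'data' directly: what you see through
     VecGetArray()/VecRestoreArray() is the same memory */
  ierr = VecDestroy(&v);CHKERRQ(ierr); /* 'data' itself is not freed */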
Let me know if this does not do what you wanted. Thanks, Matt > Thank you, > Alice > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From huq2090 at gmail.com Fri Nov 30 11:27:20 2018 From: huq2090 at gmail.com (Fazlul Huq) Date: Fri, 30 Nov 2018 11:27:20 -0600 Subject: [petsc-users] Problem with large grid size In-Reply-To: <87k1kuikde.fsf@jedbrown.org> References: <4F8C4EDE-FA4E-4981-9079-A32C7325F212@mcs.anl.gov> <87k1kuikde.fsf@jedbrown.org> Message-ID: Thanks. Now I can run up to n=10^7 with -pc_type cholesky as well as -pc_type hypre -pc_hypre_type boomeramg. Actually, I am trying to find how much time and how many FLOPs it takes to solve a problem using multigrid for different problem sizes. The problem I am trying to solve is attached herewith. I am not quite sure whether the way I have set the boundary conditions is correct or not! (Because BCs are given for x0 and xn, I am solving for x1 to x(n-1) but setting BCs of x0 and xn.) It would be a great help if you could refer me to some books that I can go through to make my understanding clearer. Thanks. Sincerely, Huq On Fri, Nov 30, 2018 at 10:46 AM Jed Brown wrote: > "Smith, Barry F. via petsc-users" writes: > > > You need to run it on more processors, this one processor doesn't > have enough memory to fit the vectors (which by the way are huge > 1,614,907,707,076) > > This is just a tridiagonal problem; I don't know why the vectors would > be huge when the problem dimension is only a million. > > Fazlul, is this your target problem? The tridiagonal problem can be > solved trivially using Cholesky (-pc_type cholesky; or LU) -- the n=10^7 > case takes perhaps a second in serial (and this can be made faster). > The convergence is slow with local preconditioners because information > travels only one element at a time. Note that I don't get any > memory-related error messages with a 32-bit build of PETSc, but perhaps > I'm not running the same way as you are. > > The errors reported by your code are wrong because your stated exact > solution and RHS are not actually compatible. > > Barry > > > >> On Nov 29, 2018, at 8:26 PM, Fazlul Huq wrote: > >> > >> Thanks. > >> > >> I have configured with 64-bit and then when I run, I got the following > error message: > >> *********************************************************************************************************************** > >> [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > >> [0]PETSC ERROR: Out of memory. This could be due to allocating > >> [0]PETSC ERROR: too large an object or bleeding by not properly > >> [0]PETSC ERROR: destroying unneeded objects. > >> [0]PETSC ERROR: Memory allocated 41632 Memory used by process 13361152 > >> [0]PETSC ERROR: Try running with -malloc_dump or -malloc_log for info. > >> [0]PETSC ERROR: Memory requested 1614907707076 > >> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html > for trouble shooting. > >> [0]PETSC ERROR: Petsc Release Version 3.10.2, Oct, 09, 2018 > >> [0]PETSC ERROR: ./poisson_m on a arch-linux2-c-debug named > huq2090-XPS-15-9570 by huq2090 Thu Nov 29 20:24:13 2018 > >> [0]PETSC ERROR: Configure options --download-hypre --download-mpich > --with-64-bit-indices > >> [0]PETSC ERROR: #1 VecCreate_Seq() line 35 in > /home/huq2090/petsc-3.10.2/src/vec/vec/impls/seq/bvec3.c > >> [0]PETSC ERROR: #2 PetscTrMallocDefault() line 183 in > /home/huq2090/petsc-3.10.2/src/sys/memory/mtr.c > >> [0]PETSC ERROR: #3 PetscMallocA() line 397 in > /home/huq2090/petsc-3.10.2/src/sys/memory/mal.c > >> [0]PETSC ERROR: #4 VecCreate_Seq() line 35 in > /home/huq2090/petsc-3.10.2/src/vec/vec/impls/seq/bvec3.c > >> [0]PETSC ERROR: #5 VecSetType() line 51 in > /home/huq2090/petsc-3.10.2/src/vec/vec/interface/vecreg.c > >> [0]PETSC ERROR: #6 VecSetTypeFromOptions_Private() line 1250 in > /home/huq2090/petsc-3.10.2/src/vec/vec/interface/vector.c > >> [0]PETSC ERROR: #7 VecSetFromOptions() line 1284 in > /home/huq2090/petsc-3.10.2/src/vec/vec/interface/vector.c > >> [0]PETSC ERROR: #8 main() line 57 in > /home/huq2090/petsc-3.10.2/problems/ksp/poisson_m.c > >> [0]PETSC ERROR: No PETSc Option Table entries > >> [0]PETSC ERROR: ----------------End of Error Message -------send entire > error message to petsc-maint at mcs.anl.gov---------- > >> application called MPI_Abort(MPI_COMM_WORLD, 55) - process 0 > >> [unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=55 > >> : > >> system msg for write_line failure : Bad file descriptor > >> ************************************************************************************************************************************ > >> > >> Thanks. > >> Sincerely, > >> Huq > >> > >> On Thu, Nov 29, 2018 at 1:51 PM Smith, Barry F. wrote: > >> > >> > >> > On Nov 29, 2018, at 1:46 PM, Fazlul Huq via petsc-users < > petsc-users at mcs.anl.gov> wrote: > >> > > >> > Hello PETSc Developers, > >> > > >> > I am trying to run the code (attached herewith) with the following > command and it works until the size of the matrix is 99999X99999. But when > I try to run with 999999X999999 then I got a weird result. > >> > >> What is that "weird result"? > >> > >> My guess is for problems that large you need to ./configure PETSc > with the additional option --with-64-bit-indices since for such large > problems 32 bit integers are not large enough to contain values needed for > storing and accessing the sparse matrix.
> >> > >> Barry > >> > >> > > >> > The command is: > >> > ./poisson_m -n 999999 -pc_type hypre -pc_hypre_type boomeramg > -ksp_view_solution > >> > Any suggestions is appreciated. > >> > > >> > Thanks. > >> > Sincerely, > >> > Huq > >> > > >> > > >> > -- > >> > > >> > Fazlul Huq > >> > Graduate Research Assistant > >> > Department of Nuclear, Plasma & Radiological Engineering (NPRE) > >> > University of Illinois at Urbana-Champaign (UIUC) > >> > E-mail: huq2090 at gmail.com > >> > > >> > >> > >> > >> -- > >> > >> Fazlul Huq > >> Graduate Research Assistant > >> Department of Nuclear, Plasma & Radiological Engineering (NPRE) > >> University of Illinois at Urbana-Champaign (UIUC) > >> E-mail: huq2090 at gmail.com > -- Fazlul Huq Graduate Research Assistant Department of Nuclear, Plasma & Radiological Engineering (NPRE) University of Illinois at Urbana-Champaign (UIUC) E-mail: huq2090 at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: poisson_problem.jpg Type: image/jpeg Size: 2321815 bytes Desc: not available URL: From sajidsyed2021 at u.northwestern.edu Fri Nov 30 13:30:06 2018 From: sajidsyed2021 at u.northwestern.edu (Sajid Ali) Date: Fri, 30 Nov 2018 13:30:06 -0600 Subject: [petsc-users] Error in vec/vec/examples/tutorials/ex19.c Message-ID: Hi, I tried running ex19.c and I get the following error (I added a small snippet to print the local size on rank0 as well) : [sajid at xrm temp]$ mpirun -np 2 ./ex19 local_sizes : 3 4 [1]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [1]PETSC ERROR: Arguments are incompatible [1]PETSC ERROR: Incompatible vector local lengths parameter # 1 local size 3 != parameter # 2 local size 2 [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. [1]PETSC ERROR: Petsc Release Version 3.10.2, unknown [1]PETSC ERROR: ./ex19 on a named xrm by sajid Fri Nov 30 13:28:36 2018 [1]PETSC ERROR: [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [0]PETSC ERROR: Arguments are incompatible [0]PETSC ERROR: Incompatible vector local lengths parameter # 1 local size 3 != parameter # 2 local size 4 [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. [0]PETSC ERROR: Petsc Release Version 3.10.2, unknown [0]PETSC ERROR: ./ex19 on a named xrm by sajid Fri Nov 30 13:28:36 2018 Does this mean that PETSC_DECIDE is inconsistent between two calls in the same file ? Thank You, Sajid Ali Applied Physics Northwestern University -------------- next part -------------- An HTML attachment was scrubbed... URL: From dave.mayhem23 at gmail.com Fri Nov 30 13:40:34 2018 From: dave.mayhem23 at gmail.com (Dave May) Date: Fri, 30 Nov 2018 19:40:34 +0000 Subject: [petsc-users] PETSC address vector c++ access In-Reply-To: References: Message-ID: On Fri, 30 Nov 2018 at 14:50, RAELI ALICE via petsc-users < petsc-users at mcs.anl.gov> wrote: > Hi All, > My team is working on a PETSC version of an existent code. > In order to convert the main part of this work retaining the c++ levels of > abstraction, > we would access to c++ vector data structures easily. 
> > We would like to use the memory area allocated using c++ as a Petsc Vector. > A supposed pseudocode could be: > >
> int data[] = { 1,2,3,4,5,6,7,8,9 };
> To avoid any confusion (or disappointment) later, I want to point out that your pseudo code is wrong (specifically the line above) and in detail would result in a SEGV. A PETSc Vec object can ONLY represent PetscScalar data types. It may well have been a typo on your part, but I think it's important to emphasise that the Vec object does not try to mimic a C++ template with the data type as an argument. Thanks, Dave
> > unsigned int sizeVectorPetsC = data.size();
> > Vec v(data, sizeVectorPetsC);   (can it exist with easy-access petsc routines?)
> > cout << "The mapped vector v is: [";
> > PetscScalar *vecArray;
> > PetscGetArray(v, &vecArray);
> > for (unsigned int i = 0; i < sizeVectorPetsC; i++) {
> >   (the information of v is the information of data:
> >    the same memory is read by both data structures)
> > }
> > PetscRestoreArray(v, &vecArray);
> > cout << "]" << endl;
> >
> > Is this kind of duality between c++ and Petsc objects provided, without > re-copying the concerned information? > > Thank you, > Alice > -------------- next part -------------- An HTML attachment was scrubbed... URL: From sajidsyed2021 at u.northwestern.edu Fri Nov 30 16:27:02 2018 From: sajidsyed2021 at u.northwestern.edu (Sajid Ali) Date: Fri, 30 Nov 2018 16:27:02 -0600 Subject: [petsc-users] Load dense matrices from hdf5 Message-ID: Hi, I'm trying to solve the Helmholtz equation in 1D for x-rays, which looks roughly like: u_dot = a*u_xx + h(t)*u I've already implemented the case where h(t) is always zero (free-space) in PETSc, as per the last box on page 156 of the manual. For the time-dependent version, the next step is to modify the RHS matrix by adding h(t) via MatDiagonalSet&Insert. (With h(t)=0, the RHS matrix is just a (scaled) Laplacian matrix.) The h(t) is related to the x-ray refractive index at that time. These x-ray indices need to be (preferably) stored in a matrix beforehand. Each column of the matrix would be the refractive index to be used in one iteration. Using hdf5 is something I'd like to do because I can make/view the grid using numpy in python and solve the PDE using PETSc in C. PS: I've seen past questions on this thread strongly discouraging users from storing dense matrices in hdf5 format. PPS: This 1D x-ray scattering problem is just a stepping stone to doing the same in 2D, so if there's a better approach to adopt, I'd be open to new ideas. Thank You, Sajid Ali Applied Physics Northwestern University -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Fri Nov 30 16:47:31 2018 From: bsmith at mcs.anl.gov (Smith, Barry F.) Date: Fri, 30 Nov 2018 22:47:31 +0000 Subject: [petsc-users] Load dense matrices from hdf5 In-Reply-To: References: Message-ID: <55E8F1B5-8FB9-4889-A2D8-EE2F65E3B56B@anl.gov> I would start by simply using the PETSc binary dense format for the matrices; it will work fine for 1d (and 2d) development work with dozens of processes. Later, if need be, you can switch to HDF5; the only difference in your code would be that the viewer you MatLoad() from will be an HDF5 file instead of a PETSc binary file. The format of a PETSc binary dense file is 4 integer values (the header) followed by the matrix as one contiguous array of doubles (stored by row). The header is of the form MAT_FILE_CLASSID, M, N, -1 (where M and N are the number of rows and columns in the matrix).
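For example, an untested sketch of writing an M by N row-major array vals of PetscScalar in this format, using PETSc's own low-level binary routines (which do the byte-swapping, since PETSc binary files are big-endian; MAT_FILE_CLASSID comes from petscmat.h), would be something like:

  int      fd;
  PetscInt hdr[4];

  hdr[0] = MAT_FILE_CLASSID; hdr[1] = M; hdr[2] = N; hdr[3] = -1;
  ierr = PetscBinaryOpen("indices.dat", FILE_MODE_WRITE, &fd);CHKERRQ(ierr);
  ierr = PetscBinaryWrite(fd, hdr, 4, PETSC_INT, PETSC_FALSE);CHKERRQ(ierr);
  ierr = PetscBinaryWrite(fd, vals, M*N, PETSC_SCALAR, PETSC_FALSE);CHKERRQ(ierr);
  ierr = PetscBinaryClose(fd);CHKERRQ(ierr);

(The file name indices.dat is just an example. In a complex build, PETSC_SCALAR writes each entry as an interlaced real and imaginary pair.)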
You can easily create this file from Python (which I assume is where you are getting your h(t) values from). Barry > On Nov 30, 2018, at 4:27 PM, Sajid Ali via petsc-users wrote: > > Hi, > > I'm trying to solve the Helmholtz equation in 1D for x-rays, which looks roughly like: > > u_dot = a*u_xx + h(t)*u > > I've already implemented the case where h(t) is always zero (free-space) in PETSc, as per the last box on page 156 of the manual. > > For the time-dependent version, the next step is to modify the RHS matrix by adding h(t) via MatDiagonalSet&Insert. (With h(t)=0, the RHS matrix is just a (scaled) Laplacian matrix.) > > The h(t) is related to the x-ray refractive index at that time. These x-ray indices need to be (preferably) stored in a matrix beforehand. Each column of the matrix would be the refractive index to be used in one iteration. Using hdf5 is something I'd like to do because I can make/view the grid using numpy in python and solve the PDE using PETSc in C. > > PS: I've seen past questions on this thread strongly discouraging users from storing dense matrices in hdf5 format. > > PPS: This 1D x-ray scattering problem is just a stepping stone to doing the same in 2D, so if there's a better approach to adopt, I'd be open to new ideas. > > Thank You, > Sajid Ali > Applied Physics > Northwestern University From sajidsyed2021 at u.northwestern.edu Fri Nov 30 16:59:11 2018 From: sajidsyed2021 at u.northwestern.edu (Sajid Ali) Date: Fri, 30 Nov 2018 16:59:11 -0600 Subject: [petsc-users] Load dense matrices from hdf5 In-Reply-To: <55E8F1B5-8FB9-4889-A2D8-EE2F65E3B56B@anl.gov> References: <55E8F1B5-8FB9-4889-A2D8-EE2F65E3B56B@anl.gov> Message-ID: Thanks for the advice! I'll look into using the binary format then. On Fri, Nov 30, 2018 at 4:47 PM Smith, Barry F. wrote: > > I would start by simply using the PETSc binary dense format for the > matrices; it will work fine for 1d (and 2d) development work with dozens of > processes. Later, if need be, you can switch to HDF5; the only > difference in your code would be that the viewer you MatLoad() from will be an > HDF5 file instead of a PETSc binary file. > > The format of a PETSc binary dense file is > > 4 integer values (the header) followed by the matrix as one contiguous > array of doubles (stored by row). The header is of the form > > MAT_FILE_CLASSID, M, N, -1 (where M and N are the number of rows and > columns in the matrix). > > You can easily create this file from Python (which I assume is where > you are getting your h(t) values from). > > Barry > > > > On Nov 30, 2018, at 4:27 PM, Sajid Ali via petsc-users < > petsc-users at mcs.anl.gov> wrote: > > > > Hi, > > > > I'm trying to solve the Helmholtz equation in 1D for x-rays, which > looks roughly like: > > > > u_dot = a*u_xx + h(t)*u > > > > I've already implemented the case where h(t) is always zero (free-space) > in PETSc, as per the last box on page 156 of the manual. > > > > For the time-dependent version, the next step is to modify > the RHS matrix by adding h(t) via MatDiagonalSet&Insert. (With h(t)=0, the > RHS matrix is just a (scaled) Laplacian matrix.) > > > > The h(t) is related to the x-ray refractive index at that time. These x-ray > indices need to be (preferably) stored in a matrix beforehand. Each column of > the matrix would be the refractive index to be used in one iteration. Using > hdf5 is something I'd like to do because I can make/view the grid using > numpy in python and solve the PDE using PETSc in C.
> > > > PS: I've seen past questions on this thread strongly discouraging users > from storing dense matrices in hdf5 format. > > > > PPS: This 1D x-ray scattering problem is just a stepping stone to doing > the same in 2D, so if there's a better approach to adopt, I'd be open to new > ideas. > > > > Thank You, > > Sajid Ali > > Applied Physics > > Northwestern University > > -- Sajid Ali Applied Physics Northwestern University -------------- next part -------------- An HTML attachment was scrubbed... URL: From sajidsyed2021 at u.northwestern.edu Fri Nov 30 17:04:32 2018 From: sajidsyed2021 at u.northwestern.edu (Sajid Ali) Date: Fri, 30 Nov 2018 17:04:32 -0600 Subject: [petsc-users] Load dense matrices from hdf5 In-Reply-To: References: <55E8F1B5-8FB9-4889-A2D8-EE2F65E3B56B@anl.gov> Message-ID: If the matrix is filled with complex numbers, is each complex number stored as a sequence of doubles? Or is it better to split the matrix into real/imaginary parts and store each part separately? -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Fri Nov 30 17:10:08 2018 From: bsmith at mcs.anl.gov (Smith, Barry F.) Date: Fri, 30 Nov 2018 23:10:08 +0000 Subject: [petsc-users] Load dense matrices from hdf5 In-Reply-To: References: <55E8F1B5-8FB9-4889-A2D8-EE2F65E3B56B@anl.gov> Message-ID: I assume you ./configure PETSc with --with-scalar-type=complex? If so, the values in the file are the real and imaginary parts interlaced. That is, r0 i0 r1 i1 ..., where r0 is the first matrix entry's real part and i0 is the first entry's imaginary part. Barry > On Nov 30, 2018, at 5:04 PM, Sajid Ali wrote: > > If the matrix is filled with complex numbers, is each complex number stored as a sequence of doubles? Or is it better to split the matrix into real/imaginary parts and store each part separately? From bsmith at mcs.anl.gov Fri Nov 30 17:17:12 2018 From: bsmith at mcs.anl.gov (Smith, Barry F.) Date: Fri, 30 Nov 2018 23:17:12 +0000 Subject: [petsc-users] Error in vec/vec/examples/tutorials/ex19.c In-Reply-To: References: Message-ID: > On Nov 30, 2018, at 1:30 PM, Sajid Ali via petsc-users wrote: > > Hi, > > I tried running ex19.c and I get the following error (I added a small snippet to print the local size on rank0 as well): > > [sajid at xrm temp]$ mpirun -np 2 ./ex19 > local_sizes : 3 4 > [1]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [1]PETSC ERROR: Arguments are incompatible > [1]PETSC ERROR: Incompatible vector local lengths parameter # 1 local size 3 != parameter # 2 local size 2 > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. > [1]PETSC ERROR: Petsc Release Version 3.10.2, unknown > [1]PETSC ERROR: ./ex19 on a named xrm by sajid Fri Nov 30 13:28:36 2018 > [1]PETSC ERROR: [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [0]PETSC ERROR: Arguments are incompatible > [0]PETSC ERROR: Incompatible vector local lengths parameter # 1 local size 3 != parameter # 2 local size 4 > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. > [0]PETSC ERROR: Petsc Release Version 3.10.2, unknown > [0]PETSC ERROR: ./ex19 on a named xrm by sajid Fri Nov 30 13:28:36 2018 > > Does this mean that PETSC_DECIDE is inconsistent between two calls in the same file? No. The reason for the error is a bug in the code.
The x1 vector has a block size of 1, hence when it is divided among two processes it gets 3 entries on each process. The x2 vector has a block size of 2, so when it is divided among two processes it gets 4 entries on the first process and 2 on the second. The call to VecCopy() does not work because the two vectors have different layouts. I have attached a fixed version of the code (that does not copy from x1 to x2). Barry > > Thank You, > Sajid Ali > Applied Physics > Northwestern University -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: ex19.c Type: application/octet-stream Size: 5225 bytes Desc: ex19.c URL: From sajidsyed2021 at u.northwestern.edu Fri Nov 30 17:30:24 2018 From: sajidsyed2021 at u.northwestern.edu (Sajid Ali) Date: Fri, 30 Nov 2018 17:30:24 -0600 Subject: [petsc-users] Load dense matrices from hdf5 In-Reply-To: References: <55E8F1B5-8FB9-4889-A2D8-EE2F65E3B56B@anl.gov> Message-ID: I use spack variants which do the same thing behind the scenes. Thanks! On Fri, Nov 30, 2018 at 5:10 PM Smith, Barry F. wrote: > > I assume you ./configure PETSc with --with-scalar-type=complex? If > so, the values in the file are the real and imaginary parts interlaced. > That is, > r0 i0 r1 i1 ..., where r0 is the first matrix entry's real part and i0 > is the first entry's imaginary part. > > Barry > > > > On Nov 30, 2018, at 5:04 PM, Sajid Ali > wrote: > > > > If the matrix is filled with complex numbers, is each complex number > stored as a sequence of doubles? Or is it better to split the matrix into > real/imaginary parts and store each part separately? > > From sajidsyed2021 at u.northwestern.edu Fri Nov 30 17:35:38 2018 From: sajidsyed2021 at u.northwestern.edu (Sajid Ali) Date: Fri, 30 Nov 2018 17:35:38 -0600 Subject: [petsc-users] Error in vec/vec/examples/tutorials/ex19.c In-Reply-To: References: Message-ID: I thought I fixed it by using VecDuplicate instead of setting up x2 & copying, but I got more confused as to why the VecWrite had different dimensions in each case. Does blocking take a Nx1 vector and convert it to a N/BLOCK_SIZE x BLOCK_SIZE vector (that's how it looked in the output hdf5 files)? Since I didn't find TimeStep to be a feature of HDF5, am I right in assuming that it's a PETSc feature that creates an extra dimension to store multiple vectors (with the new dimension being indexed by PetscViewerHDF5SetTimeStep)? On Fri, Nov 30, 2018 at 5:17 PM Smith, Barry F. wrote: > > > > On Nov 30, 2018, at 1:30 PM, Sajid Ali via petsc-users < > petsc-users at mcs.anl.gov> wrote: > > > > Hi, > > > > I tried running ex19.c and I get the following error (I added a small > snippet to print the local size on rank0 as well): > > > > [sajid at xrm temp]$ mpirun -np 2 ./ex19 > > local_sizes : 3 4 > > [1]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > > [1]PETSC ERROR: Arguments are incompatible > > [1]PETSC ERROR: Incompatible vector local lengths parameter # 1 local > size 3 != parameter # 2 local size 2 > > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html > for trouble shooting.
> > [1]PETSC ERROR: Petsc Release Version 3.10.2, unknown > > [1]PETSC ERROR: ./ex19 on a named xrm by sajid Fri Nov 30 13:28:36 2018 > > [1]PETSC ERROR: [0]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > > [0]PETSC ERROR: Arguments are incompatible > > [0]PETSC ERROR: Incompatible vector local lengths parameter # 1 local > size 3 != parameter # 2 local size 4 > > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html > for trouble shooting. > > [0]PETSC ERROR: Petsc Release Version 3.10.2, unknown > > [0]PETSC ERROR: ./ex19 on a named xrm by sajid Fri Nov 30 13:28:36 2018 > > > > Does this mean that PETSC_DECIDE is inconsistent between two calls in > the same file? > > No. The reason for the error is a bug in the code. The x1 vector has a > block size of 1, hence when it is divided among two processes it gets 3 > entries on each process. The x2 vector has a block size of 2, so when it is > divided among two processes it gets 4 entries on the first process and 2 on > the second. The call to VecCopy() does not work because the two vectors > have different layouts. > > I have attached a fixed version of the code (that does not copy from x1 > to x2). > > Barry > > > > > > Thank You, > > Sajid Ali > > Applied Physics > > Northwestern University > > -- Sajid Ali Applied Physics Northwestern University -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Fri Nov 30 17:40:17 2018 From: bsmith at mcs.anl.gov (Smith, Barry F.) Date: Fri, 30 Nov 2018 23:40:17 +0000 Subject: [petsc-users] Error in vec/vec/examples/tutorials/ex19.c In-Reply-To: References: Message-ID: <9194331C-00CE-43C4-B01B-3BAFBE9CA54E@mcs.anl.gov> > On Nov 30, 2018, at 5:35 PM, Sajid Ali wrote: > > I thought I fixed it by using VecDuplicate instead of setting up x2 & copying, but I got more confused as to why the VecWrite had different dimensions in each case. > > Does blocking take a Nx1 vector and convert it to a N/BLOCK_SIZE x BLOCK_SIZE vector (that's how it looked in the output hdf5 files)? Sort of (and yes, in the hdf5 file it is stored this way). In memory, the block size just requires that each process own a whole number of blocks (that is, part of a block cannot be on one process and the rest on the next one). Thus splitting the size-6 vector across two processes gives different layouts for different block sizes: with block size 1 the six entries split 3 + 3, while with block size 2 the three blocks split as 2 + 1 blocks, i.e. 4 + 2 entries. > > Since I didn't find TimeStep to be a feature of HDF5, am I right in assuming that it's a PETSc feature that creates an extra dimension to store multiple vectors (with the new dimension being indexed by PetscViewerHDF5SetTimeStep)? Yes
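If it helps, the usual write loop is something like this (an untested sketch; it assumes x was given a name with PetscObjectSetName(), so the repeated VecView() calls extend the same dataset, and that nsteps and x are defined elsewhere):

  ierr = PetscViewerHDF5Open(PETSC_COMM_WORLD, "out.h5", FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
  for (i = 0; i < nsteps; ++i) {
    ierr = PetscViewerHDF5SetTimestep(viewer, i);CHKERRQ(ierr);
    /* ... update x ... */
    ierr = VecView(x, viewer);CHKERRQ(ierr); /* writes one slice along the extra (time) dimension */
  }
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);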
> > On Fri, Nov 30, 2018 at 5:17 PM Smith, Barry F. wrote: > > > > On Nov 30, 2018, at 1:30 PM, Sajid Ali via petsc-users wrote: > > Hi, > > I tried running ex19.c and I get the following error (I added a small snippet to print the local size on rank0 as well): > > [sajid at xrm temp]$ mpirun -np 2 ./ex19 > > local_sizes : 3 4 > > [1]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > > [1]PETSC ERROR: Arguments are incompatible > > [1]PETSC ERROR: Incompatible vector local lengths parameter # 1 local size 3 != parameter # 2 local size 2 > > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. > > [1]PETSC ERROR: Petsc Release Version 3.10.2, unknown > > [1]PETSC ERROR: ./ex19 on a named xrm by sajid Fri Nov 30 13:28:36 2018 > > [1]PETSC ERROR: [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > > [0]PETSC ERROR: Arguments are incompatible > > [0]PETSC ERROR: Incompatible vector local lengths parameter # 1 local size 3 != parameter # 2 local size 4 > > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. > > [0]PETSC ERROR: Petsc Release Version 3.10.2, unknown > > [0]PETSC ERROR: ./ex19 on a named xrm by sajid Fri Nov 30 13:28:36 2018 > > > > Does this mean that PETSC_DECIDE is inconsistent between two calls in the same file? > > No. The reason for the error is a bug in the code. The x1 vector has a block size of 1, hence when it is divided among two processes it gets 3 entries on each process. The x2 vector has a block size of 2, so when it is divided among two processes it gets 4 entries on the first process and 2 on the second. The call to VecCopy() does not work because the two vectors have different layouts. > > I have attached a fixed version of the code (that does not copy from x1 to x2). > > Barry > > > > > > Thank You, > > Sajid Ali > > Applied Physics > > Northwestern University From bsmith at mcs.anl.gov Fri Nov 30 18:25:58 2018 From: bsmith at mcs.anl.gov (Smith, Barry F.) Date: Sat, 1 Dec 2018 00:25:58 +0000 Subject: [petsc-users] create a block matrix from existing petsc matrices and/or vectors In-Reply-To: <1543591130109.22345@supmeca.fr> References: <1543591130109.22345@supmeca.fr> Message-ID: <50863E49-7A68-48AF-AA21-F4B03F3D921C@anl.gov> There really isn't any way in PETSc to generate the new matrix from the old matrix easily/trivially. You can create a new matrix with one additional column and one extra row (on the final process) and have each process use MatGetRow()/MatSetValues() to move the values to the new matrix for its local rows. Barry > On Nov 30, 2018, at 9:18 AM, NENNIG Benoit via petsc-users wrote: > > Dear petsc users, > > I have a parallel matrix A (mpiaij) and I would like to create a matrix B like > B = [A v; > wT 0] > where wT and v are vectors. > A is involved in an eigenvalue computation (slepc4py) and B will be used by a direct solver. > Currently, the matrix A is built from a sparse scipy matrix thanks to A.createAIJ, but I would like to avoid creating B from scipy. I am looking for a more _petscic_ way to solve this problem. > > I have read that it seems not possible to change a matrix structure after its assembly. However, > > - Is there a way to build this kind of block matrix B directly in petsc, reusing A? > - Is there a way to convert the petsc matrix A into csr form and to create B as a new petsc matrix? > - Is there a way to copy a matrix into another one with a different structure? > - ... > > My second question is similar: suppose I have three matrices A1, A2, C resulting from coupling 2 FEM problems, what is a good way to create the coupled global system > B = [A1 C; > CT A2] > to use it with a direct solver. > > Thanks a lot, > > Benoit
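For reference, an untested sketch of the MatGetRow()/MatSetValues() copy loop suggested above (it assumes B was already created with global size (M+1) x (N+1), laid out so the extra row lives on the last process, and properly preallocated; inserting the extra column v, the extra row wT, and the final 0 entry is handled separately):

  PetscInt rstart, rend, row;

  ierr = MatGetOwnershipRange(A, &rstart, &rend);CHKERRQ(ierr);
  for (row = rstart; row < rend; ++row) {
    const PetscInt    *cols;
    const PetscScalar *vals;
    PetscInt           ncols;

    ierr = MatGetRow(A, row, &ncols, &cols, &vals);CHKERRQ(ierr);
    ierr = MatSetValues(B, 1, &row, ncols, cols, vals, INSERT_VALUES);CHKERRQ(ierr);
    ierr = MatRestoreRow(A, row, &ncols, &cols, &vals);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(B, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(B, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

Because A and B share the same row distribution for the first M rows under that layout, each process only touches its own local rows.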