<div dir="ltr"><div dir="ltr">On Tue, May 31, 2022 at 12:08 PM Sami BEN ELHAJ SALAH <<a href="mailto:sami.ben-elhaj-salah@ensma.fr">sami.ben-elhaj-salah@ensma.fr</a>> wrote:<br></div><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div style="overflow-wrap: break-word;"><div><div>Hi Matthew,</div><div><br></div><div><u><b>Two questions in this mail:</b></u></div><div><u><b><br></b></u></div><div>I tested my code <font color="#0433ff"><b>with 2 processes</b></font>. I use a gmsh mesh (for example: 8 hexahedral elements, 27 nodes, and a global Jacobian matrix size of 81). Then I used DMView(dm_mesh, PETSC_VIEWER_STDOUT_WORLD) to visualize my DM and see whether it is correctly distributed over the processes, and I got this:</div><div><br></div><div><font color="#ff2600" size="4"><b>(***)</b></font></div><div><b>DM Object: DM_0x3_0 2 MPI processes</b></div><div><b> type: plex</b></div><div><b>DM_0x3_0 in 3 dimensions:</b></div><div><b> 0-cells: 27 0</b></div><div><b> 3-cells: 8 0</b></div></div></div></blockquote><div><br></div><div>Notice here that your whole mesh is on process 0. You would probably call DMPlexDistribute() to rebalance it.</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div style="overflow-wrap: break-word;"><div><div><b>Labels:</b></div><div><b> material-id: 3 strata with value/size (66 (4), 67 (2), 68 (2))</b></div><div><b> depth: 2 strata with value/size (0 (27), 1 (8))</b></div><div><b> celltype: 2 strata with value/size (7 (8), 0 (27))</b></div><div><br></div><div>As you can see, I created some material-id labels, and I find them on my 8 cells when using DMView, so that part seems good to me. 
The SNES viewer is shown below:</div><div><br></div><div>SNES Object: 2 MPI processes</div><div> type: ksponly</div><div> SNES has not been set up so information may be incomplete</div><div> maximum iterations=50, maximum function evaluations=10000</div><div> tolerances: relative=1e-08, absolute=1e-50, solution=1e-08</div><div> total number of linear solver iterations=0</div><div> total number of function evaluations=0</div><div> norm schedule ALWAYS</div><div>SNES Object: 2 MPI processes</div><div> type: ksponly</div><div> SNES has not been set up so information may be incomplete</div><div> maximum iterations=50, maximum function evaluations=10000</div><div> tolerances: relative=1e-08, absolute=1e-50, solution=1e-08</div><div> total number of linear solver iterations=0</div><div> total number of function evaluations=0</div><div> norm schedule ALWAYS</div><div> SNESLineSearch Object: 2 MPI processes</div><div> type: bt</div><div> interpolation: cubic</div><div> alpha=1.000000e-04</div><div> SNESLineSearch Object: 2 MPI processes</div><div> type: bt</div><div> interpolation: cubic</div><div> alpha=1.000000e-04</div><div> maxstep=1.000000e+08, minlambda=1.000000e-12</div><div> tolerances: relative=1.000000e-08, absolute=1.000000e-15, lambda=1.000000e-08</div><div> maximum iterations=40</div><div> maxstep=1.000000e+08, minlambda=1.000000e-12</div><div> tolerances: relative=1.000000e-08, absolute=1.000000e-15, lambda=1.000000e-08</div><div> maximum iterations=40</div><div> KSP Object: 2 MPI processes</div><div> type: gmres</div><div> restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement</div><div> KSP Object: 2 MPI processes</div><div> type: gmres</div><div> restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement</div><div> happy breakdown tolerance 1e-30</div><div> maximum iterations=10000, initial guess is zero</div><div> tolerances: relative=1e-05, absolute=1e-50, 
divergence=10000.</div><div> left preconditioning</div><div> using DEFAULT norm type for convergence test</div><div> PC Object: 2 MPI processes</div><div> happy breakdown tolerance 1e-30</div><div> maximum iterations=10000, initial guess is zero</div><div> tolerances: relative=1e-05, absolute=1e-50, divergence=10000.</div><div> left preconditioning</div><div> using DEFAULT norm type for convergence test</div><div> PC Object: 2 MPI processes</div><div> type: bjacobi</div><div> PC has not been set up so information may be incomplete</div><div> number of blocks = -1</div><div> Local solve is same for all blocks, in the following KSP and PC objects:</div><div> linear system matrix = precond matrix:</div><div> type: bjacobi</div><div> PC has not been set up so information may be incomplete</div><div> number of blocks = -1</div><div> Local solve is same for all blocks, in the following KSP and PC objects:</div><div> linear system matrix = precond matrix:</div><div> Mat Object: 2 MPI processes</div><div> type: mpiaij</div><div> Mat Object: 2 MPI processes</div><div> type: mpiaij</div><div> rows=81, cols=81, bs=3</div><div> rows=81, cols=81, bs=3</div><div> total: nonzeros=3087, allocated nonzeros=3087</div><div> total number of mallocs used during MatSetValues calls=0</div><div> total: nonzeros=3087, allocated nonzeros=3087</div><div> total number of mallocs used during MatSetValues calls=0</div><div> using I-node (on process 0) routines: found 27 nodes, limit used is 5</div><div> not using I-node (on process 0) routines</div><div><br></div><div><b><font color="#ff2600">Question 1:</font></b></div><div><font color="#0433ff"><b>B</b><span><b>ased on the result given by DMView (see </b></span></font><b style="color:rgb(255,38,0)">(***)</b><font color="#0433ff"><b> </b></font><b style="color:rgb(4,51,255)">), </b><b style="color:rgb(4,51,255)">I don't understand whether my mesh is correctly distributed,</b><b style="color:rgb(4,51,255)"> or whether my code is missing something, 
because when I visualize the local and global sizes of my Jacobian matrix, I found</b></div><div><br></div><div><b>PETSc::NonLinearSolver::INIT Size from Jac Matrix: M=81 m =0 //(M: global size, m: local size) this result is given by the proc 1</b></div><div><b>and </b></div><div><b>PETSc::NonLinearSolver::INIT Size from Jac Matrix: M=81 m =81 // </b><b>this result is given by the proc 2</b></div></div></div></blockquote><div><br></div><div>Yes, your mesh is only on process 0.</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div style="overflow-wrap: break-word;"><div><div>Let me give some other information:</div><div>I create my Jacobian matrix using:</div><div>PetscErrorCode err = DMCreateMatrix(dm_mesh, &_matrix);</div><div><b>and I use the PetscSection to tell the DM to use this data layout.</b></div><div><br></div><div>In my code I wrote this line: </div><div>DMSetFromOptions(dm_mesh);</div><div>Then, to run my code I use</div><div>mpirun -np 2 /home/benelhasa/fox_petsc/build_test/bin/Debug/FoXtroT -snes_test_jacobian_view -snes_converged_reason -snes_monitor -ksp_monitor -ksp_xmonitor -dm_view <b><font color="#ff2600">-petscpartitioner_type parmetis -dm_distribute</font></b> -dm_refine 0 cub_8C3D8.fxt</div></div></div></blockquote><div><br></div><div>Something else is wrong in how this code is creating the mesh, because it is not distributing the mesh. 
Can you just use</div><div><br></div><div> PetscCall(DMCreate(comm, dm));<br> PetscCall(DMSetType(*dm, DMPLEX));<br> PetscCall(DMSetFromOptions(*dm));<br></div><div><br></div><div>as we do in the examples and then</div><div><br></div><div> -dm_plex_filename &lt;your gmsh file&gt;</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div style="overflow-wrap: break-word;"><div><div><b><font color="#ff2600">Question 2:</font></b></div><div><b><font color="#0433ff">I have another question, just for my comprehension. I understand that DMSetFromOptions(dm_mesh) allows me to use the parameter </font><font color="#ff2600">dm_distribute</font><font color="#0433ff">, but I didn't understand how I can use the -petscpartitioner_type parmetis argument (see this example that I used to help me: </font></b><span><b><a href="https://gitlab.com/petsc/petsc/-/blob/main/src/dm/impls/plex/tutorials/ex14.c" target="_blank">https://gitlab.com/petsc/petsc/-/blob/main/src/dm/impls/plex/tutorials/ex14.c</a></b></span><b><font color="#0433ff">). Maybe when the DM uses a data layout, either by PetscSection or PetscDS, I can then automatically use -petscpartitioner_type parmetis/simple/scotch? (Can you tell me more about this topic, please?) </font></b></div></div></div></blockquote><div><br></div><div>I do not yet understand the question. 
The partitioner only applies to the topology, not to functions on the mesh.</div><div><br></div><div> Thanks,</div><div><br></div><div> Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div style="overflow-wrap: break-word;"><div><div dir="auto" style="color:rgb(0,0,0);letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none"><div dir="auto" style="color:rgb(0,0,0);letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none"><div dir="auto" style="color:rgb(0,0,0);letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none"><div dir="auto" style="color:rgb(0,0,0);letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none">PS: My solution converges when I use 1 process, and I don’t have a problem.</div><div dir="auto" style="color:rgb(0,0,0);letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none"><br></div><div dir="auto" style="color:rgb(0,0,0);letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none">--</div><div dir="auto" style="color:rgb(0,0,0);letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none">Dr. 
Sami BEN ELHAJ SALAH<br>Ingénieur de Recherche (CNRS)<br>Institut Pprime - ISAE - ENSMA<br>Mobile: 06.62.51.26.74<br><a href="mailto:sami.ben-elhaj-salah@ensma.fr" target="_blank">Email: sami.ben-elhaj-salah@ensma.fr</a></div><div dir="auto" style="color:rgb(0,0,0);letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none"><a href="https://samiben91.github.io/samibenelhajsalah/index.html" target="_blank">www.samibenelhajsalah.com</a><br><br><br></div></div></div></div>
</div>
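For completeness, Matt's suggested creation sequence, expanded into a minimal self-contained sketch (a sketch, assuming a working PETSc installation; `cube.msh` in the comment below is a placeholder for your Gmsh file, not a file from this thread):

```c
#include <petscdmplex.h>

/* Build the DM entirely from the options database, e.g.
 *   mpirun -np 2 ./mesh_test -dm_plex_filename cube.msh \
 *     -dm_distribute -petscpartitioner_type parmetis -dm_view
 */
int main(int argc, char **argv)
{
  DM dm;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCall(DMCreate(PETSC_COMM_WORLD, &dm));
  PetscCall(DMSetType(dm, DMPLEX));
  PetscCall(DMSetFromOptions(dm));                    /* reads, distributes, refines per options */
  PetscCall(DMViewFromOptions(dm, NULL, "-dm_view")); /* check cell counts per rank */
  PetscCall(DMDestroy(&dm));
  PetscCall(PetscFinalize());
  return 0;
}
```

With `-dm_view`, both ranks should then report a nonzero number of 3-cells instead of an 8/0 split.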
<div><br><blockquote type="cite"><div>On 29 May 2022 at 18:02, Sami BEN ELHAJ SALAH <<a href="mailto:sami.ben-elhaj-salah@ensma.fr" target="_blank">sami.ben-elhaj-salah@ensma.fr</a>> wrote:</div><br><div><div style="overflow-wrap: break-word;"><div><div>Hi Matthew,</div><div>Thank you for this example. It seems to be exactly what I am looking for.</div><div>Thank you again for your help and have a good day.</div><div>Sami,</div><div>
<div dir="auto" style="letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none"><div dir="auto" style="letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none"><div dir="auto" style="letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none"><div dir="auto" style="letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none">--</div><div dir="auto" style="letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none">Dr. Sami BEN ELHAJ SALAH<br>Ingénieur de Recherche (CNRS)<br>Institut Pprime - ISAE - ENSMA<br>Mobile: 06.62.51.26.74<br><a href="mailto:sami.ben-elhaj-salah@ensma.fr" target="_blank">Email: sami.ben-elhaj-salah@ensma.fr</a></div><div dir="auto" style="letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none"><a href="https://samiben91.github.io/samibenelhajsalah/index.html" target="_blank">www.samibenelhajsalah.com</a><br><br><br></div></div></div></div>
</div>
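The ex14.c tutorial mentioned in this thread builds its data layout by hand. Condensed into a hedged sketch (the helper name and the vertices-only layout are illustrative, e.g. 3 displacement dofs per vertex for trilinear hexahedra; this is not the tutorial's exact code):

```c
#include <petscdmplex.h>

/* Illustrative helper (not PETSc API): attach a PetscSection that puts
 * "dof" unknowns on every vertex, so that DMCreateMatrix() can compute
 * the local/global sizes and the sparsity pattern of the Jacobian. */
static PetscErrorCode AttachVertexSection(DM dm, PetscInt dof)
{
  PetscSection s;
  PetscInt     pStart, pEnd, vStart, vEnd, v;

  PetscFunctionBeginUser;
  PetscCall(PetscSectionCreate(PetscObjectComm((PetscObject)dm), &s));
  PetscCall(DMPlexGetChart(dm, &pStart, &pEnd));
  PetscCall(PetscSectionSetChart(s, pStart, pEnd));
  PetscCall(DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd)); /* depth 0 = vertices */
  for (v = vStart; v < vEnd; ++v) PetscCall(PetscSectionSetDof(s, v, dof));
  PetscCall(PetscSectionSetUp(s));
  PetscCall(DMSetLocalSection(dm, s));
  PetscCall(PetscSectionDestroy(&s));
  PetscFunctionReturn(0);
}
```

After this, DMCreateMatrix() on a properly distributed DM should give a global size of 81 for the 27-vertex mesh with dof = 3, with nonzero local sizes on both ranks.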
<div><br><blockquote type="cite"><div>On 28 May 2022 at 20:20, Matthew Knepley <<a href="mailto:knepley@gmail.com" target="_blank">knepley@gmail.com</a>> wrote:</div><br><div><div dir="ltr" style="font-family:Helvetica;font-size:12px;font-style:normal;font-variant-caps:normal;font-weight:400;letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none"><div dir="ltr">On Sat, May 28, 2022 at 2:19 PM Matthew Knepley <<a href="mailto:knepley@gmail.com" target="_blank">knepley@gmail.com</a>> wrote:<br></div><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div dir="ltr">On Sat, May 28, 2022 at 1:35 PM Sami BEN ELHAJ SALAH <<a href="mailto:sami.ben-elhaj-salah@ensma.fr" target="_blank">sami.ben-elhaj-salah@ensma.fr</a>> wrote:<br></div><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div><div>Hi Matthew,</div><div><br></div><div>Thank you for your response.</div><div><br></div><div>I don't have that. My DM object is not linked to a PetscSection yet. I'll try that. </div><div>Is there an example that handles this case? 
(DMPlexDistribute & PetscSection), or any guideline would be helpful.</div><div></div></div></blockquote><div><br></div><div>Here is an example where we create a section without DS.</div></div></div></blockquote><div><br></div><div>Forgot the link: <a href="https://gitlab.com/petsc/petsc/-/blob/main/src/dm/impls/plex/tutorials/ex14.c" target="_blank">https://gitlab.com/petsc/petsc/-/blob/main/src/dm/impls/plex/tutorials/ex14.c</a></div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div class="gmail_quote"><div> <span> </span>Thanks,</div><div><br></div><div> <span> </span>Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div><div>Thanks in advance,</div><div><br></div><div>Sami,</div><div><br></div><div><div dir="auto" style="letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none"><div dir="auto" style="letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none"><div dir="auto" style="letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none"><div dir="auto" style="letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none">--</div><div dir="auto" style="letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none">Dr. 
Sami BEN ELHAJ SALAH<br>Ingénieur de Recherche (CNRS)<br>Institut Pprime - ISAE - ENSMA<br>Mobile: 06.62.51.26.74<br><a href="mailto:sami.ben-elhaj-salah@ensma.fr" target="_blank">Email: sami.ben-elhaj-salah@ensma.fr</a></div><div dir="auto" style="letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none"><a href="https://samiben91.github.io/samibenelhajsalah/index.html" target="_blank">www.samibenelhajsalah.com</a><br><br><br></div></div></div></div></div><div><br><blockquote type="cite"><div>On 27 May 2022 at 20:45, Matthew Knepley <<a href="mailto:knepley@gmail.com" target="_blank">knepley@gmail.com</a>> wrote:</div><br><div><div dir="ltr"><div dir="ltr">On Fri, May 27, 2022 at 9:42 AM Sami BEN ELHAJ SALAH <<a href="mailto:sami.ben-elhaj-salah@ensma.fr" target="_blank">sami.ben-elhaj-salah@ensma.fr</a>> wrote:<br></div><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div><div><div>Hello Isaac,</div><div><br></div><div>Thank you for your reply!</div><div><br></div><div>Let me confirm that when I use DMCreateMatrix() with the orig_dm, I get my jacobian_matrix. Also, I have succeeded in solving my system, and my solution converged<b><span> </span>using one process</b>.</div><div>Let me give you some other information about my code. Currently, I am using my own discretization system and not the PetscDS object. 
Considering the nonlinear solver SNES, I use the following routines (like the basic snes usage):</div><div>- SNESSetFunction(snes,residual_vector,compute_residual ,(void*) _finite_element_formulation)</div><div>- SNESSetJacobian(snes, jacobian.matrix(), _jacobian_precond_matrix, compute_jacobian, (void*) _finite_element_formulation) </div><div>- SNESSolve(snes,NULL,x)</div><div><br></div><div>Regarding your last answer, I will try to reformulate my question as follows:</div><div>Using a distributed dm instead of the original dm (the non-distributed dm) and my own discretization system (not the PetscDS object),</div></div></div></blockquote><div><br></div><div>You do not have to use PetscDS, but the DM does need a PetscSection in order to compute sizes and sparsity patterns. Do you have that?</div><div><br></div><div> <span> </span>Thanks,</div><div><br></div><div> Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div><div><div>should I add something specific to get a distributed jacobian_matrix over processes?</div><div>I should point out that I just replaced the orig_dm by a distributed mesh with the routine that I wrote in my first mail. 
So is that enough?</div><div> </div><div>Thank you and have a good day,</div><div>Sami,</div></div><div><br></div><br><div><div dir="auto" style="letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none"><div dir="auto" style="letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none"><div dir="auto" style="letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none"><div dir="auto" style="letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none">--</div><div dir="auto" style="letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none">Dr. Sami BEN ELHAJ SALAH<br>Ingénieur de Recherche (CNRS)<br>Institut Pprime - ISAE - ENSMA<br>Mobile: 06.62.51.26.74<br><a href="mailto:sami.ben-elhaj-salah@ensma.fr" target="_blank">Email: sami.ben-elhaj-salah@ensma.fr</a></div><div dir="auto" style="letter-spacing:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;text-decoration:none"><a href="https://samiben91.github.io/samibenelhajsalah/index.html" target="_blank">www.samibenelhajsalah.com</a><br><br><br></div></div></div></div></div><div><br><blockquote type="cite"><div>On 25 May 2022 at 19:41, Toby Isaac <<a href="mailto:toby.isaac@gmail.com" target="_blank">toby.isaac@gmail.com</a>> wrote:</div><br><div><div>Hi Sami,<br><br>Just to verify: if you call DMCreateMatrix() on orig_dm, do you get a<br>Jacobian matrix?<br><br>The DMPlex must be told what kind of discretized fields you want a<br>matrix for and what equations you are discretizing. This is handled<br>by the PetscDS object. 
In snes/tutorials/ex59.c, see the code after<br>DMGetDS() for an example.<br><br>- Toby<br><br>On Wed, May 25, 2022 at 1:17 PM Sami BEN ELHAJ SALAH<br><<a href="mailto:sami.ben-elhaj-salah@ensma.fr" target="_blank">sami.ben-elhaj-salah@ensma.fr</a>> wrote:<br><blockquote type="cite"><br>Dear PETSc developer team,<br><br>I am trying to create a jacobian_matrix from a distributed DM. I have followed the two examples (snes/tests/ex2.c and ex56.c). So I wrote this routine:<br><br>DM orig_dm;<br>DM dist_dm = NULL;<br>PetscPartitioner part;<br>DMPlexGetPartitioner(orig_dm, &part);<br>PetscPartitionerSetType(part, PETSCPARTITIONERPARMETIS);<br>DMPlexDistribute(orig_dm, 0, NULL, &dist_dm);<br><br>PetscErrorCode err = DMCreateMatrix(dist_dm, &jacobian_matrix);<br>PetscInt M, N, m, n;<br>MatGetSize(jacobian_matrix, &M, &N);<br>MatGetLocalSize(jacobian_matrix, &m, &n);<br><br>Then I ran my code with 2 processes and I obtained this result:<br>Size from jacobian_matrix: M=0 m =0 (this result is the same in all processes).<br><br>I don't understand whether I forgot something in my code to obtain the correct values for the local and global sizes of the jacobian_matrix. (I mean, is my code missing something needed to obtain a distributed mesh over processes?)<br><br>Thank you in advance for any help!<br>Sami,<br><br><br>--<br>Dr. 
Sami BEN ELHAJ SALAH<br>Ingénieur de Recherche (CNRS)<br>Institut Pprime - ISAE - ENSMA<br>Mobile: 06.62.51.26.74<br><a href="mailto:sami.ben-elhaj-salah@ensma.fr" target="_blank">Email: sami.ben-elhaj-salah@ensma.fr</a><br><a href="http://www.samibenelhajsalah.com/" target="_blank">www.samibenelhajsalah.com</a><br><br><br><br></blockquote></div></div></blockquote></div><br></div></blockquote></div><br clear="all"><div><br></div>--<span> </span><br><div dir="ltr"><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener</div><div><br></div><div><a href="http://www.cse.buffalo.edu/~knepley/" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><br></div></div></div></div></div></div></div></div></div></blockquote></div><br></div></blockquote></div><br clear="all"><div><br></div>--<span> </span><br><div dir="ltr"><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener</div><div><br></div><div><a href="http://www.cse.buffalo.edu/~knepley/" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><br></div></div></div></div></div></div></div></div></blockquote></div><br clear="all"><div><br></div>--<span> </span><br><div dir="ltr"><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener</div><div><br></div><div><a href="http://www.cse.buffalo.edu/~knepley/" 
target="_blank">https://www.cse.buffalo.edu/~knepley/</a></div></div></div></div></div></div></div></div></div></blockquote></div><br></div></div></div></blockquote></div><br></div></blockquote></div><br clear="all"><div><br></div>-- <br><div dir="ltr" class="gmail_signature"><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener</div><div><br></div><div><a href="http://www.cse.buffalo.edu/~knepley/" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><br></div></div></div></div></div></div></div></div>
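Pulling the pieces of this thread together, a hedged sketch of the explicit (non-options) distribution path from the first mail, including the step that is easy to miss: replacing the serial DM with the distributed one before creating the matrix. This assumes a PETSc build configured with ParMETIS; the PetscSection setup is deliberately elided.

```c
#include <petscdmplex.h>

int main(int argc, char **argv)
{
  DM               dm, dmDist = NULL;
  PetscPartitioner part;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCall(DMCreate(PETSC_COMM_WORLD, &dm));
  PetscCall(DMSetType(dm, DMPLEX));
  PetscCall(DMSetFromOptions(dm)); /* e.g. -dm_plex_filename with your Gmsh file */

  /* Explicit distribution with a programmatically chosen partitioner */
  PetscCall(DMPlexGetPartitioner(dm, &part));
  PetscCall(PetscPartitionerSetType(part, PETSCPARTITIONERPARMETIS));
  PetscCall(DMPlexDistribute(dm, 0, NULL, &dmDist));
  if (dmDist) { /* dmDist is NULL when nothing was distributed (e.g. one rank) */
    PetscCall(DMDestroy(&dm));
    dm = dmDist;
  }
  PetscCall(DMViewFromOptions(dm, NULL, "-dm_view"));

  /* A PetscSection (or a PetscDS) must still be attached to this DM
   * before DMCreateMatrix() can report the correct local/global sizes. */
  PetscCall(DMDestroy(&dm));
  PetscCall(PetscFinalize());
  return 0;
}
```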