Your FormFunction does not do any communication of x values, so the function evaluation can only use local values (the same holds for the Jacobian). These functions therefore cannot be correct.

If you look at the DMDA version you will see DMGlobalToLocal(), which manages the communication inside FormFunction. If you do not want to use DMDA, then you need to manage the needed communication yourself. This means figuring out what needs to be communicated (which depends on the type of grid and discretization you want to use) and setting up the communication; this can be done with VecScatterCreate().
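For the 1D three-point stencil of ex2.c, a rough, untested sketch of that setup could look like the following (the helper name and where you store the results are only suggestions; error checking omitted):

    #include <petscvec.h>

    /* One-time setup (e.g., in main, with xlocal and scatter stored in
       your application context): build a scatter that gathers this
       process's own part of x plus one ghost value from each neighbor
       into a sequential work vector.  N is the global problem size. */
    static void CreateGhostScatter(Vec x, PetscInt N, Vec *xlocal, VecScatter *scatter)
    {
      PetscInt rstart, rend, lo, hi, n, i, *idx;
      IS       isfrom, isto;

      VecGetOwnershipRange(x, &rstart, &rend);
      lo = (rstart > 0) ? rstart - 1 : 0; /* ghost point to the left  */
      hi = (rend < N) ? rend + 1 : N;     /* ghost point to the right */
      n  = hi - lo;

      PetscMalloc1(n, &idx);
      for (i = 0; i < n; i++) idx[i] = lo + i; /* global indices needed locally */

      VecCreateSeq(PETSC_COMM_SELF, n, xlocal);
      ISCreateGeneral(PETSC_COMM_SELF, n, idx, PETSC_OWN_POINTER, &isfrom);
      ISCreateStride(PETSC_COMM_SELF, n, 0, 1, &isto);
      VecScatterCreate(x, isfrom, *xlocal, isto, scatter);
      ISDestroy(&isfrom);
      ISDestroy(&isto);
    }

Then, inside FormFunction, update the work vector before reading any x values:

    /* scatter, xlocal, and lo come from your application context */
    const PetscScalar *xx;
    VecScatterBegin(scatter, x, xlocal, INSERT_VALUES, SCATTER_FORWARD);
    VecScatterEnd(scatter, x, xlocal, INSERT_VALUES, SCATTER_FORWARD);
    VecGetArrayRead(xlocal, &xx);  /* xx[j - lo] is global entry j */
    /* ... evaluate the locally owned part of F using xx ... */
    VecRestoreArrayRead(xlocal, &xx);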
  Barry

> On Apr 8, 2021, at 8:08 AM, Suwun Suwunnarat <ssuwunnarat@wesleyan.edu> wrote:
>
> To whom it may concern,
>
> I would like to modify the code in SNES's ex2.c to run in parallel:
>
> https://www.mcs.anl.gov/petsc/petsc-3.6.4/src/snes/examples/tutorials/ex2.c.html
>
> I modified my ex2.c as attached here. Even though ex3.c is the parallel version, I would like my implementation not to use DMDA.
>
> (Although that link is for version 3.6.4, the example also works with version 3.15.0, which is what I use for debugging.)
>
> The crucial parts of ex2.c that I modified are (apart from deleting comments, the exact-solution verification, etc.):
>
> - matrix preallocation (lines 63-64)
> - using MatGetOwnershipRange() on the PETSc matrix jac (in FormJacobian)
> - using VecGetOwnershipRange() on every PETSc vector in the program
>
> I run the program with mpiexec. For n = 1 it works fine with -pc_type lu -snes_type newtonls. However, when I increase n to 2, the code does not iterate further. (It does not enter FormJacobian after the first iteration.) I have tried several different pc_type and snes_type settings, but still do not get the result obtained with n = 1.
>
> I have also attached the terminal printout.
>
> The manual (3.15.0) says (p. 81, section 2.4, first paragraph):
>
> "Also, the SNES interface is identical for the uniprocess and parallel cases; the only difference in the parallel version is that each process typically forms only its local contribution to various matrices and vectors"
>
> Therefore, I think that by modifying only the vectors and matrices, I should be able to get parallelism.
>
> Regards,
> Suwun
>
> PS1: I compile with mpicc and use the mpiexec found in ${PETSC_DIR}/arch-mumps-opt/bin/ to run the code.
> PS2: I attached the install.sh (I install with MUMPS) that I use to install PETSc. Note that I also turn on --with-debugging=yes.
> PS3: My OS is Ubuntu 20.04.2 on an Intel CPU. My environment has everything necessary (gfortran, gcc, g++, valgrind, cmake).
> <terminal_printout.html> <ex2.c> <install.sh>
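PS: The "local contribution" pattern that manual passage describes looks roughly like the sketch below (generic placeholder stencil values, not the entries of ex2.c's Jacobian). Note that values passed to MatSetValues() for rows owned by another process are communicated automatically during MatAssemblyBegin/End(); it is only the reading of off-process x entries in FormFunction/FormJacobian that you must arrange yourself, e.g. with the scatter above.

    PetscInt    rstart, rend, i, N;
    PetscScalar diag = 2.0, offdiag = -1.0; /* placeholder stencil values */

    MatGetSize(jac, &N, NULL);
    MatGetOwnershipRange(jac, &rstart, &rend);
    for (i = rstart; i < rend; i++) {       /* only the locally owned rows */
      PetscInt    ncols = 0, cols[3];
      PetscScalar vals[3];
      if (i > 0)     { cols[ncols] = i - 1; vals[ncols] = offdiag; ncols++; }
      cols[ncols] = i; vals[ncols] = diag; ncols++;
      if (i < N - 1) { cols[ncols] = i + 1; vals[ncols] = offdiag; ncols++; }
      MatSetValues(jac, 1, &i, ncols, cols, vals, INSERT_VALUES);
    }
    MatAssemblyBegin(jac, MAT_FINAL_ASSEMBLY);
    MatAssemblyEnd(jac, MAT_FINAL_ASSEMBLY);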