[petsc-users] Running SNES in parallel

Barry Smith bsmith at petsc.dev
Thu Apr 8 12:17:37 CDT 2021


   Your FormFunction does no communication of x values, so the function evaluation can only use locally owned values (the same holds for the Jacobian). These functions therefore cannot be correct.

    If you look at the DMDA version you will see DMGlobalToLocal(), which manages the communication inside FormFunction. If you do not want to use DMDA then you need to manage the needed communication yourself. This means figuring out what has to be communicated (which depends on the type of grid and discretization you want to use) and setting up that communication; this can be done with VecScatterCreate().
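
    For example, for a 1D mesh with stencil width 1 the setup could look roughly like the sketch below. This is only an illustration, not code from ex2.c or ex3.c; the AppCtx fields, the SetUpGhostScatter name, and the assumption of a contiguous row-wise partition are made up for the sketch.

typedef struct {
  VecScatter gtol;    /* scatter: global x -> local ghosted copy             */
  Vec        xlocal;  /* sequential Vec: owned entries plus 1 ghost per side */
} AppCtx;

PetscErrorCode SetUpGhostScatter(Vec x, PetscInt N, AppCtx *user)
{
  PetscErrorCode ierr;
  PetscInt       rstart, rend, ng, i, *idx;
  IS             from;

  ierr = VecGetOwnershipRange(x, &rstart, &rend);CHKERRQ(ierr);
  ng   = (rend - rstart) + (rstart > 0) + (rend < N);   /* owned + ghosts  */
  ierr = PetscMalloc1(ng, &idx);CHKERRQ(ierr);
  ng   = 0;
  if (rstart > 0) idx[ng++] = rstart - 1;               /* left ghost      */
  for (i = rstart; i < rend; i++) idx[ng++] = i;        /* owned entries   */
  if (rend < N) idx[ng++] = rend;                       /* right ghost     */

  ierr = VecCreateSeq(PETSC_COMM_SELF, ng, &user->xlocal);CHKERRQ(ierr);
  ierr = ISCreateGeneral(PETSC_COMM_SELF, ng, idx, PETSC_OWN_POINTER, &from);CHKERRQ(ierr);
  /* NULL target IS: fill xlocal in order 0 .. ng-1 */
  ierr = VecScatterCreate(x, from, user->xlocal, NULL, &user->gtol);CHKERRQ(ierr);
  ierr = ISDestroy(&from);CHKERRQ(ierr);
  return 0;
}

    Then, at the top of FormFunction, update the ghosted copy before reading neighbor values of x (with xx declared as const PetscScalar *):

  ierr = VecScatterBegin(user->gtol, x, user->xlocal, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
  ierr = VecScatterEnd(user->gtol, x, user->xlocal, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
  ierr = VecGetArrayRead(user->xlocal, &xx);CHKERRQ(ierr);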

  Barry


> On Apr 8, 2021, at 8:08 AM, Suwun Suwunnarat <ssuwunnarat at wesleyan.edu> wrote:
> 
> To whom it may concern,
> 
> I would like to modify the code in SNES's ex2.c to run in parallel.
> 
> https://www.mcs.anl.gov/petsc/petsc-3.6.4/src/snes/examples/tutorials/ex2.c.html
> 
> My modified ex2.c is attached. Even though ex3.c is the parallel version, I would like my implementation to not use DMDA.
> 
> (Although the link is to version 3.6.4, the code also works with version 3.15.0, which is what I use for debugging.)
> 
> The "crucial" parts of ex2.c that I modified are (apart from deletion of comments, exact solution verification, etc):
> 
> - (line 63-64) matrix preallocation
> - use MatGetOwnershipRange on a PETSc matrix jac (in FormJacobian)
> - use VecGetOwnershipRange on every PETSc vector in the program 
> 
> I run the program with mpiexec. For n = 1 it works fine with -pc_type lu -snes_type newtonls. However, when I increase n to 2, the code does not iterate further. (It does not enter FormJacobian after the first iteration.) I tried several different pc_type and snes_type options, but I still do not get the result obtained with n = 1.
> 
> I attached the terminal printout also.
> 
> The manual (3.15.0) says (p. 81, section 2.4, first paragraph):
> 
> "Also, the SNES interface is identical for the uniprocess and parallel cases; the only difference in the parallel version is that each process typically forms only its local contribution to various matrices and vectors"
> 
> Based on this, I thought that modifying only the vector and matrix parts should be enough to get parallelism.
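> 
> To make this concrete, the ownership-range pattern I mean looks roughly like the simplified sketch below (a generic 1D Laplacian stencil as a stand-in, not the actual ex2.c Jacobian; N, h, and the function name are placeholders):
> 
> PetscErrorCode FillLocalJacobianRows(Mat jac, PetscInt N, PetscReal h)
> {
>   PetscErrorCode ierr;
>   PetscInt       i, rstart, rend, cols[3];
>   PetscScalar    vals[3];
> 
>   ierr = MatGetOwnershipRange(jac, &rstart, &rend);CHKERRQ(ierr);
>   for (i = rstart; i < rend; i++) {            /* only rows this process owns */
>     if (i == 0 || i == N - 1) {                /* boundary rows: identity      */
>       vals[0] = 1.0;
>       ierr = MatSetValues(jac, 1, &i, 1, &i, vals, INSERT_VALUES);CHKERRQ(ierr);
>     } else {                                   /* interior rows: 3-point stencil */
>       cols[0] = i - 1; cols[1] = i; cols[2] = i + 1;
>       vals[0] = 1.0/(h*h); vals[1] = -2.0/(h*h); vals[2] = 1.0/(h*h);
>       ierr = MatSetValues(jac, 1, &i, 3, cols, vals, INSERT_VALUES);CHKERRQ(ierr);
>     }
>   }
>   ierr = MatAssemblyBegin(jac, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
>   ierr = MatAssemblyEnd(jac, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
>   return 0;
> }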
> 
> Regards,
> Suwun
> 
> PS1: I compile using mpicc and use mpiexec found in the directory ${PETSC_DIR}/arch-mumps-opt/bin/ to run the code.
> PS2: I attached the install.sh that I use to install PETSc (I install with MUMPS). Note that I also turn on --with-debugging=yes.
> PS3: My OS is Ubuntu 20.04.2 on an Intel CPU. My environment should have everything necessary (gfortran, gcc, g++, valgrind, cmake).
> <terminal_printout.html><ex2.c><install.sh>
