<div dir="ltr">Hi Vittorio,<div><br></div><div>PETSc does provide support for your application and some of us (eg, me and Matt) work with fusion PIC applications.</div><div><br></div><div>1) I am not sure how you handle boundary conditions with a Cartesian grid so let me give two responses:</div><div><br></div><div>1.1) With Cartesian grids, geometric multigrid may be usable and that can be fast and easier to use.</div><div>PETSc supports geometric and algebraic multigrid, including interfaces to third party libraries like hypre.</div><div>Hypre is an excellent solver, but you can probably use CG as your KSP method instead of GMRES.</div><div><br></div><div>1.2) PETSc provides support for unstructured mesh management and discretizations and you switch to an unstructured grid, but I understand we all have priorities.</div><div>Unstructured grids are probably a better long term solution for you.</div><div><br></div><div>2) PETSc is portable with linear algebra back-ends that execute on any "device".</div><div>Our OpenMP support is only through the Kokkos back-end and we have custom CUDA and HIP backends that are built on vendor libraries.</div><div>The Kokkos back-end also supports CUDA, HIP and SYCL and we rely on Kokkos any other architectures at this point.</div><div>BTW, the Kokkos back-end also has an option to use vendor back-ends or Kokkos Kernels for linear algebra and they are often better than the vendors libraries.</div><div><br></div><div>Hope this helps,</div><div>Mark</div><div><br></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Fri, Dec 15, 2023 at 12:41 AM Vittorio Sciortino <<a href="mailto:vittorio.sciortino@uniba.it">vittorio.sciortino@uniba.it</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><br>
Dear PETSc developers,<br>
<br>
My name is Vittorio Sciortino; I am a PhD student in Italy, and I am <br>
really curious about the applications and possibilities of your <br>
library. I would like to ask you two questions about PETSc.<br>
<br>
My study case consists of the development of a 2D electrostatic Particle <br>
In Cell code which simulates a plasma interacting with the shaped <br>
surface of adjacent divertor mono-blocks.<br>
This type of scenario requires solving the electrostatic Poisson <br>
equation on the whole set of grid nodes (a Cartesian grid), applying <br>
some boundary conditions.<br>
Currently, we are using the KSPSolve subroutine, set to apply the GMRES <br>
iterative method in conjunction with hypre (used as a preconditioner).<br>
Some boundary conditions are necessary for our specific problem <br>
(Dirichlet and Neumann conditions on specific lines of points).<br>
I have two small curiosities about the possibilities offered by your <br>
library, which is very interesting:<br>
<br>
1. are we using the best possible solver/preconditioner pair for our problem?<br>
<br>
2. currently, PETSc is compiled with OpenMP parallelization, and the <br>
iterative method is executed on the CPU.<br>
Is it possible to configure the build of your library to execute <br>
these iterations on an NVIDIA GPU? What are the best configuration <br>
options that you suggest for your library?<br>
<br>
Thank you in advance.<br>
Greetings,<br>
Vittorio Sciortino<br>
PhD student in Physics<br>
Bari, Italy<br>
<br>
Recently, I sent a subscription request to the users mailing list using <br>
another e-mail address, because this one could be deactivated in two or <br>
three months. Private e-mail: <a href="mailto:vsciortino.phdcourse@gmail.com" target="_blank">vsciortino.phdcourse@gmail.com</a><br>
-- <br>
Vittorio Sciortino<br>
</blockquote></div>