[petsc-users] Diagnosing Poisson Solver Behavior

K. N. Ramachandran knram06 at gmail.com
Thu Oct 15 11:15:13 CDT 2015


Hello Matt,

On Thu, Oct 15, 2015 at 11:46 AM, Matthew Knepley <knepley at gmail.com> wrote:

>
> This seems like the hardest way to do this. We have running examples, that
> scale well, and produce exactly the matrix
> you are using. In addition, they create the matrix in parallel, so the
> whole thing is scalable.
>
> In order to do it the way you want, you would need to match the
> partitioning, which is hard.
>

Yes, I understand. I am trying to keep the solver module separate from the
problem physics (like boundary conditions); i.e., my code has knowledge of
the physics and generates the matrix in sparse format, then hands it over
to a solver module (say, PETSc) which solves it efficiently.

Examples like snes/ex5.c, ksp/ex45.c, etc. use knowledge of the boundary
conditions to fill the matrix and the RHS, and that makes sense if the
entire code is written using PETSc's native data structures. But from a
modularity point of view, if my problem needs a different partitioning
based on the physics, I feel my code should generate the relevant local
matrices on each rank and hand them to PETSc.
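
Roughly, what I have in mind is something like the sketch below. The 1D
Laplacian stencil is only a placeholder for the rows my code would actually
generate, and error checking (CHKERRQ) is dropped for brevity:

#include <petscksp.h>

/* Minimal sketch: the application owns the discretization and fills the
 * locally owned rows; PETSc only sees an assembled matrix and RHS.
 * The stencil below stands in for the application's own sparse rows. */
int main(int argc, char **argv)
{
  Mat         A;
  Vec         x, b;
  KSP         ksp;
  PetscInt    N = 100, rstart, rend, i, cols[3];
  PetscScalar vals[3];

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* Let PETSc pick the row distribution here; the application could
     instead pass its own local sizes to match its partitioning. */
  MatCreate(PETSC_COMM_WORLD, &A);
  MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N);
  MatSetFromOptions(A);
  MatSetUp(A);  /* real code would preallocate, e.g. MatMPIAIJSetPreallocation */

  MatGetOwnershipRange(A, &rstart, &rend);
  for (i = rstart; i < rend; i++) {
    PetscInt ncols = 0;
    if (i > 0)     { cols[ncols] = i - 1; vals[ncols] = -1.0; ncols++; }
    cols[ncols] = i;  vals[ncols] = 2.0;  ncols++;
    if (i < N - 1) { cols[ncols] = i + 1; vals[ncols] = -1.0; ncols++; }
    MatSetValues(A, 1, &i, ncols, cols, vals, INSERT_VALUES);
  }
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

  /* RHS and solution vectors with the same row distribution as A */
  MatCreateVecs(A, &x, &b);
  VecSet(b, 1.0);

  /* The "solver module" part: independent of the physics above */
  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetOperators(ksp, A, A);
  KSPSetFromOptions(ksp);      /* e.g. -ksp_type cg -pc_type gamg */
  KSPSolve(ksp, b, x);

  KSPDestroy(&ksp);
  VecDestroy(&x);
  VecDestroy(&b);
  MatDestroy(&A);
  PetscFinalize();
  return 0;
}

With something like this, the solver choices (-ksp_type, -pc_type, etc.)
stay entirely on the PETSc side of the interface.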

I am trying to simulate the movement of charged particles in a domain,
and these particles become collimated along a particular direction. So one
way to partition might be 1D slabs perpendicular to the collimation
direction, so that each rank can calculate the field and move the
particles at each timestep. Of course, communication across the slabs
would be inevitable, but this way the partitioning follows the physics of
the problem, and the solver module just solves the system efficiently
without having to know the boundary conditions, which makes sense to me
overall.
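
To make the slab idea concrete, here is a rough fragment (assuming, just
for illustration, that the collimation direction is z, the grid has
nx*ny*nz nodes in natural ordering, and each rank owns a contiguous block
of z-planes):

  /* Hypothetical grid sizes and decomposition: slabs of z-planes,
     one contiguous block per rank, z taken as the collimation direction. */
  PetscInt    nx = 64, ny = 64, nz = 128;
  PetscMPIInt rank, size;
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
  MPI_Comm_size(PETSC_COMM_WORLD, &size);

  /* Local number of z-planes (remainder spread over the first ranks),
     hence a contiguous block of matrix rows in natural ordering. */
  PetscInt my_nz   = nz / size + (rank < nz % size ? 1 : 0);
  PetscInt my_rows = nx * ny * my_nz;

  Mat B;
  MatCreate(PETSC_COMM_WORLD, &B);
  MatSetSizes(B, my_rows, my_rows, nx*ny*nz, nx*ny*nz);
  MatSetFromOptions(B);
  MatSetUp(B);
  /* ... MatSetValues for the locally owned planes, as in the sketch above ... */

The point is just that the local row counts come from the physics-side
decomposition, and the solver only sees local/global sizes plus the
assembled entries.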

Hope this description helps.


Thanking You,
K.N.Ramachandran
Ph: 814-441-4279

