[petsc-users] Data decomposition in PETSc for Poisson solver

Michele Rosso mrosso at uci.edu
Fri Jun 29 19:22:10 CDT 2012


Hi Jed,

thank you for your reply.
I checked the PETSc version currently installed on my system (a Blue
Gene/P); it is 3.1-p2, so I will use the DA type.
Also, I cannot take advantage of DMDAGlobalToNatural, since version 3.1
does not support it.
I was able to solve a simple Poisson system successfully, but the
solution vector is not in the correct order, and I do not understand
how to fix this. Is there an equivalent to DMDAGlobalToNatural in
version 3.1?

Thank you,
Michele


On 06/26/2012 11:36 AM, Jed Brown wrote:
> On Tue, Jun 26, 2012 at 10:24 AM, Michele Rosso
> <mrosso at uci.edu> wrote:
>
>     Thanks Jed for your reply.
>
>     1) I will use the DMDA object type then. I am still not very clear
>     about the difference between DA and DMDA though.
>
>
> DM is the generic interface; DMDA is the structured grid
> implementation. It used to be called just DA and was managed a bit
> differently. Maybe you were looking at old docs?
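>
> In 3.1 the creation looks roughly like this (an untested sketch; mx
> and my stand for your global grid sizes, and ierr is the usual error
> flag):
>
>     #include "finclude/petsc.h"
>     #include "finclude/petscda.h"
>     DA             da
>     PetscErrorCode ierr
>     ! one dof per node, star stencil of width 1; pass your own
>     ! process counts in place of PETSC_DECIDE to match an existing
>     ! 2D decomposition
>     call DACreate2d(PETSC_COMM_WORLD, DA_NONPERIODIC,              &
>          DA_STENCIL_STAR, mx, my, PETSC_DECIDE, PETSC_DECIDE,      &
>          1, 1, PETSC_NULL_INTEGER, PETSC_NULL_INTEGER, da, ierr)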
>
>
>     2) I am not interested in having ghost nodes updated. I want the
>     proper values of the solution on the proper processor.
>     I have to fill the right-hand-side vector b with node-dependent
>     values (in contrast with ex45f.F, where the vector b is filled
>     with 1s, so there is no dependence on the grid location). Since
>     every processor defines "non-local" (according to the PETSc
>     internal ordering) components, the vector is rearranged, and so
>     is the solution vector. So every process will hold a part of the
>     solution that should belong to a different process. And this is
>     not about ghost cells.
>     Sorry for this long explanation, but I am trying to be as clear
>     as I can.
>
>
> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/DMDAGlobalToNaturalBegin.html
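>
> Roughly (an untested sketch with the current names, where da and x
> stand for your DMDA and global solution vector; 3.1 spells these with
> a plain DA prefix, i.e. DACreateNaturalVector and
> DAGlobalToNaturalBegin/End):
>
>     Vec xnat
>     call DMDACreateNaturalVector(da, xnat, ierr)
>     ! scatter x from the PETSc (per-process) ordering to the
>     ! natural (application) ordering
>     call DMDAGlobalToNaturalBegin(da, x, INSERT_VALUES, xnat, ierr)
>     call DMDAGlobalToNaturalEnd(da, x, INSERT_VALUES, xnat, ierr)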
>
>
>     Thank you for your help and patience,
>
>     Michele
>
>     On 06/26/2012 10:42 AM, Jed Brown wrote:
>>     On Tue, Jun 26, 2012 at 9:37 AM, Michele Rosso
>>     <mrosso at uci.edu> wrote:
>>
>>         Hi,
>>
>>         I need some help to use the PETSc library inside my Fortran
>>         95 code.
>>         My goal is to solve a symmetric linear system that derives
>>         from the finite difference discretization
>>         of the Poisson equation. I will use the preconditioned
>>         conjugate method.
>>
>>         I am mostly interested in how to decompose my data among the
>>         different processes.
>>         In particular:
>>
>>         1) Since my code already implements a 2D decomposition,
>>         would it be best to build the matrix with the DMDA object
>>         type, the DA object type, or the regular Mat type?
>>
>>
>>     It is certainly easiest to just use DMDA (and you will get
>>     geometric multigrid for free, which is unbeatable for this
>>     problem), but you can work directly with Mat if you prefer. See
>>     src/ksp/ksp/examples/tutorials/ex45f.F for a Fortran example.
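>>
>>     The solve itself is the usual KSP skeleton (an untested sketch,
>>     assuming A, b, and x are your assembled matrix and vectors and
>>     the usual finclude headers are included):
>>
>>         KSP ksp
>>         call KSPCreate(PETSC_COMM_WORLD, ksp, ierr)
>>         call KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN, ierr)
>>         call KSPSetType(ksp, KSPCG, ierr)  ! preconditioned CG
>>         call KSPSetFromOptions(ksp, ierr)  ! picks up e.g. -pc_type mg
>>         call KSPSolve(ksp, b, x, ierr)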
>>
>>
>>         2) After inserting values into a vector/matrix, PETSc
>>         performs any needed message passing of nonlocal components,
>>         so values locally contained on a process may be
>>         communicated to another process. How can I revert this at
>>         the end of the computation, that is, how can I be sure that
>>         the local solution vector contains the values associated
>>         with the grid nodes owned by the hosting process?
>>
>>
>>     Please read the section of the user's manual on local versus
>>     global spaces and the structured grid decompositions. If you use
>>     DM, there is DMGlobalToLocalBegin/End, which update the ghost
>>     points in the local vectors.
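>>
>>     For example (an untested sketch with the current names; in 3.1
>>     the calls are DACreateLocalVector and DAGlobalToLocalBegin/End,
>>     and da and x stand for your DM and global vector):
>>
>>         Vec xloc
>>         call DMCreateLocalVector(da, xloc, ierr)
>>         ! fill xloc, including its ghost points, from the global x
>>         call DMGlobalToLocalBegin(da, x, INSERT_VALUES, xloc, ierr)
>>         call DMGlobalToLocalEnd(da, x, INSERT_VALUES, xloc, ierr)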
>>
>>         Thank you,
>>
>>         Michele
>>

