[petsc-users] Data decomposition in PETsc for Poisson solver
Jed Brown
jedbrown at mcs.anl.gov
Tue Jun 26 13:36:13 CDT 2012
On Tue, Jun 26, 2012 at 10:24 AM, Michele Rosso <mrosso at uci.edu> wrote:
> Thanks Jed for your reply.
>
> 1) I will use the DMDA object type then. I am still not very clear about
> the difference between DA and DMDA though.
>
DM is the generic interface; DMDA is the structured-grid implementation. It
used to be called just DA and was managed a bit differently. Maybe you were
looking at old docs?
>
> 2) I am not interested in having ghost nodes updated. I want the proper
> values of the solution on the proper processor.
> I have to fill the known-terms vector with node-dependent values
> (in contrast with ex45f.F, where the vector b is filled with 1s, so there is
> no dependence on the grid location). Since every processor defines
> "non-local" (according to the PETSc internal ordering) components, the
> vector is rearranged, and so is the solution vector. So on every process I
> will have a solution which partially belongs on a different process. And
> this is not about ghost cells.
> Sorry for the long explanation, but I am trying to be as clear as I
> can.
>
http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/DMDAGlobalToNaturalBegin.html
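For illustration, here is a small PETSc-free sketch of the reordering that
DMDAGlobalToNaturalBegin/End performs. The grid size, process layout, and
helper names below are all made up for this example; only the idea (per-process
contiguous numbering versus row-major "natural" numbering) matches PETSc.

```python
# PETSc-free sketch of the ordering issue DMDAGlobalToNaturalBegin solves.
# Assumed setup: a 4x4 grid split over a 2x2 process grid (a 2x2 patch each).
# "Natural" ordering numbers nodes row-major over the whole grid; PETSc's
# global ordering numbers each process's patch contiguously.

def natural_index(i, j, nx):
    """Row-major index of node (i, j) on an nx-wide grid."""
    return j * nx + i

def petsc_global_index(i, j, px, py, sub_nx, sub_ny):
    """Index when each process owns a contiguous block of numbers."""
    pi, pj = i // sub_nx, j // sub_ny          # owning process coordinates
    rank = pj * px + pi                        # row-major rank numbering
    li, lj = i % sub_nx, j % sub_ny            # local coordinates in the patch
    return rank * (sub_nx * sub_ny) + lj * sub_nx + li

nx = ny = 4
px = py = 2
sub_nx, sub_ny = nx // px, ny // py

# Permutation: perm[g] = n means the entry at PETSc-global position g
# belongs at natural (application-ordering) position n.
perm = [0] * (nx * ny)
for j in range(ny):
    for i in range(nx):
        perm[petsc_global_index(i, j, px, py, sub_nx, sub_ny)] = \
            natural_index(i, j, nx)

# A vector laid out in PETSc global ordering ...
v_global = list(range(nx * ny))
# ... scattered back to natural ordering (what GlobalToNatural does):
v_natural = [0] * (nx * ny)
for g, n in enumerate(perm):
    v_natural[n] = v_global[g]
```

In real code the scatter is built and applied by PETSc; this only shows why a
solution vector in the global ordering looks "rearranged" to the application.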
>
> Thank you for your help and patience,
>
> Michele
>
> On 06/26/2012 10:42 AM, Jed Brown wrote:
>
> On Tue, Jun 26, 2012 at 9:37 AM, Michele Rosso <mrosso at uci.edu> wrote:
>
>> Hi,
>>
>> I need some help to use the PETSc library inside my Fortran 95 code.
>> My goal is to solve a symmetric linear system that derives from the
>> finite-difference discretization of the Poisson equation. I will use the
>> preconditioned conjugate gradient method.
>>
>> I am mostly interested in how to decompose my data among the different
>> processes.
>> In particular:
>>
>> 1) Since my code already implements the 2D-decomposition, would it be
>> best to build the matrix with the DMDA object type, DA object type
>> or the regular Mat type?
>>
>
> It is certainly easiest to just use DMDA (and you will get geometric
> multigrid for free, which is unbeatable for this problem), but you can work
> directly with Mat if you prefer. See src/ksp/ksp/examples/tutorials/ex45f.F
> for a Fortran example.
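As a PETSc-free illustration of the solver under discussion, here is a minimal
unpreconditioned conjugate gradient sketch for the 5-point Poisson problem.
All sizes and names are illustrative assumptions; in practice you would use
PETSc's KSP with CG and, as suggested above, a geometric multigrid
preconditioner from the DMDA.

```python
# Sketch: unpreconditioned CG on -Laplacian(u) = f with zero Dirichlet
# boundaries, discretized on a small n x n interior grid (h = 1).

def apply_poisson(u, n):
    """y = A u for the 5-point Laplacian with zero Dirichlet boundary."""
    y = [0.0] * (n * n)
    for j in range(n):
        for i in range(n):
            k = j * n + i
            s = 4.0 * u[k]
            if i > 0:     s -= u[k - 1]
            if i < n - 1: s -= u[k + 1]
            if j > 0:     s -= u[k - n]
            if j < n - 1: s -= u[k + n]
            y[k] = s
    return y

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cg(b, n, tol=1e-10, maxit=1000):
    """Plain conjugate gradients; A is applied matrix-free."""
    x = [0.0] * (n * n)
    r = b[:]                                   # r = b - A*0
    p = r[:]
    rr = dot(r, r)
    for _ in range(maxit):
        Ap = apply_poisson(p, n)
        alpha = rr / dot(p, Ap)
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * ai for ri, ai in zip(r, Ap)]
        rr_new = dot(r, r)
        if rr_new ** 0.5 < tol:
            break
        p = [ri + (rr_new / rr) * pi for ri, pi in zip(r, p)]
        rr = rr_new
    return x

n = 8
b = [1.0] * (n * n)               # same right-hand side as ex45f.F: all ones
x = cg(b, n)
res = [bi - ai for bi, ai in zip(b, apply_poisson(x, n))]
```

Since A is symmetric positive definite, CG is guaranteed to converge; the
residual `res` should be at the requested tolerance on exit.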
>
>
>>
>> 2) After inserting values into a vector/matrix, PETSc performs any needed
>> message passing of nonlocal components, so values locally contained
>> on a process may be communicated to another process. How can I revert
>> this at the end of the computation, that is, how can I be sure that the
>> local solution vector contains the values associated with the grid nodes
>> contained in the hosting process?
>>
>
> Please read the section of the user's manual on local versus global
> spaces and the structured grid decompositions. If you use DM, there is
> DMGlobalToLocalBegin/End, which updates the ghost points in the local vectors.
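For illustration, the effect of DMGlobalToLocalBegin/End can be sketched
without PETSc as copying each rank's owned slice of the global vector plus a
layer of ghost values from its neighbours. The 1D layout, sizes, and function
name below are illustrative assumptions, and the loop over ranks emulates on
one process what MPI communication does in parallel.

```python
# Sketch: build each rank's local (ghosted) array from a global vector,
# the way a DM global-to-local scatter fills ghost points.

def global_to_local(v_global, nlocal, nprocs, width=1):
    """Return one ghosted local array per rank (1D decomposition)."""
    locals_ = []
    n = len(v_global)
    for rank in range(nprocs):
        start = rank * nlocal
        lo = max(start - width, 0)           # left ghost region, if any
        hi = min(start + nlocal + width, n)  # right ghost region, if any
        locals_.append(v_global[lo:hi])
    return locals_

v_global = list(range(12))      # 12 unknowns, 3 ranks, 4 owned by each
parts = global_to_local(v_global, nlocal=4, nprocs=3)
# rank 0 owns [0..3] and sees ghost 4; rank 1 owns [4..7] with ghosts 3 and 8
```

Note this is the opposite concern from the GlobalToNatural reordering: ghost
updates duplicate neighbour values locally, they do not move ownership.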
>
>
>>
>>
>> Thank you,
>>
>> Michele
>>
>