[petsc-users] newbie question on the parallel allocation of matrices

Matthew Knepley knepley at gmail.com
Fri Dec 2 10:03:17 CST 2011


On Fri, Dec 2, 2011 at 9:59 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>
> On Dec 2, 2011, at 9:52 AM, Treue, Frederik wrote:
>
> >
> >
> > From: petsc-users-bounces at mcs.anl.gov [mailto:petsc-users-bounces at mcs.anl.gov] On Behalf Of Matthew Knepley
> > Sent: Friday, December 02, 2011 4:32 PM
> > To: PETSc users list
> > Subject: Re: [petsc-users] newbie question on the parallel allocation of matrices
> >
> > On Fri, Dec 2, 2011 at 9:25 AM, Treue, Frederik <frtr at risoe.dtu.dk> wrote:
> >
> >
> > From: petsc-users-bounces at mcs.anl.gov [mailto:petsc-users-bounces at mcs.anl.gov] On Behalf Of Matthew Knepley
> > Sent: Friday, December 02, 2011 4:01 PM
> > To: PETSc users list
> > Subject: Re: [petsc-users] newbie question on the parallel allocation of matrices
> >
> > On Fri, Dec 2, 2011 at 8:58 AM, Treue, Frederik <frtr at risoe.dtu.dk> wrote:
> >
> >
> > From: petsc-users-bounces at mcs.anl.gov [mailto:petsc-users-bounces at mcs.anl.gov] On Behalf Of Jed Brown
> > Sent: Friday, December 02, 2011 1:32 PM
> > To: PETSc users list
> > Subject: Re: [petsc-users] newbie question on the parallel allocation of matrices
> >
> > On Fri, Dec 2, 2011 at 03:32, Treue, Frederik <frtr at risoe.dtu.dk> wrote:
> > OK, but that example seems to assume that you wish to connect only one
> > matrix (the Jacobian) to a DA, whereas I wish to specify many. I think I
> > found this done in KSP ex39; is that example doing anything deprecated, or
> > will it work for me, e.g. with the various basic Mat routines (MatMult,
> > MatAXPY, etc.) in a multiprocessor setup?
> >
> > What do you mean by wanting many matrices? How do you want to use them?
> > There is DMCreateMatrix() (misnamed DMGetMatrix() in petsc-3.2), which you
> > can use as many times as you want.
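A minimal sketch of what repeated calls look like (the helper name BuildOperators and the roles of the two matrices are assumptions; petsc-3.2 spells the call DMGetMatrix, later releases DMCreateMatrix):

#include <petscdmda.h>

/* Sketch: obtain several independently preallocated parallel matrices from
   one DMDA.  Uses the petsc-3.2 spelling DMGetMatrix(); later releases
   rename this to DMCreateMatrix(). */
PetscErrorCode BuildOperators(DM da, Mat *A, Mat *B)
{
  PetscErrorCode ierr;

  PetscFunctionBegin;
  /* Each call returns a new matrix with the DMDA's parallel layout and
     stencil-based preallocation already set up. */
  ierr = DMGetMatrix(da, MATAIJ, A);CHKERRQ(ierr);  /* e.g. a Laplacian         */
  ierr = DMGetMatrix(da, MATAIJ, B);CHKERRQ(ierr);  /* e.g. a gradient operator */
  PetscFunctionReturn(0);
}

Each matrix can then be filled with MatSetValuesStencil() and used with MatMult(), MatAXPY(), and so on, in parallel.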
> >
> > And this was the one I needed. However, I have another question: what
> > does DMDA_BOUNDARY_GHOSTED do, compared to DMDA_BOUNDARY_PERIODIC? From
> > experience I now know that the PERIODIC option automagically does the right
> > thing when I’m defining matrices, so I can simply specify the same stencil
> > at all points. Does DMDA_BOUNDARY_GHOSTED do something similar?
>
>    No, nothing to do with matrices, because that extra point is a fixed
> (Dirichlet) value, and so the derivative of the contribution from that
> value is zero.
>
> > And if so, how is it controlled, i.e. how do I specify whether I’ve got
> > Neumann or Dirichlet conditions, what order of extrapolation I want, and
> > so forth? And if not, does it then ONLY make a difference if I’m working
> > with more than one processor, i.e. if everything is sequential, are
> > DMDA_BOUNDARY_GHOSTED and DMDA_BOUNDARY_NONE equivalent?
>
> No, they are not. This option has nothing to do with parallelism.
> >
> > GHOSTED adds extra space at the boundary so you can always use the same
> > stencil, but you decide what goes in there.
> >
> > Does this apply to both matrices and vectors, i.e. will the ghost points
> > be considered part of my computational domain or not?
> >
> > The ghost nodes only exist in local vectors, not the global vectors for
> > the solver.
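Roughly, the distinction looks like this in code (a sketch only; the helper name is made up and "da" is assumed to be an existing 2D DMDA):

#include <petscdmda.h>

/* Sketch: ghost points exist only in the local vector obtained from the DM.
   The global vector x is what the solver sees; the local vector xl carries
   the extra ghost layer (and, with GHOSTED, the extra boundary slots). */
PetscErrorCode AccessWithGhosts(DM da, Vec x)
{
  PetscErrorCode ierr;
  Vec            xl;

  PetscFunctionBegin;
  ierr = DMGetLocalVector(da, &xl);CHKERRQ(ierr);
  ierr = DMGlobalToLocalBegin(da, x, INSERT_VALUES, xl);CHKERRQ(ierr);
  ierr = DMGlobalToLocalEnd(da, x, INSERT_VALUES, xl);CHKERRQ(ierr);
  /* Read xl via DMDAVecGetArray(); loop bounds from DMDAGetGhostCorners()
     include the ghosted region, those from DMDAGetCorners() do not. */
  ierr = DMRestoreLocalVector(da, &xl);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}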
> >
> > OK? So how does one implement boundary conditions? Normally I would
> > include (say) one extra point over the edge of the domain (the ghost point)
> > and then implement the equation (if I start out with Ax=b, A and b known, x
> > desired, and Dirichlet boundary conditions)
> > x_G+x_1=2*b_B, where x_G is the unknown at the ghost point, x_1 is the
> > unknown at the first “real” point, and b_B is my Dirichlet boundary
> > condition.
> > Thus, I need a special stencil in the first and last rows (i.e. rows -1
> > and nx, with nx internal points) of my matrix, but this leads to memory
> > errors. Is this possible while using GHOSTED? As I understand it, GHOSTED
> > also deals with the MPI communication, so I’d like to retain it instead of
> > working with NONE.
>
>    Ghosted has nothing to do with MPI communication!  Ghosted vs None is
> only about the physical boundary, nothing to do with the boundary between
> domains.
>
>    Ghosted is only for the (nonlinear) function evaluation. It is not for
> storage of the Jacobian/matrix.
>
>  I suggest just using NONE and making your life and understanding easier.
> You can do Neumann and Dirichlet boundary conditions with NONE just fine.


You can see us apply Dirichlet conditions with NONE in SNES ex5.
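For a concrete picture, a sketch in the same spirit (not the actual ex5 source; the function name, the 5-point Laplacian, and the unit grid spacing are assumptions): with DMDA_BOUNDARY_NONE, Dirichlet rows can simply be written as identity rows, with the boundary values placed in the right-hand side.

#include <petscdmda.h>

/* Sketch: assemble a matrix on a 2D DMDA created with DMDA_BOUNDARY_NONE.
   Physical boundary rows become identity rows (the Dirichlet value goes
   into b); interior rows carry a 5-point Laplacian with unit grid spacing. */
PetscErrorCode FillMatrix(DM da, Mat A)
{
  PetscErrorCode ierr;
  PetscInt       i, j, xs, ys, xm, ym, mx, my;
  MatStencil     row, col[5];
  PetscScalar    v[5];

  PetscFunctionBegin;
  ierr = DMDAGetInfo(da, PETSC_NULL, &mx, &my, PETSC_NULL, PETSC_NULL, PETSC_NULL,
                     PETSC_NULL, PETSC_NULL, PETSC_NULL, PETSC_NULL, PETSC_NULL,
                     PETSC_NULL, PETSC_NULL);CHKERRQ(ierr);
  ierr = DMDAGetCorners(da, &xs, &ys, PETSC_NULL, &xm, &ym, PETSC_NULL);CHKERRQ(ierr);
  for (j = ys; j < ys + ym; j++) {
    for (i = xs; i < xs + xm; i++) {
      row.i = i; row.j = j;
      if (i == 0 || j == 0 || i == mx - 1 || j == my - 1) {
        /* physical boundary: u_ij = g_ij, so the row is just the diagonal */
        v[0] = 1.0;
        ierr = MatSetValuesStencil(A, 1, &row, 1, &row, v, INSERT_VALUES);CHKERRQ(ierr);
      } else {
        /* interior: standard 5-point Laplacian stencil */
        v[0] = -1.0; col[0].i = i;     col[0].j = j - 1;
        v[1] = -1.0; col[1].i = i - 1; col[1].j = j;
        v[2] =  4.0; col[2].i = i;     col[2].j = j;
        v[3] = -1.0; col[3].i = i + 1; col[3].j = j;
        v[4] = -1.0; col[4].i = i;     col[4].j = j + 1;
        ierr = MatSetValuesStencil(A, 1, &row, 5, col, v, INSERT_VALUES);CHKERRQ(ierr);
      }
    }
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

The corresponding entries of b on boundary rows hold the Dirichlet values, which is what makes the extra ghost-point equation unnecessary here.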

   Matt


>
>   Barry
>
>
>
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

