DA question

Amit.Itagi at seagate.com
Wed Apr 30 15:27:57 CDT 2008


owner-petsc-users at mcs.anl.gov wrote on 04/29/2008 01:28:17 PM:

>
>    If you are running a true explicit scheme then you have no need
> to ever have a "global representation" at each time step. In this
> case you can use DALocalToLocalBegin() then DALocalToLocalEnd()
> and pass the same vector in both locations. This will update the ghost
> points but WILL NOT do any copy of the local data since it is already
> in the correct locations.

Barry,

I implemented an explicit scheme using your suggestion, and it seems to
work. Now I want to output the data to a file in the natural ordering of
the 3D array. I guess I would need a global vector for this. Hence, I did

DACreateGlobalVector
DALocalToGlobal
DAGetAO

Thus, I have a global vector and the AO.
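
In code, the whole thing looks roughly like this (a sketch only, with
error checking omitted; the variable names are mine):

  DA       da;     /* created earlier with DACreate3d() */
  Vec      vl;     /* local (ghosted) vector used by the update */
  Vec      vg;     /* global vector for output */
  AO       ao;
  PetscInt step;

  for (step = 0; step < nsteps; step++) {
    /* ... explicit update of the locally owned entries of vl ... */
    /* refresh the ghost points in place, same vector twice */
    DALocalToLocalBegin(da, vl, INSERT_VALUES, vl);
    DALocalToLocalEnd(da, vl, INSERT_VALUES, vl);
  }

  DACreateGlobalVector(da, &vg);
  DALocalToGlobal(da, vl, INSERT_VALUES, vg);
  DAGetAO(da, &ao);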

Now, how do I access the vector elements in the AO order? Eventually, I
will use PetscFPrintf for writing.
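
From the manual, I also noticed DACreateNaturalVector() and the
DAGlobalToNatural routines. Would something along these lines (an
untested guess on my part) be the intended path, or should I go through
the AO directly?

  Vec vn;
  DACreateNaturalVector(da, &vn);
  DAGlobalToNaturalBegin(da, vg, INSERT_VALUES, vn);
  DAGlobalToNaturalEnd(da, vg, INSERT_VALUES, vn);
  /* vn should now hold the values in natural (i,j,k) ordering */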

Thanks

Rgds,
Amit



>
>    Barry
>
> On Apr 29, 2008, at 8:54 AM, Amit.Itagi at seagate.com wrote:
>
> > Hi,
> >
> > I spent some more time understanding DA's, and how DA's should serve
> > my purpose. Since in the time domain calculation, I will have to
> > scatter from the global vector to the local vector and vice-versa at
> > every iteration step, I have some follow-up questions.
> >
> > 1) Does the scattering involve copying the part stored on the local
> > node as well (i.e. part of the local vector other than the ghost
> > values), or is the local part just accessed by reference? In the
> > first scenario, this would involve allocating twice the storage for
> > the local part. Also, does the scattering of the local part give a
> > big hit in terms of CPU time?
> >
> > 2) In the manual, it says "In most cases, several different vectors
> > can share the same communication information (or, in other words,
> > can share a given DA)" and "PETSc currently provides no container
> > for multiple arrays sharing the same distributed array
> > communication; note, however, that the dof parameter handles many
> > cases of interest". I am a bit confused. Suppose I have two arrays
> > having the same layout on the regular grid, can I store the first
> > array data on one vector, and the second array data on the second
> > vector (and have a DA with dof=1, instead of a DA with dof=2), and
> > be able to scatter and update the first vector without
> > scattering/updating the second vector?
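> >
> > For concreteness, a sketch of what I mean:
> >
> >   Vec g1, g2, l1;
> >   DACreateGlobalVector(da, &g1);  /* first array */
> >   VecDuplicate(g1, &g2);          /* second array, same layout */
> >   DACreateLocalVector(da, &l1);
> >   /* update the ghost values of the first array only */
> >   DAGlobalToLocalBegin(da, g1, INSERT_VALUES, l1);
> >   DAGlobalToLocalEnd(da, g1, INSERT_VALUES, l1);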
> >
> > Thanks
> >
> > Rgds,
> > Amit
> >
> > owner-petsc-users at mcs.anl.gov wrote on 04/09/2008 04:09:59 PM:
> >
> >> Hi Amit,
> >>
> >> Why do you need two staggered grids? I do EM finite difference
> >> frequency domain modeling on a staggered grid using just one DA.
> >> Works perfectly fine. There are some grid points that are not used,
> >> but you just set them to zero and put a 1 on the diagonal of the
> >> coefficient matrix.
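> >>
> >> A sketch of that trick (assuming the matrix comes from DAGetMatrix(),
> >> so MatSetValuesStencil() applies, and dof = 1):
> >>
> >>   MatStencil  row;
> >>   PetscScalar one = 1.0;
> >>   row.i = i; row.j = j; row.k = k;  /* an unused grid point */
> >>   MatSetValuesStencil(A, 1, &row, 1, &row, &one, INSERT_VALUES);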
> >>
> >>
> >> Randy
> >>
> >>
> >> Amit.Itagi at seagate.com wrote:
> >>> Hi Berend,
> >>>
> >>> A detailed explanation of the finite difference scheme is given
> >>> here:
> >>>
> >>> http://en.wikipedia.org/wiki/Finite-difference_time-domain_method
> >>>
> >>>
> >>> Thanks
> >>>
> >>> Rgds,
> >>> Amit
> >>>
> >>>
> >>>
> >>>
> >>> Berend van Wachem <berend at chalmers.se>
> >>> Sent by: owner-petsc-users at mcs.anl.gov
> >>> Date: 04/09/2008 02:59 PM
> >>> To: petsc-users at mcs.anl.gov
> >>> Subject: Re: DA question
> >>> Please respond to: petsc-users at mcs.anl.gov
> >>>
> >>> Dear Amit,
> >>>
> >>> Could you explain how the two grids are attached?
> >>> I am using multiple DA's for multiple structured grids glued
> >>> together.
> >>> I've done the gluing with setting up various IS objects. From the
> >>> multiple DA's, one global variable vector is formed. Is that what
> >>> you
> >>> are looking for?
> >>>
> >>> Best regards,
> >>>
> >>> Berend.
> >>>
> >>>
> >>> Amit.Itagi at seagate.com wrote:
> >>>> Hi,
> >>>>
> >>>> Is it possible to use DA to perform finite differences on two
> >>>> staggered regular grids (as in the electromagnetic finite
> >>>> difference time domain method)? Surrounding nodes from one grid
> >>>> are used to update the value in the dual grid. In addition, local
> >>>> manipulations need to be done on the nodal values.
> >>>>
> >>>> Thanks
> >>>>
> >>>> Rgds,
> >>>> Amit
> >>>>
> >>>
> >>>
> >>>
> >>
> >
>



