[petsc-users] Data Distribution from zero-rank process

Matthew Knepley knepley at gmail.com
Tue Aug 12 12:10:38 CDT 2014


On Tue, Aug 12, 2014 at 11:55 AM, Mari Pecha <pecha.mari at gmail.com> wrote:

>  Heurekaaaa, it works!!!!! Thank you so much! :)
>

Cool.


>  I've added my final code to the attachments for checking.
>
> So, can I ask now how I can create my own scatter?
>

I would not worry about that until after you have profiled your code to
see what takes the most time, using -log_summary.
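
For example, running with something like

    mpiexec -n 4 ./myapp -log_summary

(the binary name and process count here are illustrative) prints a
breakdown of where the time is spent at the end of the run.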

  Thanks,

     Matt


>
> Thank you for your time
> Mari
>
> On 12.8.2014 18:40, Matthew Knepley wrote:
>
>  On Tue, Aug 12, 2014 at 11:30 AM, Mari Pecha <pecha.mari at gmail.com>
> wrote:
>
>>  So, I've tried the approach with VecScatterCreateToZero, but I got this
>> error message:
>>
>> [0]PETSC ERROR: Nonconforming object sizes
>> [0]PETSC ERROR: Vector wrong size 10 for scatter 2 (scatter reverse and
>> vector to != ctx from size)
>>
>> I'm not sure what is wrong. Could you help me? I've added my code to the
>> attachments.
>>
>
>  Your vector arguments for the scatter are reversed: you are scattering
> from the local vector holding the pixel data into the global vector, so
> the two Vec arguments need to be swapped.
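>
>  A minimal sketch of the corrected call order (the names tozero, seq, and
> natural are illustrative, since the attachment is not reproduced here):
> given a context from VecScatterCreateToZero(natural, &tozero, &seq), the
> reverse scatter takes the vectors in the opposite order from the forward
> direction:
>
>     /* seq on rank 0 -> distributed natural vector; collective call */
>     ierr = VecScatterBegin( tozero, seq, natural, INSERT_VALUES, SCATTER_REVERSE ); CHKERRQ(ierr);
>     ierr = VecScatterEnd(   tozero, seq, natural, INSERT_VALUES, SCATTER_REVERSE ); CHKERRQ(ierr);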
>
>
>>  Thank you for your time
>> Mari
>>
>> P.S.: I use PETSc 3.5, where SCATTER_BACKWARD has been renamed to
>> SCATTER_REVERSE.
>>
>
>  Yes, that is right.
>
>      Matt
>
>
>>  On 12.8.2014 17:53, Matthew Knepley wrote:
>>
>>  On Tue, Aug 12, 2014 at 9:22 AM, Mari Pecha <pecha.mari at gmail.com>
>> wrote:
>>
>>>  Okay, I understand that my explanation seems chaotic. I will try to
>>> describe my problem with a simple example.
>>>
>>> The zero-rank process loads a grayscale image using the OpenCV library,
>>> and then I'd like to distribute the pixel values onto a grid (DMDA). I
>>> can't find any solution for this problem.
>>>
>>
>>  The right thing to do is to create a custom scatter for this case.
>> However, you can do this in two parts. First,
>> read it in on proc 0, and then scatter to all procs using
>>
>>
>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Vec/VecScatterCreateToZero.html
>>
>>  where you SCATTER_BACKWARD. Then use
>>
>>
>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/DMDANaturalToGlobalBegin.html
>>
>>  to permute from the natural ordering that you read in to the PETSc
>> ordering. You could do this in one
>> step if you allow each process to read its piece of the vector
>> independently.
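>>
>>  Put together, a minimal sketch of the two-step approach (the names da,
>> g, natural, seq, and tozero are illustrative, assuming a DMDA da and its
>> global vector g):
>>
>>     Vec        natural, seq;
>>     VecScatter tozero;
>>
>>     /* a parallel vector in the natural (lexicographic) ordering */
>>     ierr = DMDACreateNaturalVector( da, &natural ); CHKERRQ(ierr);
>>     /* the forward direction gathers natural onto rank 0 into seq */
>>     ierr = VecScatterCreateToZero( natural, &tozero, &seq ); CHKERRQ(ierr);
>>
>>     /* ... rank 0 fills seq with the pixel values here ... */
>>
>>     /* reverse scatter: seq on rank 0 -> natural; every rank must call this */
>>     ierr = VecScatterBegin( tozero, seq, natural, INSERT_VALUES, SCATTER_REVERSE ); CHKERRQ(ierr);
>>     ierr = VecScatterEnd(   tozero, seq, natural, INSERT_VALUES, SCATTER_REVERSE ); CHKERRQ(ierr);
>>
>>     /* permute from the natural ordering into the DMDA (PETSc) ordering */
>>     ierr = DMDANaturalToGlobalBegin( da, natural, INSERT_VALUES, g ); CHKERRQ(ierr);
>>     ierr = DMDANaturalToGlobalEnd(   da, natural, INSERT_VALUES, g ); CHKERRQ(ierr);
>>
>>     ierr = VecScatterDestroy( &tozero ); CHKERRQ(ierr);
>>     ierr = VecDestroy( &seq ); CHKERRQ(ierr);
>>     ierr = VecDestroy( &natural ); CHKERRQ(ierr);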
>>
>>     Thanks,
>>
>>        Matt
>>
>>
>>>
>>> Thank you
>>> Mari
>>>
>>> On 12.8.2014 16:12, Matthew Knepley wrote:
>>>
>>>  On Tue, Aug 12, 2014 at 8:53 AM, Mari Pecha <pecha.mari at gmail.com>
>>> wrote:
>>>
>>>> Good morning,
>>>>
>>>> I'm Mari, a beginner PETSc user, and I'd like to ask about a problem
>>>> that is tough, especially for me.
>>>> I am trying to distribute a data set, which is stored only on the
>>>> zero-rank process, to a DMDA global vector, so that every process ends
>>>> up storing an appropriate subset of the data set. I used the
>>>> DMDAGlobalToNaturalAllCreate and DMDANaturalAllToGlobalCreate functions,
>>>> and then VecScatterBegin and VecScatterEnd ... please see the code below
>>>>
>>>
>>>  I have no idea what you really want to do. How about describing your
>>> problem with 2 procs and 3 values per proc?
>>>
>>>
>>>>     ...
>>>>
>>>>     ierr = DMDAGlobalToNaturalAllCreate( da, &tolocalall   ); CHKERRQ(ierr);
>>>>     ierr = DMDANaturalAllToGlobalCreate( da, &fromlocalall ); CHKERRQ(ierr);
>>>>
>>>>     if( rank == 0  )
>>>>     {
>>>>
>>>>         ierr = VecCreateSeq( PETSC_COMM_SELF, SIZE, &localall ); CHKERRQ(ierr);
>>>>
>>>>         ierr = VecScatterBegin( tolocalall, x, localall, ADD_VALUES, SCATTER_FORWARD_LOCAL ); CHKERRQ(ierr);
>>>>         ierr = VecScatterEnd(   tolocalall, x, localall, ADD_VALUES, SCATTER_FORWARD_LOCAL ); CHKERRQ(ierr);
>>>>
>>>
>>>  This will not work since you have a collective call (VecScatterBegin)
>>> inside if(!rank)
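>>>
>>>  Schematically, only the rank-specific work belongs inside the guard
>>> (scatter, from, and to are placeholder names here):
>>>
>>>     if (rank == 0) {
>>>         /* fill the local vector with data; purely local work */
>>>     }
>>>     /* every rank must participate in the collective scatter */
>>>     ierr = VecScatterBegin( scatter, from, to, INSERT_VALUES, SCATTER_FORWARD ); CHKERRQ(ierr);
>>>     ierr = VecScatterEnd(   scatter, from, to, INSERT_VALUES, SCATTER_FORWARD ); CHKERRQ(ierr);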
>>>
>>>     Matt
>>>
>>>
>>>>         ierr = VecGetArray( localall, &vlocal ); CHKERRQ(ierr);
>>>>
>>>>         PetscInt s;
>>>>         ierr = VecGetSize( localall, &s ); CHKERRQ(ierr);
>>>>
>>>>         ierr = VecView( localall, PETSC_VIEWER_STDOUT_SELF ); CHKERRQ(ierr);
>>>>
>>>>         // create data on the zero-rank process
>>>>         for ( PetscInt i = 0; i < s; i++ ) vlocal[i] = i;
>>>>
>>>>         ierr = VecRestoreArray( localall, &vlocal ); CHKERRQ(ierr);
>>>>
>>>>         ierr = VecScatterBegin( fromlocalall, localall, x, ADD_VALUES, SCATTER_FORWARD_LOCAL ); CHKERRQ(ierr);
>>>>         ierr = VecScatterEnd(   fromlocalall, localall, x, ADD_VALUES, SCATTER_FORWARD_LOCAL ); CHKERRQ(ierr);
>>>>     }
>>>>
>>>>     ....
>>>>
>>>> But this piece of code gathers and distributes only the vector values
>>>> that belong to the zero-rank process. So I have no idea how to solve my
>>>> problem using only PETSc functions. I'd like to gather all values from
>>>> the global vector onto the zero-rank process, and then put the whole
>>>> data set back from the zero-rank process into the global vector. Can you
>>>> help me, please?
>>>>
>>>> Thanks for your response
>>>> Mari


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener