[petsc-users] Parallel vector with shared memory in Fortran

Francesco Migliorini francescomigliorini93 at gmail.com
Mon Jun 5 11:20:55 CDT 2017


Dear Stefano,
Thank you for your answer. I tried to use VecScatterCreateToAll as you
suggested, but it does not seem to work: the first processor can still only
see its own part of the vector. Here is how I set up the code:

Vec    fePS
VecScatter    Scatter
(...)
call VecScatterCreateToAll(feP,Scatter,fePS,perr)
call VecScatterBegin(Scatter,feP,fePS,INSERT_VALUES,SCATTER_FORWARD,perr)
call VecScatterEnd(Scatter,feP,fePS,INSERT_VALUES,SCATTER_FORWARD,perr)
call VecScatterDestroy(Scatter,perr)
call VecDestroy(fePS,perr)

As I said, after this piece of code, if I print all the entries of feP from
one processor, the values are correct only for the entries owned by that
processor, while the remaining ones are random values.
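
For reference, the VecScatterCreateToAll manual page describes the output
vector (fePS here) as a sequential vector that holds every global entry on
every process, so the gathered values have to be read from fePS rather than
from feP. A minimal sketch of doing that, with the declarations as above plus
an F90 pointer, and with VecGetArrayF90 used purely as an illustration:

PetscScalar, pointer :: xx(:)
(...)
call VecScatterCreateToAll(feP,Scatter,fePS,perr)
call VecScatterBegin(Scatter,feP,fePS,INSERT_VALUES,SCATTER_FORWARD,perr)
call VecScatterEnd(Scatter,feP,fePS,INSERT_VALUES,SCATTER_FORWARD,perr)
! fePS now holds all global entries on every process; feP itself is unchanged
call VecGetArrayF90(fePS,xx,perr)
print *, xx    ! every rank sees the full vector here
call VecRestoreArrayF90(fePS,xx,perr)
call VecScatterDestroy(Scatter,perr)
call VecDestroy(fePS,perr)

With this pattern each process reads the complete vector through xx, while
feP keeps only its local part.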

Best regards,
Francesco

2017-06-05 16:44 GMT+02:00 Stefano Zampini <stefano.zampini at gmail.com>:

> Sorry, bad copy and paste
>
> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Vec/VecScatterCreateToAll.html
>
> On 05 Jun 2017 at 4:43 PM, "Stefano Zampini" <stefano.zampini at gmail.com>
> wrote:
>
>> petsc-current/docs/manualpages/Vec/VecScatterCreateToAll.html
>>
>> On 05 Jun 2017 at 4:12 PM, "Francesco Migliorini" <
>> francescomigliorini93 at gmail.com> wrote:
>>
>>> Hello there!
>>>
>>> I am working with an MPI code in which I need to create a PETSc vector
>>> such that all the processes can access all of its entries. I tried
>>> VecCreateShared, but it does not work on my machine. Then I tried
>>> VecCreateMPI, but it does not seem to behave any differently from the
>>> usual VecCreate. Finally I found the scatter commands, but the examples
>>> are a bit tricky. Is there any other way? If not, could someone please
>>> show me how to use a scatter in this simple code?
>>>
>>> Vec  feP    !The vector to be shared with all the processes
>>> (...)
>>> mpi_np = 2    !The number of processes
>>> ind(1) = 10   !The global dimension of the vector
>>> call VecCreate(PETSC_COMM_WORLD,feP,perr)
>>> call VecSetSizes(feP,PETSC_DECIDE,ind,perr)
>>> call VecSetFromOptions(feP,perr)
>>> (...)    !Here feP is filled in
>>> call VecAssemblyBegin(feP,perr)
>>> call VecAssemblyEnd(feP,perr)
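>>>
>>> For reference, a minimal sketch of checking which block of feP each
>>> process owns after assembly (VecGetOwnershipRange is used only as an
>>> illustration, and the declarations are abbreviated):
>>>
>>> PetscInt   istart, iend
>>> (...)
>>> ! each process owns the contiguous global block [istart, iend) of feP
>>> call VecGetOwnershipRange(feP,istart,iend,perr)
>>> print *, 'this rank owns entries', istart, 'to', iend-1
>>>
>>> With PETSC_DECIDE each process stores only its own contiguous block, so
>>> a scatter (e.g. VecScatterCreateToAll) is needed to see all the entries
>>> on one rank.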
>>>
>>> Many thanks,
>>> Francesco
>>>
>>

