[petsc-users] about array in VecCreateMPIWithArray()

Matthew Knepley knepley at gmail.com
Fri Apr 13 11:01:34 CDT 2012


On Fri, Apr 13, 2012 at 11:59 AM, recrusader <recrusader at gmail.com> wrote:

> Thanks, Matt.
>
> If I have one array and each processor has a copy of all the values,
> one possible option is
>

1) If all processors have all values, the code is already not scalable.

2) If everyone has all values, only set the values you own.
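
For example, a minimal untested sketch; N stands in for your global size
and vals[] for the replicated array on every rank:

   Vec            v;
   PetscInt       i, rstart, rend;
   PetscErrorCode ierr;

   ierr = VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, N, &v);CHKERRQ(ierr);
   ierr = VecGetOwnershipRange(v, &rstart, &rend);CHKERRQ(ierr);
   for (i = rstart; i < rend; i++) {
     /* insert only the entries this rank owns */
     ierr = VecSetValue(v, i, vals[i], INSERT_VALUES);CHKERRQ(ierr);
   }
   /* no communication is needed here, since only owned entries were set */
   ierr = VecAssemblyBegin(v);CHKERRQ(ierr);
   ierr = VecAssemblyEnd(v);CHKERRQ(ierr);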

   Matt


> VecCreateMPI();
> VecSetValues();
> VecAssemblyBegin();
> VecAssemblyEnd();
>
> Any better ideas?
>
> If I have one array and only one processor has a copy of all the
> values, how do I generate an MPI vector?
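
If only one rank (say rank 0) holds the values, one untested sketch is to
let that rank insert every entry and let the assembly routines ship each
value to its owner. N and vals[] are placeholders; note that every rank
must call the assembly routines, and rank 0 is a serial bottleneck:

   Vec            v;
   PetscInt       i;
   PetscMPIInt    rank;
   PetscErrorCode ierr;

   ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);
   ierr = VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, N, &v);CHKERRQ(ierr);
   if (!rank) {
     for (i = 0; i < N; i++) {
       /* off-process entries are cached and communicated at assembly */
       ierr = VecSetValue(v, i, vals[i], INSERT_VALUES);CHKERRQ(ierr);
     }
   }
   ierr = VecAssemblyBegin(v);CHKERRQ(ierr);
   ierr = VecAssemblyEnd(v);CHKERRQ(ierr);
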
>
> Best,
> Yujie
>
> On Fri, Apr 13, 2012 at 10:38 AM, Matthew Knepley <knepley at gmail.com> wrote:
>
>> On Fri, Apr 13, 2012 at 11:37 AM, recrusader <recrusader at gmail.com> wrote:
>>
>>> Dear PETSc developers,
>>>
>>> my question is: what type of array is needed for VecCreateMPIWithArray()?
>>> Is it parallel distributed, so that each processor holds only its local
>>> values of the array? Or does each processor have a copy of all the values
>>> of the array?
>>>
>>>
>> Only local values.
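>>
>> For example, an untested sketch; nlocal and local[] are placeholders for
>> the number of locally owned entries and a pointer to them. Note the Vec
>> does not copy the array, so local[] must stay valid while v is in use:
>>
>>    Vec            v;
>>    PetscInt       nlocal;  /* number of entries this rank owns */
>>    PetscScalar   *local;   /* the nlocal locally owned values only */
>>    PetscErrorCode ierr;
>>
>>    ierr = VecCreateMPIWithArray(PETSC_COMM_WORLD, nlocal, N, local, &v);CHKERRQ(ierr);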
>>
>>    Matt
>>
>>
>>> PetscErrorCode  VecCreateMPIWithArray(MPI_Comm comm,PetscInt n,PetscInt N,const PetscScalar array[],Vec *vv)
>>>
>>> I didn't find more information about the array in the manual.
>>>
>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Vec/VecCreateMPIWithArray.html
>>>
>>> Thanks a lot,
>>> Yujie
>>>
>>
>>
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener