[petsc-users] Are there any functions (objects) for scattering and gathering 64-bit integers, just like VecScatter for PetscScalar?

Fande Kong fd.kong at siat.ac.cn
Fri Nov 2 18:08:41 CDT 2012


Thank you, now I get it. Sorry for bothering you.
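[Editor's note: for readers arriving from the subject line, the object discussed below, PetscSF, plays the VecScatter role for arbitrary unit types, including 64-bit PetscInt. The following is a minimal sketch, not code from this thread; the calls follow recent PETSc releases (PetscCall, and a PetscSFBcastBegin that takes an MPI_Op), while older versions use the ierr/CHKERRQ style and omit the op argument.]

/* Minimal sketch: scatter/gather 64-bit integer data with PetscSF.
 * PetscInt is 64-bit when PETSc is configured with --with-64-bit-indices;
 * ranks, sizes, and data values here are invented for illustration. */
#include <petscsf.h>

int main(int argc, char **argv)
{
  PetscSF     sf;
  PetscInt    nroots = 2, nleaves = 2;
  PetscSFNode remotes[2];
  PetscInt    rootdata[2], leafdata[2];
  PetscMPIInt rank, size;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCallMPI(MPI_Comm_rank(PETSC_COMM_WORLD, &rank));
  PetscCallMPI(MPI_Comm_size(PETSC_COMM_WORLD, &size));

  /* Each rank owns two root integers; its two leaves read the roots of
     the next rank (cyclically), like a VecScatter but on PetscInt. */
  for (PetscInt i = 0; i < nleaves; i++) {
    remotes[i].rank  = (rank + 1) % size;
    remotes[i].index = i;
  }
  rootdata[0] = 100 * rank;
  rootdata[1] = 100 * rank + 1;

  PetscCall(PetscSFCreate(PETSC_COMM_WORLD, &sf));
  PetscCall(PetscSFSetGraph(sf, nroots, nleaves, NULL, PETSC_COPY_VALUES, remotes, PETSC_COPY_VALUES));
  PetscCall(PetscSFSetFromOptions(sf));

  /* MPIU_INT matches PetscInt whether it is 32- or 64-bit. */
  PetscCall(PetscSFBcastBegin(sf, MPIU_INT, rootdata, leafdata, MPI_REPLACE));
  PetscCall(PetscSFBcastEnd(sf, MPIU_INT, rootdata, leafdata, MPI_REPLACE));

  PetscCall(PetscSFDestroy(&sf));
  PetscCall(PetscFinalize());
  return 0;
}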

On Fri, Nov 2, 2012 at 5:04 PM, Jed Brown <jedbrown at mcs.anl.gov> wrote:

> For the third time, look at the title of the issue I linked. I'm not going
> to answer any more questions that are directly addressed in that issue
> because I don't have time to repeat myself so many times _and_ get useful
> things done.
>
> This is higher priority now because we are using PetscSF in some more
> visible places.
> On Nov 2, 2012 5:57 PM, "Fande Kong" <fd.kong at siat.ac.cn> wrote:
>
>> Yes, I haven't tested my code with MPICH, because MPICH doesn't support
>> InfiniBand on the supercomputer. But I have tested the code with MVAPICH,
>> and it also hits the bug when the mesh is larger than 50M.
>>
>> Now I just want to know whether you will switch to point-to-point
>> communication. If not, I will try to rewrite PetscSF myself.
>>
>> Thanks,
>>
>> On Fri, Nov 2, 2012 at 4:41 PM, Jed Brown <jedbrown at mcs.anl.gov> wrote:
>>
>>>
>>> On Nov 2, 2012 5:24 PM, "Fande Kong" <fd.kong at siat.ac.cn> wrote:
>>> >
>>> > I see. You still haven't fixed the bugs I encountered a few months
>>> > ago. None of the current MPI implementations handle MPI_Accumulate
>>> > (or the other one-sided communication routines) well when the data
>>> > is large.
>>>
>>> I have not had problems with MPICH.
>>>
>>
>>
>>
>> --
>> Fande Kong
>> ShenZhen Institutes of Advanced Technology
>> Chinese Academy of Sciences
>>
>>
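[Editor's note: the quoted exchange above is about PetscSF's use of MPI one-sided communication. As a point of reference only, here is a generic sketch of the MPI_Accumulate pattern being discussed; it is not PetscSF's actual implementation, and the neighbor choice, counts, and values are invented for illustration.]

/* One-sided pattern: each rank accumulates 64-bit integer contributions
 * directly into the next rank's exposed memory window inside a fence
 * epoch, with no matching receive on the target side. */
#include <mpi.h>
#include <stdint.h>
#include <stdio.h>

int main(int argc, char **argv)
{
  int     rank, size;
  MPI_Win win;
  int64_t local[2] = {0, 0};   /* memory exposed through the window */
  int64_t contrib[2];

  MPI_Init(&argc, &argv);
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);
  MPI_Comm_size(MPI_COMM_WORLD, &size);

  contrib[0] = rank;
  contrib[1] = 10 * rank;

  MPI_Win_create(local, 2 * sizeof(int64_t), sizeof(int64_t),
                 MPI_INFO_NULL, MPI_COMM_WORLD, &win);

  /* Fence epoch: add this rank's contribution into the next rank's window. */
  MPI_Win_fence(0, win);
  MPI_Accumulate(contrib, 2, MPI_INT64_T, (rank + 1) % size,
                 0, 2, MPI_INT64_T, MPI_SUM, win);
  MPI_Win_fence(0, win);

  printf("[%d] received %lld %lld\n", rank,
         (long long)local[0], (long long)local[1]);

  MPI_Win_free(&win);
  MPI_Finalize();
  return 0;
}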
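[Editor's note: for contrast, the same cyclic exchange written with plain point-to-point messages, the style the quoted message proposes falling back to. Again a self-contained illustrative sketch rather than anything taken from PETSc.]

/* Two-sided equivalent: post the receive, post the send, wait on both. */
#include <mpi.h>
#include <stdint.h>
#include <stdio.h>

int main(int argc, char **argv)
{
  int         rank, size, prev, next;
  int64_t     sendbuf[2], recvbuf[2];
  MPI_Request reqs[2];

  MPI_Init(&argc, &argv);
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);
  MPI_Comm_size(MPI_COMM_WORLD, &size);
  next = (rank + 1) % size;          /* rank we send to   */
  prev = (rank + size - 1) % size;   /* rank we hear from */

  sendbuf[0] = rank;
  sendbuf[1] = 10 * rank;

  MPI_Irecv(recvbuf, 2, MPI_INT64_T, prev, 0, MPI_COMM_WORLD, &reqs[0]);
  MPI_Isend(sendbuf, 2, MPI_INT64_T, next, 0, MPI_COMM_WORLD, &reqs[1]);
  MPI_Waitall(2, reqs, MPI_STATUSES_IGNORE);

  printf("[%d] received %lld %lld\n", rank,
         (long long)recvbuf[0], (long long)recvbuf[1]);

  MPI_Finalize();
  return 0;
}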


-- 
Fande Kong
ShenZhen Institutes of Advanced Technology
Chinese Academy of Sciences