[petsc-users] MPI Derived Data Type and Non Blocking MPI Send/Receive

Jed Brown jedbrown at mcs.anl.gov
Thu Sep 6 17:56:33 CDT 2012


Numeric data that the solver sees should be stored in Vecs. You can put
other scalars in Vecs if you like.
On Sep 6, 2012 5:48 PM, "Zhenglun (Alan) Wei" <zhenglun.wei at gmail.com>
wrote:

>  Dear Dr. Brown,
>      I'm not quite familiar with VecScatter. I just read its explanation;
> it seems to require that my data be stored as vectors (are these PETSc
> Vecs?). However, my data are stored as plain C arrays.
>      Is this a problem in MPI, or is it more likely a problem in my code?
>
> thanks,
> Alan
> On 9/6/2012 5:44 PM, Jed Brown wrote:
>
> Are you familiar with VecScatter?
> On Sep 6, 2012 5:38 PM, "Zhenglun (Alan) Wei" <zhenglun.wei at gmail.com>
> wrote:
>
>> Dear All,
>>      I hope you're having a nice day.
>>      I ran into a memory problem with MPI data communication. I guess this
>> is a good place to ask, since you are experts and may have experienced the
>> same problem before.
>>      I used MPI derived datatypes (MPI_Type_contiguous, MPI_Type_vector,
>> and MPI_Type_indexed) to communicate data for a simulation of a 3D
>> problem. The communication itself is fine, as I checked every single value
>> sent and received. However, memory usage keeps increasing during
>> communication. I therefore tested each of the three types:
>> MPI_Type_contiguous shows no problem, while MPI_Type_vector and
>> MPI_Type_indexed both accumulate memory. I tried MPI_Type_free, but it
>> does not help. Has anyone experienced this problem before?
>>      Could this be related to the non-blocking MPI communication
>> (MPI_Isend and MPI_Irecv)? I have to use non-blocking communication
>> because blocking communication is extremely slow when a lot of data is
>> involved.
>>      Is there any alternative in PETSc that can do work similar to MPI
>> derived types?
>>
>> thanks,
>> Alan
>>
>
>

