<div dir="ltr">Dear Lisandro:<br><br>Thank you very much for your help. <br>Our basic idea is <br>main()<br>{<p>step 1: Initialize Petsc and Slepc;</p>
<p>step 2: Use Petsc; (use all N nodes in one process group)</p><p>step
3: Use Slepc; (N nodes is divided into M process groups. these groups
are indepedent. However, they need to communicate with each other)</p><p>step 4: Use Petsc; (use all N nodes in one process group)</p>
<p>}</p>Assuming, the dimension of the whole matrix is N*N when using all Nodes in one process group. At the end of step 2, I need to get M different matrices and vectors (I should be able to make them be stored in M single different nodes which belong to M different process group.). Before step3, I need to scatter M matrices and vectors in M different process groups. Then, I can compute based on M matrices and vectors in M subcommunication domains. After calculating, I need to collect M solution vectors back to their parent communication domain. In Step4, I use this solution to further compute. Could you give me any further advice? thanks again.<br>
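To make the idea concrete, here is a minimal sketch of step 3 (not my actual
code): M = 4 groups, the color = rank % M assignment, and the 1-D Laplacian
standing in for the real matrices from step 2 are all placeholders, and error
checking is omitted. The point is that every PETSc/SLEPc object is created on
the subcommunicator rather than on PETSC_COMM_WORLD:

#include <slepceps.h>

int main(int argc, char **argv)
{
  Mat         A;
  EPS         eps;
  MPI_Comm    subcomm;
  PetscMPIInt rank;
  int         M = 4;                    /* number of process groups (placeholder) */
  PetscInt    i, Istart, Iend, n = 100; /* per-group problem size (placeholder) */

  SlepcInitialize(&argc, &argv, NULL, NULL);  /* step 1: also initializes PETSc */
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

  /* step 3a: divide PETSC_COMM_WORLD into M groups; ranks that pass
     the same color end up in the same subcommunicator */
  MPI_Comm_split(PETSC_COMM_WORLD, rank % M, 0, &subcomm);

  /* step 3b: build this group's matrix ON THE SUBCOMMUNICATOR
     (a 1-D Laplacian stands in for the real matrix from step 2) */
  MatCreate(subcomm, &A);
  MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);
  MatSetFromOptions(A);
  MatSetUp(A);
  MatGetOwnershipRange(A, &Istart, &Iend);
  for (i = Istart; i < Iend; i++) {
    if (i > 0)     MatSetValue(A, i, i - 1, -1.0, INSERT_VALUES);
    if (i < n - 1) MatSetValue(A, i, i + 1, -1.0, INSERT_VALUES);
    MatSetValue(A, i, i, 2.0, INSERT_VALUES);
  }
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

  /* step 3c: each group solves its own eigenproblem independently */
  EPSCreate(subcomm, &eps);
  EPSSetOperators(eps, A, NULL);
  EPSSetProblemType(eps, EPS_HEP);
  EPSSetFromOptions(eps);
  EPSSolve(eps);

  /* steps 2 and 4 would use objects created on PETSC_COMM_WORLD;
     moving data between the two levels needs plain MPI calls */

  EPSDestroy(&eps);
  MatDestroy(&A);
  MPI_Comm_free(&subcomm);
  SlepcFinalize();
  return 0;
}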
Regards,
Yujie

On Thu, Sep 18, 2008 at 5:44 AM, Lisandro Dalcin <dalcinl@gmail.com> wrote:

On Wed, Sep 17, 2008 at 9:22 PM, Yujie <recrusader@gmail.com> wrote:
> Thank you very much, Lisandro. You are right. It looks a little
> difficult to "transfer" data from one node to N nodes or from N nodes
> to M nodes. My method is to first send all the data to one node and
> then redistribute it over the N or M nodes. Do you have any idea about
> this? Is it time-consuming? How does PETSc support such operations?
> Thanks a lot.
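For what it's worth, the "collect everything on one node" half of this can be
done inside a single communicator with PETSc's VecScatterCreateToZero; the
redistribution across different communicators is the part that needs raw MPI,
as discussed below. A minimal sketch, assuming vin is an existing distributed
Vec (error checking omitted):

#include <petscvec.h>

/* Gather every entry of a parallel Vec into a sequential Vec on rank 0
   of vin's communicator. */
void collect_on_rank0(Vec vin)
{
  VecScatter ctx;
  Vec        vall;  /* sequential; full length on rank 0, length 0 elsewhere */

  VecScatterCreateToZero(vin, &ctx, &vall);
  VecScatterBegin(ctx, vin, vall, INSERT_VALUES, SCATTER_FORWARD);
  VecScatterEnd(ctx, vin, vall, INSERT_VALUES, SCATTER_FORWARD);
  /* rank 0 now holds all of vin in vall and could forward it with MPI */
  VecScatterDestroy(&ctx);
  VecDestroy(&vall);
}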

Mmm... I believe there is no way to do that with PETSc alone. You just
have to make the MPI calls yourself. Perhaps if you can give me a bit
more detail about your communication patterns, I can give you a better
suggestion.
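For example, one such raw MPI call could ship the locally owned part of a Vec
from one world rank to another; src, dst, and the tag below are made up, and
both ranks are assumed to own the same local length (a sketch, error checking
omitted):

#include <petscvec.h>

/* Send the locally owned part of v from world rank src to world rank dst.
   Assumes both ranks own the same number of entries. */
void transfer_local_part(Vec v, int src, int dst)
{
  int          rank;
  PetscInt     nlocal;
  PetscScalar *a;

  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
  VecGetLocalSize(v, &nlocal);
  VecGetArray(v, &a);
  if (rank == src)
    MPI_Send(a, (int)nlocal, MPIU_SCALAR, dst, 0, PETSC_COMM_WORLD);
  else if (rank == dst)
    MPI_Recv(a, (int)nlocal, MPIU_SCALAR, src, 0, PETSC_COMM_WORLD,
             MPI_STATUS_IGNORE);
  VecRestoreArray(v, &a);
}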
>
> Regards,
>
> Yujie
>
> On Wed, Sep 17, 2008 at 5:05 PM, Lisandro Dalcin <dalcinl@gmail.com> wrote:
>>
>> As long as you create your SLEPc objects with the appropriate
>> communicator (i.e. the one obtained with MPI_Comm_split), then all
>> should just work. Of course, you will have to make appropriate MPI
>> calls to 'transfer' data from your N group to the many M groups, and
>> the other way around to collect results.
>>
>> On Wed, Sep 17, 2008 at 8:25 PM, Yujie <recrusader@gmail.com> wrote:
>> > You are right :). I am thinking through the whole framework for my
>> > codes. Thank you, Lisandro. In step 3 there are M different SLEPc-based
>> > process groups, which should mean M communication domains for PETSc and
>> > SLEPc (I have created a communication domain for them). Is that OK?
>> > Thanks again.
>> >
>> > Regards,
>> >
>> > Yujie
>> >
>> > On Wed, Sep 17, 2008 at 4:08 PM, Lisandro Dalcin <dalcinl@gmail.com>
>> > wrote:
>> >>
>> >> I bet you have not even tried to actually implement and run this :-).
>> >>
>> >> This should work. If not, I would consider that a bug. Let us know of
>> >> any problems you have.
>> >>
>> >> On Wed, Sep 17, 2008 at 7:59 PM, Yujie <recrusader@gmail.com> wrote:
>> >> > Hi, PETSc developers:
>> >> >
>> >> > Currently, I am using SLEPc for my application. It is based on PETSc.
>> >> >
>> >> > Assume I have a cluster with N nodes.
>> >> >
>> >> > My codes are like:
>> >> >
>> >> > main()
>> >> > {
>> >> >     step 1: Initialize PETSc and SLEPc;
>> >> >     step 2: Use PETSc; (use all N nodes in one process group)
>> >> >     step 3: Use SLEPc; (the N nodes are divided into M process
>> >> >             groups; these groups are independent, but they need to
>> >> >             communicate with each other)
>> >> >     step 4: Use PETSc; (use all N nodes in one process group)
>> >> > }
>> >> >
>> >> > My method is:
>> >> >
>> >> > When using SLEPc, MPI_Comm_split() is used to divide the N nodes into
>> >> > M process groups, i.e. to generate M communication domains. Then,
>> >> > MPI_Intercomm_create() creates an inter-group communication domain to
>> >> > handle the communication between the M process groups (see the sketch
>> >> > below).
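A minimal sketch of this split-plus-intercommunicator setup (M = 4, the
color = rank % M assignment, the pairing of groups 0 and 1, and tag 99 are
all made-up choices for illustration):

#include <mpi.h>

int main(int argc, char **argv)
{
  MPI_Comm subcomm, intercomm = MPI_COMM_NULL;
  int      wrank, color, M = 4;

  MPI_Init(&argc, &argv);
  MPI_Comm_rank(MPI_COMM_WORLD, &wrank);
  color = wrank % M;

  /* M independent intra-communicators, one per group */
  MPI_Comm_split(MPI_COMM_WORLD, color, 0, &subcomm);

  /* Connect group 0 and group 1.  Each group's local leader is rank 0
     of its subcomm; with this coloring that is world rank 0 for group 0
     and world rank 1 for group 1, which is what remote_leader means. */
  if (color == 0)
    MPI_Intercomm_create(subcomm, 0, MPI_COMM_WORLD, 1, 99, &intercomm);
  else if (color == 1)
    MPI_Intercomm_create(subcomm, 0, MPI_COMM_WORLD, 0, 99, &intercomm);

  /* ... inter-group sends/receives go through intercomm ... */

  if (intercomm != MPI_COMM_NULL) MPI_Comm_free(&intercomm);
  MPI_Comm_free(&subcomm);
  MPI_Finalize();
  return 0;
}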
>> >> >
>> >> > I don't know whether this method is OK with regard to PETSc and
>> >> > SLEPc, because SLEPc is developed on top of PETSc. In step 1, PETSc
>> >> > and SLEPc are initialized with all N nodes in one communication
>> >> > domain. PETSc in step 2 uses this communication domain. However, in
>> >> > step 3, I need to divide all the N nodes and generate M communication
>> >> > domains. I don't know how PETSc and SLEPc handle this change. If the
>> >> > method doesn't work, could you give me some advice? Thanks a lot.
>> >> >
>> >> > Regards,
>> >> >
>> >> > Yujie

--
Lisandro Dalcín
---------------
Centro Internacional de Métodos Computacionales en Ingeniería (CIMEC)
Instituto de Desarrollo Tecnológico para la Industria Química (INTEC)
Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET)
PTLC - Güemes 3450, (3000) Santa Fe, Argentina
Tel/Fax: +54-(0)342-451.1594