Thank you, Jed. That was indeed the problem: I had installed a separate MPI for PETSc/SLEPc, but was running my program with the default, already-installed one.<br><br>Now I have a different question. What I want to do is this:<br>
<br>1. Only one process, say root, calculates the matrix in SeqAIJ format.<br>2. Root then creates the EPS context, initializes it, and sets the parameters, problem type, etc.<br>3. Root broadcasts this EPS object to the other processes.<br>
4. I call EPSSolve so that all processes solve for the eigenvalues together in cooperation, distributing the memory among them.<br>5. I collect the results on root.<br><br>Is this possible? I am not able to broadcast the EPS object, because it is not an MPI datatype. Is there a PETSc/SLEPc function for this? I am avoiding MPIAIJ because that would mean many changes in the existing code, including the numerous write(*,*) statements (I would have to convert them to PetscPrintf in Fortran, or something like that). <br>
So I want a single process to handle matrix generation and assembly, but I want the eigenproblem solved in parallel by different processes. Running EPSSolve in parallel, and thereby distributing memory, is the only reason I want to use MPI.<br>
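To make the intended flow concrete, here is a rough C sketch (my real code is Fortran; the dimension n and the nonzero estimate are placeholders, and error checking is omitted). It shows what I can do now, everything on root, and marks where steps 3–5 would have to go:<br>
<pre>
#include &lt;slepceps.h&gt;

int main(int argc, char **argv)
{
  Mat         A;
  EPS         eps;
  PetscMPIInt rank;
  PetscInt    n = 100;   /* placeholder matrix dimension */

  SlepcInitialize(&amp;argc, &amp;argv, NULL, NULL);
  MPI_Comm_rank(PETSC_COMM_WORLD, &amp;rank);

  if (rank == 0) {
    /* Step 1: only root builds the matrix, in SeqAIJ format */
    MatCreateSeqAIJ(PETSC_COMM_SELF, n, n, 5, NULL, &amp;A);
    /* ... MatSetValues() calls from the existing serial code ... */
    MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
    MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

    /* Step 2: root creates and configures the EPS context */
    EPSCreate(PETSC_COMM_SELF, &amp;eps);
    EPSSetOperators(eps, A, NULL);
    EPSSetProblemType(eps, EPS_HEP);
    EPSSetFromOptions(eps);

    /* Steps 3-5 are what I do not know how to do: there is no
       MPI_Bcast for an EPS object, so for now root solves alone
       and the other ranks sit idle (no memory distribution). */
    EPSSolve(eps);
    /* root would gather results via EPSGetConverged()/EPSGetEigenpair() */

    EPSDestroy(&amp;eps);
    MatDestroy(&amp;A);
  }

  SlepcFinalize();
  return 0;
}
</pre>
What I would like is for the EPSSolve call to execute collectively on all ranks, with the workspace distributed, while the assembly above stays serial.<br>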
<br>Thanks a lot!<br><br>Shitij<br><br><div class="gmail_quote">On 8 August 2011 11:05, Jed Brown <span dir="ltr"><<a href="mailto:jedbrown@mcs.anl.gov">jedbrown@mcs.anl.gov</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;">
<div class="im"><div class="gmail_quote">On Mon, Aug 8, 2011 at 00:29, Shitij Bhargava <span dir="ltr"><<a href="mailto:shitij.cse@gmail.com" target="_blank">shitij.cse@gmail.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<span style="color:rgb(102, 102, 102)"><font color="#000000">I</font></span><i><span style="color:rgb(102, 102, 102)"><font color="#000000"> </font></span></i><span style="color:rgb(102, 102, 102)"><font color="#000000">ran it with:<br>
<br>mpirun -np 2 ./slepcEigenMPI -eps_monitor<br><br>I didnt do exactly what you said, because the matrix generation part in the actual program is quite time consuming itself. But I assume what I am doing is equivalent to what you meant to do? Also, I put MPD as PETSC_DECIDE, because I didnt know what to put it for this matrix dimension.<br>
<br>This is the output I get: (part of the output)<br><i style="color:rgb(102, 102, 102)">MATRIX ASSMEBLY DONE !!!!!!!! <br><br>MATRIX ASSMEBLY DONE !!!!!!!! <br><br> 1 EPS nconv=98 first unconverged value (error) 1490.88 (1.73958730e-05)<br>
1 EPS nconv=98 first unconverged value (error) 1490.88 (1.73958730e-05)<br> 2 EPS nconv=282 first unconverged value (error) 3.04636e-27 (2.49532175e-04)<br> 2 EPS nconv=282 first unconverged value (error) 3.04636e-27 (2.49532175e-04)</i></font></span></blockquote>
</div><br></div><div>The most likely case is that you have more than one MPI implementation installed and that you are running with a different implementation than you built with. Compare the outputs:</div><div><br></div>
<div>$ ldd ./slepcEigenMPI</div>
<div>$ which mpirun</div>
</blockquote></div><br>