Thomas:

Does this occur only for large matrices?
Can you dump your matrices into petsc binary files
(e.g., A.dat, B.dat) and send to us for debugging?
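In case it is useful, here is a minimal sketch of how such a dump could be written (just for illustration; it assumes the matrices are called A and B in your code):

#include <petscmat.h>

/* Illustrative helper: write a matrix to a PETSc binary file so that it
   can later be read back with MatLoad(). */
static PetscErrorCode DumpMatrix(Mat M,const char filename[])
{
  MPI_Comm       comm;
  PetscViewer    viewer;
  PetscErrorCode ierr;

  ierr = PetscObjectGetComm((PetscObject)M,&comm);CHKERRQ(ierr);
  ierr = PetscViewerBinaryOpen(comm,filename,FILE_MODE_WRITE,&viewer);CHKERRQ(ierr);
  ierr = MatView(M,viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
  return 0;
}

/* For example, just before the failing MatTransposeMatMult() call:
     DumpMatrix(A,"A.dat");
     DumpMatrix(B,"B.dat");                                            */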
Lately, we added a new implementation of MatTransposeMatMult() in petsc-dev,
which has been shown to be much faster than the released MatTransposeMatMult().
You might give it a try:
1. install petsc-dev (see http://www.mcs.anl.gov/petsc/developers/index.html)
2. run your code with the option '-mattransposematmult_viamatmatmult 1'
   (a small driver that exercises this path is sketched below)
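For reference, a small stand-alone driver along the following lines (only a sketch; it assumes the matrices were dumped to A.dat and B.dat as above, and the option is recognized only by petsc-dev) would let you, or us, reproduce the problem and compare the two implementations:

#include <petscmat.h>

static char help[] = "Loads A and B from binary files and forms C = A^T * B.\n";

/* Example runs (adjust the number of processes):
     mpiexec -n 8 ./driver
     mpiexec -n 8 ./driver -mattransposematmult_viamatmatmult 1   (petsc-dev) */
int main(int argc,char **argv)
{
  Mat            A,B,C;
  PetscViewer    viewer;
  PetscErrorCode ierr;

  PetscInitialize(&argc,&argv,(char*)0,help);

  /* Load A */
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"A.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr);
  ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
  ierr = MatSetType(A,MATAIJ);CHKERRQ(ierr);
  ierr = MatLoad(A,viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

  /* Load B */
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"B.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr);
  ierr = MatCreate(PETSC_COMM_WORLD,&B);CHKERRQ(ierr);
  ierr = MatSetType(B,MATAIJ);CHKERRQ(ierr);
  ierr = MatLoad(B,viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

  /* The call that fails for the large matrices */
  ierr = MatTransposeMatMult(A,B,MAT_INITIAL_MATRIX,PETSC_DEFAULT,&C);CHKERRQ(ierr);

  ierr = MatDestroy(&C);CHKERRQ(ierr);
  ierr = MatDestroy(&B);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}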
Let us know what you get.

Hong

> My code makes use of the function MatTransposeMatMult, and usually it works fine! For some larger input data, it now stops with a lot of MPI errors:
>
> fatal error in PMPI_Barrier: Other MPI error, error stack:
> PMPI_Barrier(476)..: MPI_Barrier(comm=0x84000001) failed
> MPIR_Barrier(82)...:
> MPI_Waitall(261): MPI_Waitall(count=9, req_array=0xa787ba0, status_array=0xa789240) failed
> MPI_Waitall(113): The supplied request in array element 8 was invalid (kind=0)
> Fatal error in PMPI_Barrier: Other MPI error, error stack:
> PMPI_Barrier(476)..: MPI_Barrier(comm=0x84000001) failed
> MPIR_Barrier(82)...:
> mpid_irecv_done(98): read from socket failed - request state:recv(pde)done
>
> Here is the stack print from the debugger:
>
> 6, MatTransposeMatMult (matrix.c:8907)
> 6, MatTransposeMatMult_MPIAIJ_MPIAIJ (mpimatmatmult.c:809)
> 6, MatTransposeMatMultSymbolic_MPIAIJ_MPIAIJ (mpimatmatmult.c:1136)
> 6, PetscGatherMessageLengths2 (mpimesg.c:213)
> 6, PMPI_Waitall
> 6, MPIR_Err_return_comm
> 6, MPID_Abort
>
> I use PETSc 3.3-p3. Any idea whether this is, or could be, related to some bug in PETSc, or whether I am using the function incorrectly in some way?
>
> Thomas