Hi,

I was wondering if anyone has experience with the new AIJPERM and AIJCRL matrix formats in PETSc. I have a block AIJ matrix (block size 5) and tried converting it to either of these types; the conversion just thrashed memory for 10 minutes before I killed it (matrix assembly takes 13 seconds, for reference). I then tried to construct the matrix directly with MatCreateMPIAIJPERM or MatCreateMPIAIJCRL, but neither of these functions is wrapped in Fortran. Also, the documentation for MatCreateMPIAIJCRL is extremely confusing: it says that comm is set to PETSC_COMM_SELF, which defeats the purpose of an MPI matrix.
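For concreteness, a stripped-down version of what I ran is below. It uses a toy diagonal matrix as a stand-in for my real one, and I spell the type as the literal string 'mpiaijperm' in case the MATMPIAIJPERM constant is missing from the Fortran includes:

      program permtest
      implicit none
#include "finclude/petscsys.h"
#include "finclude/petscvec.h"
#include "finclude/petscmat.h"
      Mat            A, Aperm
      PetscErrorCode ierr
      PetscInt       i, Istart, Iend, n, one, bs, dnz, onz
      PetscScalar    v

      call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
      n   = 1000
      one = 1
      bs  = 5
      dnz = 1
      onz = 0
!     Toy stand-in for the real matrix: MPIBAIJ, block size 5,
!     diagonal entries only
      call MatCreate(PETSC_COMM_WORLD, A, ierr)
      call MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n, ierr)
      call MatSetType(A, MATMPIBAIJ, ierr)
      call MatMPIBAIJSetPreallocation(A, bs, dnz, PETSC_NULL_INTEGER,
     &                                onz, PETSC_NULL_INTEGER, ierr)
      call MatGetOwnershipRange(A, Istart, Iend, ierr)
      v = 1.0
      do i = Istart, Iend - 1
         call MatSetValues(A, one, i, one, i, v, INSERT_VALUES, ierr)
      end do
      call MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY, ierr)
      call MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY, ierr)
!     This is the conversion that thrashed memory on my real matrix:
      call MatConvert(A, 'mpiaijperm', MAT_INITIAL_MATRIX, Aperm, ierr)
      call MatDestroy(Aperm, ierr)
      call MatDestroy(A, ierr)
      call PetscFinalize(ierr)
      end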
Therefore, two questions:
1. Have any users seen a performance benefit from either of these formats (in parallel)?
2. How does one actually create such matrices efficiently from Fortran? (A guess at what this might look like is sketched below.)
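For question 2, the best I can come up with from Fortran is the following: create the matrix, set the type string before preallocation, and then assemble as usual. This is untested; it assumes 'mpiaijcrl' is accepted by MatSetType from Fortran and that MatMPIAIJSetPreallocation still applies to the CRL subtype. Is this the intended route, or is there something better?

      program crltest
      implicit none
#include "finclude/petscsys.h"
#include "finclude/petscvec.h"
#include "finclude/petscmat.h"
      Mat            A
      PetscErrorCode ierr
      PetscInt       n, dnz, onz

      call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
      n   = 1000      ! placeholder global size
      dnz = 25        ! placeholder preallocation counts
      onz = 10
!     Untested guess: set the CRL type up front, then preallocate
!     and assemble exactly as for a plain MPIAIJ matrix
      call MatCreate(PETSC_COMM_WORLD, A, ierr)
      call MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n, ierr)
      call MatSetType(A, 'mpiaijcrl', ierr)
      call MatMPIAIJSetPreallocation(A, dnz, PETSC_NULL_INTEGER,
     &                               onz, PETSC_NULL_INTEGER, ierr)
!     ... MatSetValues / MatAssemblyBegin / MatAssemblyEnd as usual ...
      call MatDestroy(A, ierr)
      call PetscFinalize(ierr)
      end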
On a side note, would it be possible to indicate on the manual pages which functions are not wrapped in Fortran? That would make it easier to know which ones you can and cannot use, instead of finding out from an undefined-symbol error at link time.
Thanks,
Gaetan