<html><head><meta http-equiv="Content-Type" content="text/html; charset=us-ascii"></head><body style="word-wrap: break-word; -webkit-nbsp-mode: space; line-break: after-white-space;" class=""><div class=""><br class=""></div>MatCreateMPIAIJWithArrays() and MatUpdateMPIAIJWithArrays() may be suitable for your use case.<div class=""><br class=""></div><div class="">This should also be much more efficient in moving the matrix from your code to PETSc's format.<br class=""><div class=""><br class=""></div><div class=""> Barry</div><div class=""><br class=""><div><br class=""><blockquote type="cite" class=""><div class="">On Feb 15, 2022, at 4:13 PM, Bojan Niceno <<a href="mailto:bojan.niceno.scientist@gmail.com" class="">bojan.niceno.scientist@gmail.com</a>> wrote:</div><br class="Apple-interchange-newline"><div class=""><div dir="ltr" class="">Dear PETSc users,<div class=""><br class=""></div><div class=""><br class=""></div><div class="">I have an in-house computational fluid dynamics (CFD) solver, written in Fortran 2008, parallelized with MPI, with its own home-grown suite of linear solvers. The code uses unstructured grids and performs domain decomposition with METIS, and all communication buffers (that is, the connectivity between processors) have been properly worked out.</div><div class=""><br class=""></div><div class="">A couple of weeks back, I decided to try out the PETSc suite of solvers. 
After some initial setbacks, I managed to compile my code with PETSc and have the sequential version working fine :-)</div><div class=""><br class=""></div><div class="">I have essentially been using the following PETSc routines to get the code to solve linear systems with PETSc: </div><div class=""><br class=""></div><div class=""><i class=""> </i>I set up the working space as follows:</div><div class=""><br class=""></div><div class=""><font face="monospace" class=""> call PetscInitialize(PETSC_NULL_CHARACTER, Pet % petsc_err)<br class=""></font></div><div class=""><font face="monospace" class=""> call MatCreateSeqAIJ(PETSC_COMM_SELF, ...<br class=""></font></div><div class=""><font face="monospace" class=""> </font><font face="monospace" color="#ff0000" class="">call MatSeqAIJSetColumnIndices(....</font></div><div class=""><font face="monospace" class=""> call VecCreateSeq(PETSC_COMM_SELF, ... ! Create Vector x</font></div><div class=""><font face="monospace" class=""> call VecCreateSeq(PETSC_COMM_SELF, ... ! Create Vector b</font></div><div class=""><font face="monospace" class=""> call KSPCreate(PETSC_COMM_SELF, ...</font></div><div class=""><font face="monospace" class=""><br class=""></font></div><div class=""><font face="arial, sans-serif" class="">Then, in order to solve a system, I do:</font></div><div class=""><font face="arial, sans-serif" class=""><br class=""></font></div><div class=""><font face="monospace" class=""> call MatSetValue(Pet % petsc_A, ! Inside a loop through matrix entries<br class=""> i-PETSC_ONE, <br class=""> k-PETSC_ONE, ...</font></div><font face="monospace" class=""> call MatAssemblyBegin(Pet % petsc_A, MAT_FINAL_ASSEMBLY, Pet % petsc_err)<br class=""> call MatAssemblyEnd (Pet % petsc_A, MAT_FINAL_ASSEMBLY, Pet % petsc_err)<br class=""></font><div class=""><font face="monospace" class=""> <br class=""></font></div><div class=""><font face="monospace" class=""> call VecSetValue(Pet % petsc_x, ... ! 
Fill up x</font></div><div class=""><div class=""><font face="monospace" class=""> call VecSetValue(Pet % petsc_b, ... ! Fill up b</font></div><font face="monospace" class=""><div class=""><font face="monospace" class=""><br class=""></font></div> call KSPSetType(Pet % petsc_ksp, ... ! Set solver<br class=""></font></div><div class=""><font face="monospace" class=""> call KSPGetPC(Pet % petsc_ksp, ... ! Get preconditioner context</font></div><font face="monospace" class=""> call PCSetType(Pet % petsc_pc, ... ! Set preconditioner</font><div class=""><font face="monospace" class=""><br class=""> call KSPSetFromOptions(Pet % petsc_ksp, Pet % petsc_err) <br class=""> call KSPSetUp (Pet % petsc_ksp, Pet % petsc_err) <br class=""></font><div class=""><font face="monospace" class=""> </font></div><div class=""><font face="monospace" class=""> ! Finally solve</font></div><div class=""><font face="monospace" class=""> call KSPSolve(Pet % petsc_ksp, ...</font></div><div class=""><font face="monospace" class=""><br class=""></font></div><div class=""><font face="arial, sans-serif" class="">Once this was up and running, I thought that, in order to have the parallel version, I would merely have to replace the "Seq" versions of the above functions with their parallel counterparts. I was expecting to find a parallel counterpart of the function in red (</font><font face="monospace" class="">MatSeqAIJSetColumnIndices</font><font face="arial, sans-serif" class="">), but it doesn't seem to exist. 
I have found non-Seq versions of some other functions (</font><font face="monospace" class="">MatCreateAIJ</font><font face="arial, sans-serif" class="">, </font><font face="monospace" class="">VecCreateMPI</font><font face="arial, sans-serif" class="">), but not something like </font><font face="monospace" class="">MatAIJSetColumnIndices</font><font face="arial, sans-serif" class="">, which surprised me a bit, because I have this information in my code.</font></div><div class=""><span style="font-family:arial,sans-serif" class=""> </span></div><div class=""><div class=""><font face="arial, sans-serif" class="">Is there a parallel counterpart of this function, and if there is none, what should it be replaced with? I understand that I will have to provide the numbers of non-zeros in the buffer (off-diagonal) parts of the rows (o_nnz), which is not a big issue, but how to provide the column information for the parallel version is not clear to me. In a nutshell, I would need a hint on which of the above functions can remain the same in parallel, and which should be replaced, and with what?</font></div><div class=""><br class=""></div></div></div><div class=""> Cheers,</div><div class=""><br class=""></div><div class=""> Bojan</div><div class=""><br class=""></div></div>
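[Editor's note] To make the o_nnz part of the question concrete: for MatCreateAIJ (or MatMPIAIJSetPreallocation), each rank supplies, per locally owned row, the number of non-zeros whose global column index falls inside that rank's own diagonal block (d_nnz) versus outside it (o_nnz). A minimal sketch of that counting logic, written in plain Python rather than the Fortran/PETSc code above (the function name and the example CSR arrays are hypothetical, for illustration only):

```python
# Sketch: derive d_nnz / o_nnz preallocation counts from a rank's local
# CSR structure. rstart/rend delimit the global columns of the rank's
# diagonal block, i.e. the columns this rank owns.

def count_diag_offdiag_nnz(row_ptr, col_idx, rstart, rend):
    """For each locally owned row, count non-zeros whose global column lies
    in [rstart, rend) (diagonal block) versus outside it (off-diagonal)."""
    d_nnz, o_nnz = [], []
    for r in range(len(row_ptr) - 1):
        cols = col_idx[row_ptr[r]:row_ptr[r + 1]]
        d = sum(1 for c in cols if rstart <= c < rend)
        d_nnz.append(d)
        o_nnz.append(len(cols) - d)
    return d_nnz, o_nnz

# Example: a rank owning global rows/columns 2..4 (rstart=2, rend=5),
# holding 3 local rows in CSR form with global column indices.
row_ptr = [0, 3, 5, 8]                  # CSR row pointers (local rows)
col_idx = [0, 2, 3, 3, 6, 1, 4, 5]      # global column indices
d_nnz, o_nnz = count_diag_offdiag_nnz(row_ptr, col_idx, 2, 5)
print(d_nnz, o_nnz)  # → [2, 1, 1] [1, 1, 2]
```

The same local CSR triplet (row pointers, global column indices, values) is what MatCreateMPIAIJWithArrays consumes directly, which is why the suggestion at the top of this thread also avoids the per-entry MatSetValue loop.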
</div></blockquote></div><br class=""></div></div></body></html>