craig-tanis at utc.edu
Mon Nov 23 15:12:02 CST 2009
I have an existing MPI code that builds a linear system corresponding to an unstructured mesh. I'm hoping I can adapt the code to use PETSc, but I'm not sure my domain-decomposition scheme is compatible.
The big problem seems to be that my domains are not guaranteed to have contiguous global node ids. How can I explicitly specify which processor owns which node/vector element (for the purposes of ghost-node synchronization)?
Thanks for your help,