[petsc-users] MATNEST with shell matrices

Jed Brown jedbrown at mcs.anl.gov
Thu Feb 21 17:06:48 CST 2013


On Thu, Feb 21, 2013 at 4:19 PM, Amneet Bhalla <mail2amneet at gmail.com> wrote:

> Can you guys comment on what example case would be best to start from for
> shell operators with FieldSplit? The examples I am looking into all start
> with creating native PETSc matrices and vectors.
>


Amneet, do you already have code that applies all the "blocks" of your
coupled system or are you starting something new? If it's something new,
please don't use MatNest/MatShell just yet. If you have tested existing
code, then wrapping it in MatShell/MatNest is fine. If you are working on
something new, I recommend the progression below. It will encourage better
program structure and better debuggability, and will ultimately be faster
than trying to "skip" steps.

Step 1: Just write a residual and use -snes_mf to solve all matrix-free
without preconditioning.
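As a concrete sketch of step 1, an existing SNES-based code that only provides a residual callback can be run fully matrix-free from the command line. The option names below are standard PETSc options; the executable name is a placeholder for your own program:

```shell
# Step 1: matrix-free Newton-Krylov with no preconditioner.
# ./mysolver stands in for your own SNES-based executable.
./mysolver -snes_mf -pc_type none -snes_monitor -ksp_monitor
```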

Step 2: Assemble an approximate Jacobian and use -snes_mf_operator.

Step 3: Use finite-difference coloring (if possible) to see how much solver
benefit you could gain by implementing the full Jacobian. Also use FieldSplit
solvers to find out which blocks of the Jacobian are most important to
assemble.
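Step 3 can likewise be driven from the command line. The options below are standard PETSc options (coloring requires a DM or a user-supplied coloring); the executable name and the choice of a Schur split are illustrative assumptions:

```shell
# Step 3: finite-difference Jacobian via coloring, plus a split solve
# to gauge which Jacobian blocks matter most for preconditioning.
./mysolver -snes_fd_color \
    -pc_type fieldsplit -pc_fieldsplit_type schur \
    -fieldsplit_0_ksp_monitor -fieldsplit_1_ksp_monitor
```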

Step 4: Implement those blocks of the Jacobian that you need for effective
preconditioning.

Step 5: Profile, then consider which preconditioners are most effective and
whether the discretization is suitable for matrix-free application. If you
spend a lot of time/memory in things like MatGetSubMatrix, then add an
option to use MatNest. If you have overly heavy matrices (in terms of
memory, bandwidth, or assembly time) that need not be completely assembled
for effective preconditioning, add an option to use MatShell for those
parts.
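For step 5, wrapping an existing operator-apply routine in a MatShell looks roughly like the following. This is a minimal sketch: MyContext, MyMatMult, and CreateShell are hypothetical names standing in for your own code; MatCreateShell and MatShellSetOperation are the PETSc calls involved.

```c
#include <petscmat.h>

/* Hypothetical user context holding whatever your apply routine needs. */
typedef struct {
  /* ... application data ... */
} MyContext;

/* Matrix-vector product callback: computes y = A*x using the context. */
static PetscErrorCode MyMatMult(Mat A, Vec x, Vec y)
{
  MyContext      *ctx;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = MatShellGetContext(A, (void**)&ctx);CHKERRQ(ierr);
  /* ... apply your existing operator to x, storing the result in y ... */
  PetscFunctionReturn(0);
}

/* Create the shell matrix; m,n are local sizes, M,N are global sizes. */
static PetscErrorCode CreateShell(MPI_Comm comm, PetscInt m, PetscInt n,
                                  PetscInt M, PetscInt N,
                                  MyContext *ctx, Mat *A)
{
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = MatCreateShell(comm, m, n, M, N, ctx, A);CHKERRQ(ierr);
  ierr = MatShellSetOperation(*A, MATOP_MULT, (void (*)(void))MyMatMult);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}
```

Shell matrices built this way can then be placed alongside assembled blocks inside a MatNest, so only the heavy blocks stay matrix-free.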