[petsc-users] MATNEST with shell matrices
Amneet Bhalla
mail2amneet at gmail.com
Thu Feb 21 17:16:14 CST 2013
Jed, the thing is that I have non-native PETSc data to work with. It comes
from the SAMRAI library. We have already defined MatShells for the
individual operators that act on non-native vectors, so writing explicit
matrices in PETSc would be very difficult (at least for me) and
time-consuming. What we want is PETSc's algorithms (and NOT its data
structures) to solve the problem.
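For reference, wrapping an existing apply routine in a MatShell looks roughly like the sketch below. The names (MyMatMult, AppCtx, CreateShellOperator) are illustrative placeholders, not from this thread, and the actual operator application is elided:

```c
#include <petscmat.h>

typedef struct {
  void *app_data; /* e.g. a pointer to SAMRAI-side state (placeholder) */
} AppCtx;

/* Hypothetical routine applying the operator: y = A x */
static PetscErrorCode MyMatMult(Mat A, Vec x, Vec y)
{
  AppCtx        *ctx;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = MatShellGetContext(A, (void **)&ctx);CHKERRQ(ierr);
  /* ... apply the operator to x using ctx->app_data, writing into y ... */
  PetscFunctionReturn(0);
}

/* During setup: create the shell matrix and register the multiply */
static PetscErrorCode CreateShellOperator(MPI_Comm comm, PetscInt m,
                                          PetscInt n, AppCtx *ctx, Mat *A)
{
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = MatCreateShell(comm, m, n, PETSC_DETERMINE, PETSC_DETERMINE,
                        ctx, A);CHKERRQ(ierr);
  ierr = MatShellSetOperation(*A, MATOP_MULT,
                              (void (*)(void))MyMatMult);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}
```

Any operation the solver needs beyond MatMult (e.g. MATOP_MULT_TRANSPOSE, MATOP_GET_DIAGONAL) must be registered the same way.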
On Thu, Feb 21, 2013 at 5:06 PM, Jed Brown <jedbrown at mcs.anl.gov> wrote:
>
> On Thu, Feb 21, 2013 at 4:19 PM, Amneet Bhalla <mail2amneet at gmail.com>wrote:
>
>> Can you guys comment on what example case would be best to start off for
>> shell
>> operators with FieldSplit? The examples I am looking into all start with
>> creating native
>> PETSc matrices and vectors.
>>
>
>
> Amneet, do you already have code that applies all the "blocks" of your
> coupled system or are you starting something new? If it's something new,
> please don't use MatNest/MatShell just yet. If you have tested existing
> code, then wrapping it in MatShell/MatNest is fine. If you are working on
> something new, I recommend the progression below. It will encourage better
> program structure and better debuggability, and will ultimately be faster
> than trying to "skip" steps.
>
> Step 1: Just write a residual and use -snes_mf to solve all matrix-free
> without preconditioning.
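Concretely, Step 1 needs only a residual registered with SNESSetFunction plus runtime options; ./app below is a placeholder for your executable:

```
./app -snes_mf -snes_monitor -ksp_monitor
```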
>
> Step 2: Assemble an approximate Jacobian and use -snes_mf_operator
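In Step 2 the approximate Jacobian set via SNESSetJacobian is used only to build the preconditioner, while the true Jacobian is applied matrix-free:

```
./app -snes_mf_operator -snes_monitor
```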
>
> Step 3: Use fd_coloring (if possible) to see how much solver benefit you
> could gain by implementing the full Jacobian. Also use fieldsplit solvers
> to find out which blocks of the Jacobian are most important to assemble.
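The fieldsplit experiments in Step 3 can be driven entirely from the command line, e.g. as below; the splits themselves must first be defined in code (PCFieldSplitSetIS) or through a DM, and the monitor options shown are just one way to see which block dominates:

```
./app -pc_type fieldsplit -pc_fieldsplit_type additive \
      -fieldsplit_0_ksp_monitor -fieldsplit_1_ksp_monitor
```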
>
> Step 4: Implement those blocks of the Jacobian that you need for effective
> preconditioning.
>
> Step 5: Profile, consider those preconditioners that are most effective
> and the suitability of the discretization for matrix-free application. If
> you spend a lot of time/memory in things like MatGetSubMatrix, then add an
> option to use MatNest. If you have overly heavy matrices (in terms of
> memory, bandwidth, or assembly time) that need not be completely assembled
> for effective preconditioning, add an option to use MatShell for those
> parts.
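The profiling in Step 5 is mostly a matter of running with logging enabled and reading where time and memory go:

```
./app -log_summary
```

(-log_summary was the option name at the time of this thread; recent PETSc versions call it -log_view.)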
>
>
>
--
Amneet