[petsc-users] Allocating memory for off-diagonal matrix blocks when using DMComposite

Dave May dave.mayhem23 at gmail.com
Tue Apr 26 16:00:45 CDT 2016

On 26 April 2016 at 16:50, Gautam Bisht <gbisht at lbl.gov> wrote:

> I want to follow up on this old thread. If a user knows the exact fill
> pattern of the off-diagonal block (i.e., d_nz/o_nz or d_nnz/o_nnz), can
> one still not preallocate memory for the off-diagonal matrix blocks when
> using DMComposite?

You are always free to override the matrix-creation method on the DM
with your own custom code that creates and preallocates the matrix.

You will need to add
  #include <petsc/private/dmimpl.h>
into your source file to expose the contents of the DM object.
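A minimal sketch of what that override could look like, assuming the off-diagonal fill pattern is known. The function name MyCreateMatrix and the nnz arrays are hypothetical placeholders; the real local/global sizes and nonzero counts must come from your composite DM and your coupling pattern:

```c
/* Sketch only: override the DM's matrix-creation routine so that both
 * the diagonal and off-diagonal blocks are preallocated up front.
 * MyCreateMatrix and the way d_nnz/o_nnz are filled are illustrative,
 * not part of the PETSc API. */
#include <petsc/private/dmimpl.h>  /* exposes dm->ops */

static PetscErrorCode MyCreateMatrix(DM dm, Mat *J)
{
  PetscErrorCode ierr;
  PetscInt       m, M;         /* local/global row counts, from your DMs */
  PetscInt       *d_nnz, *o_nnz; /* per-row counts, diagonal/off-diagonal */

  PetscFunctionBegin;
  /* ... compute m, M and fill d_nnz[], o_nnz[] from the known coupling
     pattern, including the blocks coupling the sub-DMs ... */
  ierr = MatCreate(PetscObjectComm((PetscObject)dm), J);CHKERRQ(ierr);
  ierr = MatSetSizes(*J, m, m, M, M);CHKERRQ(ierr);
  ierr = MatSetType(*J, MATAIJ);CHKERRQ(ierr);
  ierr = MatMPIAIJSetPreallocation(*J, 0, d_nnz, 0, o_nnz);CHKERRQ(ierr);
  ierr = MatSeqAIJSetPreallocation(*J, 0, d_nnz);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

/* After creating and setting up the composite DM: */
/*   dm->ops->creatematrix = MyCreateMatrix;       */
/* so that DMCreateMatrix(dm, &J) calls your routine. */
```

Calling both the MPIAIJ and SeqAIJ preallocation routines is the usual idiom so the same code works in serial and parallel; whichever does not match the matrix type is a no-op.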

> -Gautam.
> On Tue, Sep 30, 2014 at 3:48 PM, Jed Brown <jed at jedbrown.org> wrote:
>> Gautam Bisht <gbisht at lbl.gov> writes:
>> > Hi,
>> >
>> > The comment on line 419 of SNES ex28.c
>> > <
>> http://www.mcs.anl.gov/petsc/petsc-current/src/snes/examples/tutorials/ex28.c.html
>> >
>> > says
>> > that the approach used in this example is not the best way to allocate
>> > off-diagonal blocks. Is there an example that shows a better way of
>> > allocating memory for off-diagonal matrix blocks when using DMComposite?
>> The problem is that the allocation interfaces specialize on the matrix
>> type.  Barry wrote a DMCompositeSetCoupling(), but there are no
>> examples.  This is something that PETSc needs to improve.  I have been
>> unsuccessful at conceiving a _simple_ yet flexible/extensible solution.
