[petsc-users] PetscFE questions

Julian Andrej juan at tf.uni-kiel.de
Fri Oct 21 02:26:01 CDT 2016


On Thu, Oct 20, 2016 at 5:18 PM, Matthew Knepley <knepley at gmail.com> wrote:
> On Thu, Oct 20, 2016 at 9:42 AM, Julian Andrej <juan at tf.uni-kiel.de> wrote:
>>
>> Thanks for the suggestion. I guess DMCreateSubDM can work, but it is
>> cumbersome to handle in the normal solution process, since the mass
>> matrix, for example, is not a separate field.
>
>
> I did not understand what you meant by "parts of the physics". If you just
> want to make a different operator, then swap out the PetscDS from the DM.
> That holds the pointwise functions and discretizations.
>

Yes, it's basically a different operator! That's a really smart design:
I can just create different PetscDS objects and swap them in to
assemble each operator.

  /* Assemble mass operator */
  DMSetDS(dm, ds_mass);
  DMPlexSNESComputeJacobianFEM(dm, dummy, ctx->M, ctx->M, NULL);
  /* Assemble laplacian operator */
  DMSetDS(dm, ds_laplacian);
  DMPlexSNESComputeJacobianFEM(dm, dummy, ctx->J, ctx->J, NULL);

There is one thing that bothers me a bit, though. Every time you call
DMSetDS, the old PetscDS object is destroyed, and you have to recreate
the object if you want to reassemble that operator.

src/dm/interface/dm.c:3889:  ierr = PetscDSDestroy(&dm->prob);CHKERRQ(ierr);

Maybe it is just my specific use case but something to think about.
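
A possible workaround (untested sketch, and assuming DMSetDS() in this
version does not take its own reference on the PetscDS it is given):
bump the reference count of each PetscDS before handing it to DMSetDS(),
so the object survives the internal PetscDSDestroy() and can be swapped
in again later. `dummy`, `ctx->M` and `ctx->J` are as in the snippet
above.

```c
/* Untested sketch: keep both PetscDS objects alive across DMSetDS()
   calls by taking an extra reference before each call, since DMSetDS()
   destroys (dereferences) the DM's current PetscDS. */
ierr = PetscObjectReference((PetscObject)ds_mass);CHKERRQ(ierr);
ierr = DMSetDS(dm, ds_mass);CHKERRQ(ierr);
ierr = DMPlexSNESComputeJacobianFEM(dm, dummy, ctx->M, ctx->M, NULL);CHKERRQ(ierr);

ierr = PetscObjectReference((PetscObject)ds_laplacian);CHKERRQ(ierr);
ierr = DMSetDS(dm, ds_laplacian);CHKERRQ(ierr);
ierr = DMPlexSNESComputeJacobianFEM(dm, dummy, ctx->J, ctx->J, NULL);CHKERRQ(ierr);

/* ds_mass and ds_laplacian are still valid here and can be set again
   for reassembly; destroy them once they are no longer needed. */
ierr = PetscDSDestroy(&ds_mass);CHKERRQ(ierr);
ierr = PetscDSDestroy(&ds_laplacian);CHKERRQ(ierr);
```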

>>
>> src/snes/examples/tutorials/ex77 handles a separate field for the
>> nullspace, if anyone is interested in that.
>>
>> An intuitive way was to just copy the DM and describe a new problem on
>> it.
>>
>>   DM dm_mass;
>>   PetscDS ds_mass;
>>   Vec dummy;
>>   PetscInt id = 1;
>>   petsc_call(DMCreateGlobalVector(dm, &dummy));
>>   petsc_call(DMClone(ctx->dm, &dm_mass));
>>   petsc_call(DMGetDS(dm_mass, &ds_mass));
>>   petsc_call(PetscDSSetDiscretization(ds_mass, 0, (PetscObject)fe));
>>   petsc_call(PetscDSSetJacobian(ds_mass, 0, 0, mass_kernel, NULL, NULL, NULL));
>>   petsc_call(PetscDSAddBoundary(ds_mass, PETSC_TRUE, "wall", "marker", 0, 0, NULL, (void (*)())ctx->exact_funcs[0], 1, &id, ctx));
>>   petsc_call(DMCreateMatrix(dm_mass, &ctx->M));
>>   petsc_call(DMPlexSNESComputeJacobianFEM(dm_mass, dummy, ctx->M, ctx->M, NULL));
>>
>> Is this an intended way to assemble a Jacobian based on a weak form?
>> The memory overhead of a DM copy doesn't look huge at first sight.
>
>
> It's O(1).
>
>>
>> And a much more important question. Is there any mathematical
>> description how exactly you handle dirichlet boundary conditions here?
>
>
> Right now, you can do two things:
>
>   1) Handle it yourself
>
> or
>
>   2) eliminate particular dofs
>
> If you use 2), these dofs are eliminated from the global vector. They remain
> in the
> local vector, and boundary values are inserted before local vectors are
> passed to
> assembly routines.
>
>    Matt
>
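
If I understand correctly, for 2) the flow before assembly would look
roughly like this (untested sketch; I am assuming
DMPlexInsertBoundaryValues() is the routine that inserts the essential
values into the local vector, and `X_global` is the current global
solution):

```c
/* Untested sketch of option 2): boundary dofs are absent from the
   global vector but present in the local vector; essential values are
   inserted into the local vector before assembly sees it. */
Vec X_local;

ierr = DMGetLocalVector(dm, &X_local);CHKERRQ(ierr);
/* Scatter the (boundary-free) global solution into the local vector */
ierr = DMGlobalToLocalBegin(dm, X_global, INSERT_VALUES, X_local);CHKERRQ(ierr);
ierr = DMGlobalToLocalEnd(dm, X_global, INSERT_VALUES, X_local);CHKERRQ(ierr);
/* Fill in the essential (Dirichlet) values on the boundary dofs */
ierr = DMPlexInsertBoundaryValues(dm, PETSC_TRUE, X_local, 0.0, NULL, NULL, NULL);CHKERRQ(ierr);
/* X_local now carries the full solution, including boundary values,
   and is what the assembly routines are handed */
ierr = DMRestoreLocalVector(dm, &X_local);CHKERRQ(ierr);
```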

Thank you again for your help and suggestions.

Regards
Julian

>>
>> At first sight it looks like the system is condensed to the
>> non-essential nodes only, and the boundary values are then projected
>> back into the solution vector. If that's the case, I don't understand
>> how you "augment" the solution with the boundary nodes.
>>
>> Regards
>> Julian
>>
>>
>> On Wed, Oct 19, 2016 at 11:51 AM, Matthew Knepley <knepley at gmail.com>
>> wrote:
>> > On Tue, Oct 18, 2016 at 7:38 AM, Julian Andrej <juan at tf.uni-kiel.de>
>> > wrote:
>> >>
>> >> Hi,
>> >>
>> >> I have a general question about PetscFE. When I want to assemble
>> >> certain parts of the physics separately, how can I do that? I
>> >> basically want to assemble matrices/vectors from the weak forms on
>> >> the same DM (avoiding a copy of the DM) and use them afterwards. Is
>> >> there a convenient way of doing that?
>> >>
>> >> The "workflow" I'm aiming for is something like:
>> >>
>> >> - Set up the DM
>> >> - Set up the discretization (spaces and quadrature) for each weak
>> >> form I want to compute
>> >> - Compute just the weak form I want right now for a specific
>> >> discretization and field.
>> >>
>> >> The reason is that I need certain parts of the "complete" Jacobian
>> >> for eigenproblem computations and would like to avoid computing
>> >> them more often than needed.
>> >
>> >
>> > The way I envision this working is to use DMCreateSubDM(). It should
>> > extract everything correctly for the subset of fields you select.
>> > However, I have not tested it extensively, so if something is wrong
>> > let me know.
>> >
>> >   Thanks,
>> >
>> >      Matt
>> >
>> >>
>> >> Regards
>> >> Julian
>> >
>> >
>> >
>> >
>> > --
>> > What most experimenters take for granted before they begin their
>> > experiments
>> > is infinitely more interesting than any results to which their
>> > experiments
>> > lead.
>> > -- Norbert Wiener

