[petsc-users] Vec sizing using DMDA

Matthew Knepley knepley at gmail.com
Fri Apr 24 10:10:14 CDT 2020


On Fri, Apr 24, 2020 at 10:47 AM Antoine Côté <Antoine.Cote3 at usherbrooke.ca>
wrote:

> Hi,
>
> Thanks for the fast response! An array of Vec would indeed solve my
> problem. I just don't know how to allocate it. Say I have a Vec U of the
> right size (created with a DMDA), and nlc = 4 load cases. How should I
> allocate and initialize the array?
>

Vec      *rhs;
PetscInt  i;

ierr = PetscMalloc1(nlc, &rhs);CHKERRQ(ierr);
for (i = 0; i < nlc; ++i) {
  ierr = DMCreateGlobalVector(dm, &rhs[i]);CHKERRQ(ierr);
}
/* Access the vector for load case i as rhs[i] */
for (i = 0; i < nlc; ++i) {
  ierr = VecDestroy(&rhs[i]);CHKERRQ(ierr);
}
ierr = PetscFree(rhs);CHKERRQ(ierr);
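For reference, the ownership pattern above (allocate an array of handles, create each vector, use one at a time, destroy each, free the array) can be sketched in plain C without PETSc. The `SimpleVec`, `make_vec`, and `alloc_load_cases` names are illustrative stand-ins, not PETSc API:

```c
#include <stdlib.h>

/* Illustrative stand-in for a PETSc Vec: a length-n array of doubles
 * (hypothetical type, not part of PETSc). */
typedef struct { double *data; int n; } SimpleVec;

static SimpleVec *make_vec(int n) {
  SimpleVec *v = malloc(sizeof(*v));
  v->data = calloc((size_t)n, sizeof(double));  /* zero-initialized */
  v->n = n;
  return v;
}

static void free_vec(SimpleVec *v) { free(v->data); free(v); }

/* Allocate nlc vectors of length n: the analogue of
 * PetscMalloc1 followed by DMCreateGlobalVector in a loop. */
static SimpleVec **alloc_load_cases(int nlc, int n) {
  SimpleVec **rhs = malloc((size_t)nlc * sizeof(*rhs));
  for (int i = 0; i < nlc; ++i) rhs[i] = make_vec(n);
  return rhs;
}

/* Destroy each vector, then the array: VecDestroy + PetscFree. */
static void free_load_cases(SimpleVec **rhs, int nlc) {
  for (int i = 0; i < nlc; ++i) free_vec(rhs[i]);
  free(rhs);
}
```

Each load case then owns an independent vector, so assembling its boundary conditions and forces once and reusing them every optimization iteration costs nothing extra.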

  Thanks,

     Matt

Best regards
>
> ------------------------------
> *From:* Matthew Knepley <knepley at gmail.com>
> *Sent:* April 23, 2020 15:23
> *To:* Antoine Côté <Antoine.Cote3 at USherbrooke.ca>
> *Cc:* petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
> *Subject:* Re: [petsc-users] Vec sizing using DMDA
>
> On Thu, Apr 23, 2020 at 12:12 PM Antoine Côté <
> Antoine.Cote3 at usherbrooke.ca> wrote:
>
> Hi,
>
> I'm using a C++/PETSc program to do Topological Optimization. A finite
> element analysis is solved at every iteration of the optimization.
> Displacements U are obtained using KSP solver. U is a Vec created using a
> 3D DMDA with 3 DOF (ux, uy, uz). Boundary conditions are stored in Vec N,
> and forces in Vec RHS. They also have 3 DOF, as they are created using
> VecDuplicate on U.
>
> My problem : I have multiple load cases (i.e. different sets of boundary
> conditions (b.c.) and forces). Displacements U are solved for each load
> case. I need to extract rapidly the b.c. and forces for each load case
> before solving.
>
> One way would be to change the DOF of the DMDA (e.g. for 8 load cases, we
> could use 3*8=24 DOF). Problem is, prior solving, we would need to loop on
> nodes to extract the b.c. and forces, for every node, for every load case
> and for every iteration of the optimization. This is a waste of time, as
> b.c. and forces are constant for a given load case.
>
> A better way would be to assemble b.c. and forces for every load case
> once, and read them afterwards as needed. This is currently done using a
> VecDuplicate on U to create multiple vectors N and RHS (N_0, N_1, RHS_0,
> RHS_1, etc.). Those vectors are hard coded, and can only solve a set number
> of load cases.
>
> I'm looking for a way to allocate dynamically the number of N and RHS
> vectors. What I would like :
> Given nlc, the number of load cases and nn, the number of nodes in the
> DMDA. Create matrices N and RHS of size (DOF*nn lines, nlc columns). While
> optimizing : for every load case, use N[all lines, current load case
> column] and RHS[all lines, current load case column], solve with KSP,
> obtain displacement U[all lines, current load case].
>
> Would that be possible?
>
>
> Why wouldn't you just allocate an array of Vecs, since you only use one at
> a time?
>
>   Thanks,
>
>     Matt
>
>
> Best regards,
>
> Antoine Côté
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

