[petsc-dev] PDIPDM questions

Matthew Knepley knepley at gmail.com
Tue Sep 15 14:25:40 CDT 2020


On Tue, Sep 15, 2020 at 3:05 PM Pierre Jolivet <pierre.jolivet at enseeiht.fr>
wrote:

> Thank you, Barry, for the very comprehensive answer; this gives me hope that
> I’ll indeed be able to help our Ipopt users transition to Tao when this is
> in place/if this gets done!
> Shri, my Hessian and my x are currently distributed (really standard
> distribution, nothing fancy) using the same layout.
> I _need_ to centralize the Jacobian on rank 0, because there is currently
> no way to distribute a matrix with a single row on more than one rank.
> So in this scenario, PDIPM is indeed not usable, cf. the previous trace.
> When you say: "The column layout of the equality/inequality Jacobian is
> same as that for x”, this is currently not achievable with a distributed x
> and a single equality/inequality Jacobian.
> The two fixes I see:
> 1) provide the transpose of the Jacobian (of dimension N x 1, so it can be
> split row-wise without a problem, but this may have side effects I’m not
> anticipating)
>

This sounds like the easiest and most natural thing.

  Thanks,

      Matt


> 2) provide the Jacobian as a Mat_MPIColumn (for lack of a better name, to
> quote Barry)
>
> Thanks,
> Pierre
>
> On 15 Sep 2020, at 8:18 PM, Barry Smith <bsmith at petsc.dev> wrote:
>
>
>   Pierre,
>
>     Based on my previous mail, I am hoping that the PDIPM algorithm itself
> won't need major refactoring to be scalable; only a custom matrix
> type is needed to store and compute with the Hessian in a scalable way.
>
>    Barry
>
>
> On Sep 15, 2020, at 12:50 PM, Pierre Jolivet <pierre.jolivet at enseeiht.fr>
> wrote:
>
>
>
> On 15 Sep 2020, at 5:40 PM, Abhyankar, Shrirang G <
> shrirang.abhyankar at pnnl.gov> wrote:
>
> Pierre,
>    You are right. There are a few MatMultTransposeAdd calls that may need
> conforming layouts for the equality/inequality constraint vectors and the
> equality/inequality constraint Jacobian matrices; I need to check whether
> that’s the case. We currently only have the ex1 example and need to add
> more. We are working on making PDIPM robust and, while doing so, will work
> on adding another example.
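>
> For reference, MatMultTransposeAdd(A, v1, v2, v3) computes v3 = v2 + A^T * v1,
> so v1 must conform to the row layout of A while v2 and v3 must conform to its
> column layout. A minimal sketch, with hypothetical names and error checking
> omitted:
>
>   Vec v1, v2, v3;
>   MatCreateVecs(A, &v2, &v1);          /* v2 ~ columns of A, v1 ~ rows of A */
>   VecDuplicate(v2, &v3);
>   MatMultTransposeAdd(A, v1, v2, v3);  /* errors out if the layouts differ  */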
>
> Very naive question, but given that I have a single constraint, how do I
> split a 1 x N matrix column-wise? I thought it was not possible.
>
> When setting the size of the constraint vector, you need to set the local
> size on one rank to 1 and on all others to zero. For the Jacobian, the local
> row size on that rank will be 1 and on all others zero. The column layout
> of the Jacobian should follow the layout of the vector x, so each rank will
> set the local column size of the Jacobian to the local size of x.
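>
> A minimal sketch of that sizing for a single constraint, with hypothetical
> names and error checking omitted:
>
>   PetscInt    nloc;                     /* local size of x on this rank      */
>   PetscMPIInt rank;
>   Vec         ce;                       /* constraint vector (global size 1) */
>   Mat         Ae;                       /* 1 x N constraint Jacobian         */
>   MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
>   VecGetLocalSize(x, &nloc);
>   VecCreateMPI(PETSC_COMM_WORLD, rank == 0 ? 1 : 0, 1, &ce);
>   MatCreate(PETSC_COMM_WORLD, &Ae);
>   MatSetSizes(Ae, rank == 0 ? 1 : 0, nloc, 1, PETSC_DETERMINE);
>   MatSetUp(Ae);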
>
>
> That is assuming I don’t want x to follow the distribution of the Hessian,
> which is not my case.
> Is there some plan to make PDIPM handle different layouts?
> I hope I’m not the only one who thinks that having a centralized Hessian
> when there is a single constraint is not scalable.
>
> Thanks,
> Pierre
>
> Shri
>
>
> On 15 Sep 2020, at 2:21 AM, Abhyankar, Shrirang G <
> shrirang.abhyankar at pnnl.gov> wrote:
>
> Hello Pierre,
>    PDIPM works in parallel so you can have distributed Hessian, Jacobians,
> constraints, variables, gradients in any layout you want.  If you are using
> a DM then you can have it generate the Hessian.
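>
> For instance, a minimal sketch with a 1D DMDA (hypothetical sizes, error
> checking omitted):
>
>   DM  dm;
>   Mat H;
>   Vec x;
>   DMDACreate1d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, 100, 1, 1, NULL, &dm);
>   DMSetFromOptions(dm);
>   DMSetUp(dm);
>   DMCreateMatrix(dm, &H);        /* Hessian with the DM's parallel layout */
>   DMCreateGlobalVector(dm, &x);  /* x with the matching layout            */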
>
>
> Could you please show an example where this is the case?
> pdipm->x, which I’m assuming is a work vector, is used as input for both
> the Hessian and the Jacobian functions, e.g.,
> https://gitlab.com/petsc/petsc/-/blob/master/src/tao/constrained/impls/ipm/pdipm.c#L369 (Hessian)
> and
> https://gitlab.com/petsc/petsc/-/blob/master/src/tao/constrained/impls/ipm/pdipm.c#L473 (Jacobian).
> I thus doubt that it is possible to have different layouts.
> In practice, I end up with the following error when I try this (2
> processes, distributed Hessian with centralized Jacobian):
> [1]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [1]PETSC ERROR: Nonconforming object sizes
> [1]PETSC ERROR: Vector wrong size 14172 for scatter 0 (scatter reverse and
> vector to != ctx from size)
> [1]PETSC ERROR: #1 VecScatterBegin() line 96 in
> /Users/jolivet/Documents/repositories/petsc/src/vec/vscat/interface/vscatfce.c
> [1]PETSC ERROR: #2 MatMultTransposeAdd_MPIAIJ() line 1223 in
> /Users/jolivet/Documents/repositories/petsc/src/mat/impls/aij/mpi/mpiaij.c
> [1]PETSC ERROR: #3 MatMultTransposeAdd() line 2648 in
> /Users/jolivet/Documents/repositories/petsc/src/mat/interface/matrix.c
> [0]PETSC ERROR: Nonconforming object sizes
> [0]PETSC ERROR: Vector wrong size 13790 for scatter 27962 (scatter reverse
> and vector to != ctx from size)
> [1]PETSC ERROR: #4 TaoSNESFunction_PDIPM() line 510 in
> /Users/jolivet/Documents/repositories/petsc/src/tao/constrained/impls/ipm/pdipm.c
> [0]PETSC ERROR: #5 TaoSolve_PDIPM() line 712 in
> /Users/jolivet/Documents/repositories/petsc/src/tao/constrained/impls/ipm/pdipm.c
> [1]PETSC ERROR: #6 TaoSolve() line 222 in
> /Users/jolivet/Documents/repositories/petsc/src/tao/interface/taosolver.c
> [0]PETSC ERROR: #1 VecScatterBegin() line 96 in
> /Users/jolivet/Documents/repositories/petsc/src/vec/vscat/interface/vscatfce.c
> [0]PETSC ERROR: #2 MatMultTransposeAdd_MPIAIJ() line 1223 in
> /Users/jolivet/Documents/repositories/petsc/src/mat/impls/aij/mpi/mpiaij.c
> [0]PETSC ERROR: #3 MatMultTransposeAdd() line 2648 in
> /Users/jolivet/Documents/repositories/petsc/src/mat/interface/matrix.c
> [0]PETSC ERROR: #4 TaoSNESFunction_PDIPM() line 510 in
> /Users/jolivet/Documents/repositories/petsc/src/tao/constrained/impls/ipm/pdipm.c
> [0]PETSC ERROR: #5 TaoSolve_PDIPM() line 712 in
> /Users/jolivet/Documents/repositories/petsc/src/tao/constrained/impls/ipm/pdipm.c
> [0]PETSC ERROR: #6 TaoSolve() line 222 in
> /Users/jolivet/Documents/repositories/petsc/src/tao/interface/taosolver.c
>
> I think this can be reproduced with ex1.c by just distributing the Hessian
> instead of having it centralized on rank 0.
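>
> To be concrete, the change I have in mind is roughly the following, with n
> the global size of x and assuming ex1 sets explicit local sizes (a sketch,
> not the actual ex1 code):
>
>   /* centralized on rank 0 */
>   MatSetSizes(H, rank == 0 ? n : 0, rank == 0 ? n : 0, n, n);
>   /* distributed instead */
>   MatSetSizes(H, PETSC_DECIDE, PETSC_DECIDE, n, n);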
>
>
> Ideally, you want to have the layout below to minimize movement of
> matrix/vector elements across ranks.
>
> - The layout of the vectors x, the bounds on x, and the gradient is the same.
>
> - The row layout of the equality/inequality Jacobian is the same as the
> equality/inequality constraint vector layout.
>
> - The column layout of the equality/inequality Jacobian is the same as
> that for x.
>
>
> Very naive question, but given that I have a single constraint, how do I
> split a 1 x N matrix column-wise? I thought it was not possible.
>
> Thanks,
> Pierre
>
>
> - The row and column layout for the Hessian is the same as for x.
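>
> Putting those layouts together, a minimal sketch (hypothetical names, error
> checking omitted; x, the constraint vector ce, the Jacobian Ae, and the
> Hessian H are assumed already created):
>
>   PetscInt nx, nce;
>   VecGetLocalSize(x,  &nx);   /* bounds on x and the gradient share this layout */
>   VecGetLocalSize(ce, &nce);
>   MatSetSizes(Ae, nce, nx, PETSC_DETERMINE, PETSC_DETERMINE); /* Jacobian */
>   MatSetSizes(H,  nx,  nx, PETSC_DETERMINE, PETSC_DETERMINE); /* Hessian  */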
>
> The tutorial example ex1 is extremely small (only 2 variables), so its
> implementation is very simplistic. I think that, in parallel, it ships the
> constraints etc. off to rank 0. It’s not an ideal example w.r.t. demonstrating
> a parallel implementation. We aim to add more examples as we develop PDIPM.
> If you have an example to contribute, it would be most welcome, and we would
> provide help on adding it.
>
> Thanks,
> Shri
> From: petsc-dev <petsc-dev-bounces at mcs.anl.gov> on behalf of Pierre
> Jolivet <pierre.jolivet at enseeiht.fr>
> Date: Monday, September 14, 2020 at 1:52 PM
> To: PETSc Development <petsc-dev at mcs.anl.gov>
> Subject: [petsc-dev] PDIPDM questions
>
> Hello,
> In my quest to help users migrate from Ipopt to Tao, I have a new question.
> When looking at src/tao/constrained/tutorials/ex1.c, it seems that almost
> everything is centralized on rank 0 (local sizes are 0 except on rank 0).
> I’d like to have my Hessian distributed more naturally, as in (almost?)
> all other SNES/TS examples, but still keep the Jacobian of my equality
> constraint, which is of dimension 1 x N (N >> 1), centralized on rank 0.
> Is this possible?
> If not, is it possible to supply the transpose of the Jacobian, of
> dimension N x 1, which could then be distributed row-wise like the Hessian?
> Or maybe use some trick to distribute a MatAIJ/MatDense of dimension 1 x N
> column-wise? Use a MatNest with as many blocks as processes?
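>
> To make the two shapes concrete, a rough sketch (hypothetical names: rank,
> nloc the local size of x, N its global size):
>
>   Mat Je, JeT;
>   MatCreate(PETSC_COMM_WORLD, &Je);
>   /* 1 x N Jacobian centralized on rank 0 */
>   MatSetSizes(Je, rank == 0 ? 1 : 0, rank == 0 ? N : 0, 1, N);
>   MatSetUp(Je);
>   MatCreate(PETSC_COMM_WORLD, &JeT);
>   /* its transpose, N x 1, distributed row-wise like the Hessian */
>   MatSetSizes(JeT, nloc, rank == 0 ? 1 : 0, N, 1);
>   MatSetUp(JeT);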
>
> So, just to sum up, how can I have a distributed Hessian with a Jacobian
> with a single row?
>
> Thanks in advance for your help,
> Pierre
>
>
>
>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/