[petsc-users] [EXT]Re: Is using PETSc appropriate for this problem

Alexander B Prescott alexprescott at email.arizona.edu
Wed Sep 16 22:28:08 CDT 2020


Hi Barry, thank you for the thoughtful response, I've answered
your questions below.

Best,
Alexander

On Wed, Sep 16, 2020 at 6:36 PM Barry Smith <bsmith at petsc.dev> wrote:

>
>   Alexander,
>
>    A few background questions.
>
>      Do the small solves need to be done sequentially, that is, is the
> input of one needed by the next, or can many solves be done "at the same
> time"?
>
Sequentially


>      Would you be using Newton's method with an analytic Jacobian?
>
Yes


>      For the larger problems (9 unknowns), is there a consistent sparsity
> of the Jacobian (say 20 to 30 nonzeros), or are they essentially dense?
>
Dense


>      Are the problems of varying nonlinearity, that is, will some converge
> in, say, a couple of Newton iterations while others require more, say 8 or
> more Newton steps?
>
The nonlinearity should be pretty similar; the problem setup is the same at
every node, but the global domain needs to be traversed in a specific order.



>
> On Sep 16, 2020, at 6:09 PM, Alexander B Prescott <
> alexprescott at email.arizona.edu> wrote:
>
> Hello PETSc listserv,
>
> This is an inquiry about code structure and the appropriateness of using
> SNES for a specific problem. I've found PETSc powerful and quite useful for
> my other problems, but for this application I'm concerned about
> computational overhead. Our setup involves many thousands of independent
> calls to the nonlinear solver on small subproblems, i.e. 2<=d.o.f.<=9.
> Speed of execution is the primary concern. Now straight to my questions:
>
>    - does it even make sense to use PETSc for a problem like this? Would
>    it be like using a nuclear reactor to warm a quesadilla?
>
>
>     There is a good deal of overhead for that small a problem size, but
> much of the overhead is in the initial construction of the PETSc objects;
> once they are created, the extra overhead may be acceptable. There are
> plenty of tricks to bring down the extra overhead by avoiding the more
> expensive functions that are beneficial for larger problems but just add
> overhead for small problems, such as the calls to BLAS (and calls to more
> expensive linear solvers). The most extreme is to remove the use of the
> virtual functions and essentially inline everything; some of this might be
> automatable.
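
As a minimal illustration of the point about cheap linear solvers: for
Jacobians this small and dense, one option is to skip the Krylov iteration
entirely and factor the matrix directly, e.g. by running with

    -ksp_type preonly -pc_type lu

(assuming the Jacobian is stored as a sequential dense matrix), so each
Newton step costs a single small dense factorization and back-substitution.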
>
>
>    - if it does make sense, is it better to create/destroy the SNES
>    structures with each new subproblem, OR to create the structures once and
>    modify them every time?
>
>     You would definitely benefit from creating one SNES for each size (2 to
> 9) and reusing it for all the subproblems of that size.
>
>      If you have hundreds of thousands that can be done simultaneously
> (but independently), then GPUs could perform extremely well.
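
A minimal sketch of what that per-size reuse might look like; the
array-of-solvers layout is one possible arrangement, and the
FormFunction/FormJacobian callbacks below are only placeholders for the
per-node residual and analytic Jacobian:

    #include <petscsnes.h>

    /* Placeholder residual F(x) = x; replace with the per-node equations. */
    static PetscErrorCode FormFunction(SNES snes, Vec x, Vec f, void *ctx)
    {
      PetscErrorCode ierr;
      PetscFunctionBeginUser;
      ierr = VecCopy(x, f);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }

    /* Placeholder Jacobian = identity; replace with the analytic Jacobian. */
    static PetscErrorCode FormJacobian(SNES snes, Vec x, Mat J, Mat P, void *ctx)
    {
      PetscErrorCode ierr;
      PetscFunctionBeginUser;
      ierr = MatZeroEntries(P);CHKERRQ(ierr);
      ierr = MatShift(P, 1.0);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }

    int main(int argc, char **argv)
    {
      SNES           snes[8];   /* one reusable solver per size n = 2..9 */
      Vec            x[8], r[8];
      Mat            J[8];
      PetscErrorCode ierr;

      ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
      for (PetscInt n = 2; n <= 9; n++) {
        PetscInt i = n - 2;
        ierr = VecCreateSeq(PETSC_COMM_SELF, n, &x[i]);CHKERRQ(ierr);
        ierr = VecDuplicate(x[i], &r[i]);CHKERRQ(ierr);
        ierr = MatCreateSeqDense(PETSC_COMM_SELF, n, n, NULL, &J[i]);CHKERRQ(ierr);
        ierr = SNESCreate(PETSC_COMM_SELF, &snes[i]);CHKERRQ(ierr);
        ierr = SNESSetFunction(snes[i], r[i], FormFunction, NULL);CHKERRQ(ierr);
        ierr = SNESSetJacobian(snes[i], J[i], J[i], FormJacobian, NULL);CHKERRQ(ierr);
        ierr = SNESSetFromOptions(snes[i]);CHKERRQ(ierr);
      }

      /* Traverse the domain in the required order; for each node with n
         unknowns, load the initial guess into x[n-2] and reuse that solver: */
      ierr = SNESSolve(snes[0], NULL, x[0]);CHKERRQ(ierr);   /* e.g. a size-2 node */

      for (PetscInt i = 0; i < 8; i++) {
        ierr = SNESDestroy(&snes[i]);CHKERRQ(ierr);
        ierr = VecDestroy(&x[i]);CHKERRQ(ierr);
        ierr = VecDestroy(&r[i]);CHKERRQ(ierr);
        ierr = MatDestroy(&J[i]);CHKERRQ(ierr);
      }
      ierr = PetscFinalize();
      return ierr;
    }

With this arrangement, all of the object construction cost is paid once up
front and none of it is incurred inside the traversal loop; each solve only
pays for the function/Jacobian evaluations and the small dense factorization.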
>
>
> Best,
> Alexander
>
> --
> Alexander Prescott
> alexprescott at email.arizona.edu
> PhD Candidate, The University of Arizona
> Department of Geosciences
> 1040 E. 4th Street
> Tucson, AZ, 85721
>
>
>

-- 
Alexander Prescott
alexprescott at email.arizona.edu
PhD Candidate, The University of Arizona
Department of Geosciences
1040 E. 4th Street
Tucson, AZ, 85721