[petsc-users] Extremely slow DMNetwork Jacobian assembly

Abhyankar, Shrirang G shrirang.abhyankar at pnnl.gov
Wed May 8 10:30:23 CDT 2019



From: petsc-users <petsc-users-bounces at mcs.anl.gov> on behalf of Matthew Knepley via petsc-users <petsc-users at mcs.anl.gov>
Reply-To: Matthew Knepley <knepley at gmail.com>
Date: Wednesday, May 8, 2019 at 7:46 AM
To: Justin Chang <jychang48 at gmail.com>
Cc: petsc-users <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] Extremely slow DMNetwork Jacobian assembly

On Wed, May 8, 2019 at 4:45 AM Justin Chang via petsc-users <petsc-users at mcs.anl.gov> wrote:
Hi guys,

I have a fully working distribution system solver written using DMNetwork. The idea is that each electrical bus can have up to three phase nodes, and each phase node has two unknowns: voltage magnitude and angle. In a completely balanced system every bus has three phase nodes, but in an unbalanced system some of the buses are single-phase or two-phase.

The working DMNetwork code I developed, loosely based on the SNES example network/power.c, essentially represents each vertex as a bus. The DMNetworkAddNumVariables() function adds either 2, 4, or 6 unknowns to each vertex. If every single bus had the same number of variables, the matrix block size would be 2, 4, or 6, and my code is both fast and scalable. However, if the number of unknowns per DMNetwork vertex is not the same across vertices, then my SNESFormJacobian function becomes extremely slow. Specifically, the slowdown is in the MatSetValues() calls whose row/column global indices contain an offset that points to a neighboring bus vertex.
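
To make the setup concrete, here is roughly the kind of loop I am describing (a simplified sketch with declarations omitted; nphase[] is a placeholder for my per-bus phase count):

ierr = DMNetworkGetVertexRange(networkdm,&vStart,&vEnd);CHKERRQ(ierr);
for (v = vStart; v < vEnd; v++) {
  /* 1-, 2-, or 3-phase bus -> 2, 4, or 6 unknowns (voltage magnitude and angle per phase) */
  ierr = DMNetworkAddNumVariables(networkdm,v,2*nphase[v-vStart]);CHKERRQ(ierr);
}
ierr = DMSetUp(networkdm);CHKERRQ(ierr);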

I have never seen MatSetValues() be slow unless it is allocating. Did you confirm with -info that you are not allocating?
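
For example (a sketch, with J standing for your assembled Jacobian), you can also query the malloc count directly after assembly:

MatInfo info;
ierr = MatGetInfo(J,MAT_LOCAL,&info);CHKERRQ(ierr);
ierr = PetscPrintf(PETSC_COMM_SELF,"mallocs during MatSetValues(): %g\n",info.mallocs);CHKERRQ(ierr);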

  Thanks,

     Matt

I have written power grid codes using DMNetwork where the vertex dofs range from 2 to 20, and I have not yet observed the slowdown you report. My guess, as Matt points out, is that it has something to do with the preallocation.

In the power.c example, the DM creates the Jacobian matrix (which sets the Jacobian nonzero structure and does the preallocation). Do you have the following lines in your code?

ierr = DMCreateMatrix(networkdm,&J);CHKERRQ(ierr);
ierr = MatSetOption(J,MAT_NEW_NONZERO_ALLOCATION_ERR,PETSC_FALSE);CHKERRQ(ierr);
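
As a quick check, you can temporarily flip that option to PETSC_TRUE so that the first MatSetValues() call falling outside the preallocated nonzero pattern errors out instead of silently triggering a malloc:

ierr = MatSetOption(J,MAT_NEW_NONZERO_ALLOCATION_ERR,PETSC_TRUE);CHKERRQ(ierr);

Running with -info and looking at the mallocs reported by MatAssemblyEnd() will tell you the same thing.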


Why is that? Is it because I no longer have a uniform block structure and lose the speed/optimization benefits of iterating through an AIJ matrix? I see three potential workarounds:

1) Treat every vertex as a three-phase bus and "zero out" all the unused phase node dofs, putting a 1 on the diagonal. The problem I see with this is that I will have unnecessary degrees of freedom (i.e., extra nonzeros in the matrix). From the distribution systems I've seen, anywhere from 1/3 to 1/2 of the buses may be two-phase or less, meaning I could end up with nearly twice as many dofs as necessary if I wanted to preserve block size = 6 for the AIJ mat.

2) Treat every phase node as a vertex, i.e., solve a single-phase power flow problem. That way I am guaranteed a block size of 2; this is what Domenico's former student did in his thesis work. The problem I see with this is that I end up with a larger graph, which can take more time to set up and parallelize.

3) Create a "fieldsplit" where I essentially have three "blocks": one for buses with all three phases, another for buses with only two phases, and one for single-phase buses. This way each block/fieldsplit would have a consistent block size. I am not sure whether this would solve the MatSetValues() issue, but can anyone give pointers on how to go about achieving this? A rough sketch of what I have in mind is below.
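
Here is that sketch for option 3 (is3, is2, and is1 are hypothetical index sets collecting the global dof indices of the three-, two-, and single-phase buses, which I would have to build from the DMNetwork variable offsets; snes is my existing SNES):

ierr = SNESGetKSP(snes,&ksp);CHKERRQ(ierr);
ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
ierr = PCSetType(pc,PCFIELDSPLIT);CHKERRQ(ierr);
ierr = PCFieldSplitSetIS(pc,"threephase",is3);CHKERRQ(ierr);
ierr = PCFieldSplitSetIS(pc,"twophase",is2);CHKERRQ(ierr);
ierr = PCFieldSplitSetIS(pc,"singlephase",is1);CHKERRQ(ierr);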


Thanks,
Justin


--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

