
Scalable Multi-Stage Stochastic Programming


  1. Scalable Multi-Stage Stochastic Programming Cosmin Petra and Mihai Anitescu, Mathematics and Computer Science Division, Argonne National Laboratory. DOE Applied Mathematics Program Meeting, April 3-5, 2010

  2. Motivation • Sources of uncertainty in complex energy systems • Weather • Consumer Demand • Market prices • Applications @Argonne – Anitescu, Constantinescu, Zavala • Stochastic Unit Commitment with Wind Power Generation • Energy management of Co-generation • Economic Optimization of a Building Energy System

  3. Stochastic Unit Commitment with Wind Power • Wind forecast – WRF (Weather Research and Forecasting) model • Real-time grid-nested 24h simulation • 30 samples require 1h on 500 CPUs (Jazz @ Argonne) Slide courtesy of V. Zavala & E. Constantinescu

  4. Economic Optimization of a Building Energy System • Proactive management - temperature forecasting & electricity prices • Minimize daily energy costs Slide courtesy of V. Zavala

  5. Optimization under Uncertainty • Two-stage stochastic programming with recourse ("here-and-now" decisions) • First-stage variables may be continuous or discrete • Sample average approximation (SAA) with M samples • Sampling, inference, analysis
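The formulas on this slide did not survive transcription; a standard statement of the two-stage recourse problem and its SAA (notation assumed, not taken from the slide) is:

```latex
\min_{x}\; c^{T}x + \mathbb{E}_{\xi}\!\left[Q(x,\xi)\right]
\quad \text{s.t.}\quad Ax = b,\; x \ge 0,
```

where the recourse value of the second stage is

```latex
Q(x,\xi) = \min_{y}\; q(\xi)^{T}y
\quad \text{s.t.}\quad T(\xi)\,x + W(\xi)\,y = h(\xi),\; y \ge 0 .
```

The SAA replaces the expectation by an average over $M$ samples $\xi_1,\dots,\xi_M$:

```latex
\min_{x}\; c^{T}x + \frac{1}{M}\sum_{i=1}^{M} Q(x,\xi_i)
\quad \text{s.t.}\quad Ax = b,\; x \ge 0 .
```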

  6. Multi-stage SAA SP Problems – Scenario Formulation • Depth-first traversal of the scenario tree • Nested half-arrow-shaped Jacobian • Block-separable objective function [Figure: scenario tree with nodes 0–7]
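The scenario formulation can be written out as follows (node indexing assumed for illustration): each tree node $i$, with ancestor $a(i)$ and probability $p_i$, carries its own decision vector $x_i$,

```latex
\min \sum_{i \in \text{nodes}} p_i\, c_i^{T} x_i
\quad \text{s.t.}\quad
T_i\, x_{a(i)} + W_i\, x_i = b_i,\;\; x_i \ge 0 \quad \forall i .
```

Under a depth-first ordering of the nodes, the constraint Jacobian of this problem takes the nested half-arrow shape mentioned on the slide.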

  7. Linear Algebra of Primal-Dual Interior-Point Methods • Convex quadratic problem • IPM linear system at each iteration • Two-stage SP yields an arrow-shaped linear system (via a permutation); multi-stage SP yields a nested arrow-shaped system
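For the two-stage case, permuting the IPM (KKT) system by scenario gives the arrow-shaped form (block names assumed for illustration):

```latex
\begin{pmatrix}
H_1 &        &     & B_1^{T}\\
    & \ddots &     & \vdots\\
    &        & H_N & B_N^{T}\\
B_1 & \cdots & B_N & H_0
\end{pmatrix}
\begin{pmatrix} z_1\\ \vdots\\ z_N\\ z_0 \end{pmatrix}
=
\begin{pmatrix} r_1\\ \vdots\\ r_N\\ r_0 \end{pmatrix},
```

where $H_i$ couples the variables of scenario $i$, $H_0$ those of the first stage, and $B_i$ the two stages. In the multi-stage case each $H_i$ recursively has the same structure.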

  8. The Direct Schur Complement Method (DSC) • Exploits the arrow shape of H • Solving Hz = r: implicit factorization, forward substitution, diagonal (Schur complement) solve, back substitution
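A dense NumPy sketch of the DSC solve for the block-arrow system (a toy illustration under assumed notation, not the PIPS implementation; the per-scenario solves are the part distributed across processes):

```python
import numpy as np

def dsc_solve(H0, Hs, Bs, r0, rs):
    """Solve the block-arrow system H z = r, with scenario blocks H_i on the
    diagonal, first-stage block H0 in the corner, and borders B_i, via the
    Schur complement of the first stage."""
    C = H0.copy()                        # C = H0 - sum_i B_i H_i^{-1} B_i^T
    rhs = r0.copy()
    for Hi, Bi, ri in zip(Hs, Bs, rs):   # per-scenario work: parallel in PIPS
        C -= Bi @ np.linalg.solve(Hi, Bi.T)
        rhs -= Bi @ np.linalg.solve(Hi, ri)
    z0 = np.linalg.solve(C, rhs)         # first-stage (Schur complement) solve
    zs = [np.linalg.solve(Hi, ri - Bi.T @ z0)   # back substitution, per scenario
          for Hi, Bi, ri in zip(Hs, Bs, rs)]
    return z0, zs

# toy instance: 3 scenarios of size 4 coupled to a first stage of size 2
rng = np.random.default_rng(0)
n0, ni, N = 2, 4, 3
Hs = [A @ A.T + ni * np.eye(ni) for A in rng.standard_normal((N, ni, ni))]
Bs = list(rng.standard_normal((N, n0, ni)))
A0 = rng.standard_normal((n0, n0))
H0 = A0 @ A0.T + 10 * n0 * np.eye(n0)
rs = list(rng.standard_normal((N, ni)))
r0 = rng.standard_normal(n0)
z0, zs = dsc_solve(H0, Hs, Bs, r0, rs)
```

Note that the scenario loop only communicates `n0 × n0` contributions to `C` and `rhs`, which is why the first-stage solve becomes the sequential bottleneck discussed on the next slides.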

  9. High-performance computing with DSC • Gondzio (OOPS): 6 stages, 1 billion variables • Zavala et al., 2007 (in IPOPT) • Our experiments (PIPS) – strong scaling investigated • Building energy system: almost linear scaling • Unit commitment: relaxation solved • Largest instance: 28.9 million variables on 1,000 cores of Fusion @ Argonne

  10. Scalability of DSC • Unit commitment: 76.7% efficiency, but not always the case • Large number of first-stage variables: 38.6% efficiency on Fusion @ Argonne

  11. Preconditioned Schur Complement (PSC) • The preconditioner is built and factorized on a separate (extra) process

  12. The Stochastic Preconditioner • The exact structure of C is known • Draw an IID subset of n scenarios • The stochastic preconditioner (Petra & Anitescu, 2010) • For C, use the constraint preconditioner (Keller et al., 2000)
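The idea can be sketched as follows (a simplified illustration with assumed names, not the PIPS code): since the exact Schur complement is H0 minus a sum of per-scenario terms, an IID subsample of n out of N scenarios, rescaled by N/n, gives an approximation that is much cheaper to form and factorize:

```python
import numpy as np

def stochastic_schur_preconditioner(H0, Hs, Bs, n, rng):
    """Approximate C = H0 - sum_{i=1}^N B_i H_i^{-1} B_i^T using only an
    IID subsample of n scenarios, with the partial sum rescaled by N/n."""
    N = len(Hs)
    picked = rng.choice(N, size=n, replace=False)
    M = H0.copy()
    for i in picked:
        M -= (N / n) * (Bs[i] @ np.linalg.solve(Hs[i], Bs[i].T))
    return M

# demo: 8 scenarios, preconditioner built from a subsample of 3
rng = np.random.default_rng(0)
n0, ni, N = 2, 3, 8
Hs = [A @ A.T + ni * np.eye(ni) for A in rng.standard_normal((N, ni, ni))]
Bs = list(rng.standard_normal((N, n0, ni)))
H0 = 50 * np.eye(n0)
M = stochastic_schur_preconditioner(H0, Hs, Bs, n=3, rng=rng)
```

With n = N the subsample is the whole set and M coincides with the exact Schur complement; the point of slides 12–14 is that a small n already yields a high-quality preconditioner.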

  13. The "Ugly" Unit Commitment Problem • DSC on P processes vs. PSC on P+1 processes • Optimal use of PSC – linear scaling • 120 scenarios • Factorization of the preconditioner cannot be hidden anymore.

  14. Quality of the Stochastic Preconditioner • "Exponentially" better preconditioning (Petra & Anitescu, 2010) • Proof: Hoeffding's inequality • Assumptions on the problem's random data: boundedness and uniform full rank of the constraint matrices – not restrictive

  15. Quality of the Constraint Preconditioner • The preconditioned matrix has eigenvalue 1 with high multiplicity • The remaining eigenvalues satisfy a clustering bound • Proof based on Bergamaschi et al., 2004.

  16. The Krylov Methods Used • BiCGStab with the constraint preconditioner M • Preconditioned Projected CG (PPCG) (Gould et al., 2001) • Preconditioned projection onto the null space of the constraints • Does not compute a basis for the null space; the projection is applied implicitly
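As a generic illustration of where the preconditioner enters (standard SciPy usage, not the PIPS implementation): the Krylov solver only needs the action of M⁻¹, supplied here as a `LinearOperator` wrapping a factorization of an approximation of A, standing in for the stochastic/constraint preconditioner of the previous slides:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve
from scipy.sparse.linalg import LinearOperator, bicgstab

rng = np.random.default_rng(1)
n = 50
A = rng.standard_normal((n, n)) + n * np.eye(n)   # toy nonsymmetric system
b = rng.standard_normal(n)

# stand-in preconditioner: factor a slightly perturbed copy of A
lu = lu_factor(A + 0.01 * rng.standard_normal((n, n)))
M = LinearOperator((n, n), matvec=lambda v: lu_solve(lu, v))

x, info = bicgstab(A, b, M=M)   # info == 0 signals convergence
```

The better M⁻¹A clusters the spectrum, the fewer BiCGStab iterations are needed, which is exactly what the eigenvalue-clustering results on the next slide measure.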

  17. Performance of the Preconditioner • Eigenvalue clustering & Krylov iterations • Affected by the well-known ill-conditioning of IPMs.

  18. A Parallel Interior-Point Solver for Stochastic Programming (PIPS) • Convex QP SAA SP problems • Input: users specify the scenario tree • Object-oriented design based on OOQP • Linear algebra: tree vectors, tree matrices, tree linear systems • Scenario based parallelism • tree nodes (scenarios) are distributed across processors • inter-process communication based on MPI • dynamic load balancing • Mehrotra predictor-corrector IPM

  19. Tree Linear Algebra – Data, Operations & Linear Systems • Data: tree vectors (b, c, x, etc.), tree symmetric matrices (Q), tree general matrices (A) • Operations on tree objects • Linear systems: for each non-leaf node, a two-stage problem is solved via the Schur complement methods described previously. [Figure: scenario tree with nodes 0–7]

  20. Parallelization – Tree Distribution • The tree is distributed across processes (example: 3 processes) • Dynamic load balancing of the tree • Number partitioning problem --> graph partitioning --> METIS [Figure: tree nodes assigned to processes]
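The load-balancing step can be illustrated with a simple greedy number-partitioning heuristic (a stand-in for the METIS-based graph partitioning named on the slide; names are made up): assign each scenario, heaviest first, to the currently least-loaded process:

```python
import heapq

def partition_scenarios(weights, nprocs):
    """Greedy LPT heuristic: scenarios (indices into weights, heaviest
    first) go to the least-loaded process. Returns {process: [scenarios]}."""
    heap = [(0.0, p, []) for p in range(nprocs)]   # (load, process, scenarios)
    heapq.heapify(heap)
    for i in sorted(range(len(weights)), key=lambda i: -weights[i]):
        load, p, items = heapq.heappop(heap)       # least-loaded process
        items.append(i)
        heapq.heappush(heap, (load + weights[i], p, items))
    return {p: items for load, p, items in heap}

parts = partition_scenarios([5, 4, 3, 3, 2, 1], nprocs=2)
```

Unlike this heuristic, graph partitioning also accounts for the tree edges (communication between a node and its ancestor), which is why the slide maps the problem to METIS.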

  21. Conclusions • The DSC method offers good parallelism in an IPM framework. • The PSC method improves scalability. • PIPS – a solver for SP problems. • PIPS is ready for larger problems: 100,000 cores.

  22. Future work • New math/stat: asynchronous optimization, SAA error estimates • New scalable methods for more efficient software: better interplay between (iterative) linear algebra and sampling, importance-based preconditioning, multigrid decomposition; target: emerging exascale architectures • PIPS: IPM hot-start, parallelization of the nodes; compatibility with other paradigms: NLP, conic programming, MILP/MINLP solvers; many other small enhancements • Ensure computing capability for important applications: unit commitment with transmission constraints & market integration (Zavala)

  23. Thank you for your attention! Questions?
