RATC/ARPA-E Quarterly Review Meeting - PowerPoint PPT Presentation

Presentation Transcript

  1. RATC/ARPA-E Quarterly Review Meeting Dr. Mladen Kezunovic - PI

  2. Meeting Agenda

  3. Major Meetings March - July 2012

  4. Major Meetings August – September 2012

  5. Major Meetings October 2012 – January 2013

  6. Major Meetings January – April 2013

  7. Major Meetings May – August 2013 June Deliverable Review

  8. Year 1 Deliverable Review Research Demonstration Data Delivery Communication Data Layer Technology to Market

  9. Year 1 Accomplishments

  10. Ongoing Support Activities 03/31/13 08/31/13 09/30/14

  11. Meeting Agenda

  12. RATC Architecture: Architecture Effort, presented by Tomo Popovic, Sep 4, 2013

  13. Activity Overview 2013 Q3 (Project Q5) • Test plans and test cases • Test data consolidation • Simulation • Field data from TVA (PJM) • High-level scenarios • Solution demo • T2M effort support

  14. Test Plan Document Progress: • template • first draft • ORNL review • second draft • Version 1.0 released Aug 2013

  15. Test Plan Purpose June Deliverable Review

  16. Test Data Progress: • test plans reference required test data • correspondence with TVA/ORNL • analyzing received data • correspondence with IED vendors • obtaining access to PJM data

  17. Test Data June Deliverable Review

  18. High-level Scenarios • Market-based • Cascade event • Unplanned event (loss of generation, malicious attack)

  19. Market-based June Deliverable Review

  20. Cascade Event June Deliverable Review

  21. Unplanned Event Mitigation June Deliverable Review

  22. Solution Demo Progress: • effort led/initiated by Popovic/Sprintson • Internal, initial focus on TEES teams • Identifying/selecting test cases/data (IEEE 118) • prototyping data layer using Java and Apache Camel • Possible later integration with training solutions from IncSys (Dr. Podmore)

  23. Solution Demo June Deliverable Review

  24. T2M Effort Support Activities: Several meetings with Dr. Schneider T2M action plan development

  25. Meeting Agenda

  26. Robust Adaptive Topology Control Algorithm Group Presentations: Arizona State – Kory Hedman (10 min) UC Berkeley – Shmuel Oren (10 min) Texas A&M – Erick Moreno Centeno (5 min) and Garng Huang (5 min)

  27. Milestones Overview

  28. Robust Adaptive Topology Control Section 1: Kory Hedman, Arizona State University, ARPA-E Review, September 4, 2013

  29. Update on Parallelization Results of the Greedy Algorithm • Simulate G-2 contingencies on the IEEE 118-bus test case • RATC corrective application (post-contingency line switching)
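Enumerating the G-2 contingency set (every pair of simultaneous generator outages) is a simple combinatorial step. The sketch below illustrates it in Python; the generator IDs are invented for illustration, not taken from the 118-bus study.

```python
from itertools import combinations

def g2_contingencies(generators):
    """Enumerate all G-2 contingencies: every pair of simultaneous
    generator outages to be screened by the corrective application."""
    return list(combinations(generators, 2))

# Hypothetical generator IDs standing in for the IEEE 118-bus fleet
gens = ["G10", "G12", "G25", "G26", "G31"]
pairs = g2_contingencies(gens)
print(len(pairs))  # C(5, 2) = 10
```

For the full 118-bus generator fleet, the same call scales to C(n, 2) cases, which motivates the parallelization discussed in the next slides.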

  30. Parallel versus Sequential Greedy Algorithm

  31. Parallelization of Greedy Algorithm The performance of the algorithm was tested with 2, 4, 6, 8, 10, 12 nodes by assigning 16 tasks per node

  32. RATC – End-to-end Process This research is jointly funded by PSERC and ARPA-E

  33. Robust Wind Corridor With topology control, the maximum wind uncertainty that the system can handle is increased by 100% This research is jointly funded by PSERC and ARPA-E

  34. Task 1.12: RATC Emergency: Real-time corrective. Note: not all test cases produced 2x faster solution times; such test cases were very easy to solve, thereby not requiring the greedy algorithm for a speedup factor

  35. Task 2.1: RATC Emergency: Cascading event scenarios

  36. Robust Adaptive Topology Control Section 2: Shmuel Oren, The University of California, Berkeley, ARPA-E Review, September 4, 2013

  37. First Year Deliverables and Progress Report UC Berkeley • Modeling Wind Variability and Uncertainty • Small Scale Testing: Normal • Transmission switching with successive linearization of AC OPF

  38. Wind Modeling: Objectives and Approach • Obtain a multi-area stochastic model of wind power that captures the temporal and spatial, statistical and systematic variability of wind power production • Calibrate the model to available databases (e.g., the NREL Western Wind and Solar Integration Study) of wind speed and power production • Use the model to bootstrap limited data so as to produce multiple temporal and spatial realizations of wind power scenarios that can be used for stochastic optimization and simulation studies (scenarios can be scaled to reflect renewable penetration assumptions)

  39. Wind Speed Profiles by Season Fall Spring Summer Winter

  40. Calibration Methodology • Remove seasonal and diurnal effects: let w(t,i) be the wind speed measured at time t in location i, and let μ(t,i,m), σ(t,i,m) be the mean and standard deviation of the wind speed measured at time t, in location i, during season m • Estimate marginal distributions for the data • Normalize the historical time series: define z(t,i) = (w(t,i) − μ(t,i,m)) / σ(t,i,m) (transform to Gaussian) • Fit an AR(p) model: for each location, find the coefficients of z(t,i) = Σ_{k=1..p} φ(k,i) z(t−k,i) + ε(t,i) (Yule-Walker equations) • Fit a joint distribution model to the white noise ε(t,i)
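A minimal sketch of the AR(p) fitting step via the Yule-Walker equations, assuming the series has already been deseasonalized and normalized as described above. The synthetic AR(1) data stands in for a real normalized wind-speed series.

```python
import numpy as np

def yule_walker(z, p):
    """Fit AR(p) coefficients to a zero-mean series z by solving the
    Yule-Walker equations R * phi = r, where R is the Toeplitz matrix
    of autocovariances r[0..p-1] and r = (r[1], ..., r[p])."""
    n = len(z)
    r = np.array([np.dot(z[:n - k], z[k:]) / n for k in range(p + 1)])
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    return np.linalg.solve(R, r[1:])

rng = np.random.default_rng(0)
# Synthetic AR(1) series standing in for one location's normalized series
true_phi = 0.7
z = np.zeros(5000)
for t in range(1, len(z)):
    z[t] = true_phi * z[t - 1] + rng.standard_normal()

phi_hat = yule_walker(z - z.mean(), 1)
print(round(phi_hat[0], 2))  # close to the true value 0.7
```

In the actual calibration, one such fit is performed per location, and the residuals ε(t,i) across locations are then passed to the joint (copula) distribution fit of the next slide.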

  41. Comparison • We compare the empirical distribution to four time series models: • No spatial correlation (NSC). • Multivariate Normal spatial correlation (MN). • Gaussian Copula spatial correlation (GC). • Student t Copula spatial correlation (TC). • We compare the goodness of fit per location and the goodness of fit of the sum across all locations.
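The Gaussian Copula (GC) alternative can be sketched as follows: draw correlated standard normals and map them through the normal CDF to obtain spatially dependent uniform innovations. The two-site correlation value is illustrative, not a calibrated parameter from the study.

```python
import math
import numpy as np

def gaussian_copula_noise(corr, n_samples, rng):
    """Sample spatially correlated uniforms via a Gaussian copula:
    draw correlated standard normals (Cholesky factor of the site
    correlation matrix), then map each through Phi."""
    L = np.linalg.cholesky(corr)
    z = rng.standard_normal((n_samples, corr.shape[0])) @ L.T
    phi = np.vectorize(math.erf)          # Phi(z) = (1 + erf(z/sqrt(2))) / 2
    return 0.5 * (1.0 + phi(z / math.sqrt(2.0)))

# Two hypothetical nearby sites with Gaussian-copula correlation 0.8
rng = np.random.default_rng(1)
corr = np.array([[1.0, 0.8], [0.8, 1.0]])
u = gaussian_copula_noise(corr, 20000, rng)
# Sample correlation of the uniforms reflects the imposed spatial dependence
print(round(np.corrcoef(u.T)[0, 1], 2))
```

Setting the off-diagonal entries to zero recovers the NSC model; replacing the normal draws with multivariate Student t draws gives the TC variant.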

  42. Example: Tehachapi County

  43. Differences in Cumulative Dist. Error [Table 1: Summation of Differences per Location and Model; columns by model (NSC, MN, GC, TC), values in MW; example location: Tehachapi]

  44. Diff. Between Estimated C.D.F. of Summation [Table 2: Summation of Differences per Model; columns by model (NSC, MN, GC, TC), values in MW]

  45. Test Cases for RATC: Normal State • IEEE 118 Bus System (standard test case for transmission switching studies: Fisher 2008, Hedman 2010, Fuller 2012) • 118 buses • 186 lines • 19 generators • Solved on a laptop with 3 processors • FERC Dataset • 13,867 buses • 1,011 generators • 18,824 branches • Summer scenario • DCOPF base case (all lines closed), cost: $541,620.75 • Solved on LLNL HPC cluster • Computational facility: LLNL Hera cluster • 864 nodes (13,824 cores) • Quad-Core AMD Opteron 8356 processors at 1.2 GHz, 32 GB per node • MPI calling the Java CPLEX callable library

  46. Results for IEEE 118 Case. TX1: Greedy; TX2: Greedy with presort & batch; TX3: batched MIP • (TX3) outperforms (TX2), which outperforms (TX1); TX3 exceeds the Go/No-Go performance metric • Note (TX2) outperforms (TX1) although (TX1) checks all lines at each iteration • Critical lines: L132, L153 switched by all; L132, L162 switched by (TX1) and (TX3) • Full MIP solution: cost = 1537.38 (0.5% MIP gap), 32 lines switched; timed out after 5 minutes when the MIP gap was set to 10^-7 • Solution times (<2% MIP gap): TX3 17 seconds, TX2 300 seconds, TX1 1,314 seconds; Full MIP (0.5% MIP gap, 32 lines off): 34 seconds, cost 1537.38
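A toy sketch of the greedy switching idea behind TX1/TX2: at each iteration, open the single line whose removal most reduces operating cost, and stop when no candidate improves it. The cost table below is invented for illustration; a real implementation would re-solve a DCOPF for each candidate topology.

```python
def greedy_switching(lines, cost_fn, max_opens=2):
    """Greedy corrective switching sketch: repeatedly open the best
    single line, stopping when no candidate improves the cost."""
    open_set, best = set(), cost_fn(set())
    for _ in range(max_opens):
        candidates = [(cost_fn(open_set | {l}), l)
                      for l in lines if l not in open_set]
        cost, line = min(candidates)
        if cost >= best:          # no improving line left
            break
        best, open_set = cost, open_set | {line}
    return open_set, best

# Hypothetical cost model: opening L132 and L153 each relieves congestion
toy_cost = {frozenset(): 100.0, frozenset({"L132"}): 80.0,
            frozenset({"L153"}): 90.0, frozenset({"L132", "L153"}): 70.0}
cost_fn = lambda s: toy_cost.get(frozenset(s), 120.0)
print(greedy_switching(["L132", "L153", "L162"], cost_fn))
```

TX1 corresponds to evaluating every remaining line at each iteration, as here; the presort-and-batch variant (TX2) restricts the candidate list to the most promising lines, trading a little solution quality for a large reduction in DCOPF solves.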

  47. Preliminary Results for PJM/FERC Case

  48. Conclusions • The model generally takes a very long time to solve even with parallel programming. We are exploring CPLEX parameter settings and improvements through warm starts and other shortcut methods used in industry. • Even though TX1 can outperform TX2 in solution quality (it checks all lines at each iteration), its long computation time renders it impractical, which makes intelligent line search by sensitivity analysis important in real electrical networks. • Using a smaller K can effectively reduce the computation time of TX2.

  49. LP-ACOPF Overview • Set initial fixed points • Solve LP approximation • AC feasible or at iteration limit? If yes: return solution • If no: add voltage and current cuts, tighten stepsize restrictions, reset fixed points to LP optimal solutions, and repeat
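The loop above can be illustrated with a generic cutting-plane example: maximize a linear objective subject to a circular (current-magnitude-like) limit, adding a tangent cut whenever the LP solution violates the nonlinear constraint. This is a simplified analogue of the scheme, not the project's LP-ACOPF code.

```python
import numpy as np
from scipy.optimize import linprog

def solve_with_cuts(tol=1e-6, max_iter=50):
    """Cutting-plane analogue of the LP-ACOPF loop: solve an LP
    relaxation, check the nonlinear limit x^2 + y^2 <= 1, and add a
    tangent cut at the violated point until 'AC feasible'."""
    c = [-1.0, -1.0]                      # maximize x + y
    A, b = [], []                         # accumulated cuts
    x = None
    for _ in range(max_iter):
        res = linprog(c, A_ub=A or None, b_ub=b or None,
                      bounds=[(0, 1), (0, 1)], method="highs")
        x = res.x
        r = np.hypot(x[0], x[1])
        if r <= 1.0 + tol:                # feasible: return solution
            return x
        p = x / r                         # project onto the circle
        A.append(list(p))                 # tangent cut: p . x <= 1
        b.append(1.0)
    return x

x = solve_with_cuts()
print(np.round(x, 3))  # approaches (1/sqrt(2), 1/sqrt(2))
```

Each cut is a valid outer approximation of the convex limit, so the LP objective value decreases monotonically toward the true optimum, mirroring how the voltage and current cuts tighten the LP-ACOPF relaxation at every pass.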

  50. Linearizing Voltage & Current Constraints [Figure: initial approximation and iterative cuts. Voltage panel: feasible region between vmin and vmax; current panel: feasible region bounded by imax. Each panel shows the actual feasible region, the approximated feasible region, and a new constraint added at the previous solution]
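The "iterative cuts" in the figure are first-order outer approximations. In notation of our own (not taken from the slide), a current-magnitude limit written as a squared norm admits a tangent cut at the previous LP solution:

```latex
% Nonlinear limit: g(x) = \|x\|^2 \le i_{\max}^2, with \nabla g(x) = 2x.
% First-order (tangent) cut at the previous solution \hat{x}:
g(\hat{x}) + \nabla g(\hat{x})^{\top}(x - \hat{x}) \le i_{\max}^2
\quad\Longleftrightarrow\quad
2\,\hat{x}^{\top} x \le i_{\max}^2 + \|\hat{x}\|^2
```

Because g is convex, every such cut is valid for the true feasible set, so the approximated region in the figure shrinks toward the actual one. The lower voltage bound vmin is the nonconvex side, which is presumably why the loop also tightens stepsize restrictions rather than relying on cuts alone.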