

  1. CSI 2010 Impact Evaluation Addendum Webinar, May 22, 2012

  2. Introduction • Scope: Additional analyses to the 2010 CSI Impacts Report • Not meant to stand alone • Includes SGIP and CSI systems • Section 5: PV Performance over Time. This section quantifies the effects of ownership, incentive type, and module material on PV performance over time using two different methods. This section also provides estimates of PV degradation for each of those groupings. • Section 7: Analysis of Interval Billing Data. This section uses interval billing data for CSI customers in PG&E territory to expand and revisit the interval billing analysis previously included in Section 7.2 of the 2010 CSI Impact Evaluation.

  3. Logistics • Everyone but the presenters will be muted • If you have a question, please type it in the question box; we will address these at the end of each section • At the end of the presentation there will be a brief Q&A if time allows • A recording of this webinar and the slides will be available afterwards

  4. Terminology • EPBB – Expected Performance-Based Buydown; CSI up-front incentive based on estimated system performance • PBI – Performance Based Incentive; payment per kWh produced for five years for CSI systems • SGIP – Self-Generation Incentive Program; incentives were based on installed capacity • TPO – Third-Party Ownership; can be either a lease or a power purchase agreement (PPA)

  5. PV Performance Over Time - Degradation

  6. PV Performance Over Time • Objective is to quantify the effects of • Ownership, • Incentive type, and • Module material on PV system performance over time • Builds on the trend analysis in the 2010 Impacts Report with more sophisticated and in-depth analysis

  7. What is Degradation? • System Degradation: refers to the overall change in system performance over time. This definition explicitly includes all factors that may lead to a reduction in system performance over time. It includes the traditional 'degradation' that is largely due to module degradation but adds such factors as • Soiling, • Maintenance, • System availability, • Fire, theft, etc. • Equipment Degradation: refers to change in system performance with the effects of soiling and system outages minimized. This is intended to be much closer to the definition of 'PV degradation' found in many other studies.

  8. An Extreme Case of Soiling • There are many factors that affect system performance beyond module degradation

  9. Methods • Day Substitution for System Degradation. Weather normalization is accomplished by substituting sunny and cloudy days such that solar resource variability is removed and annual percentage changes in system performance can be calculated. • Linear Regression • System Degradation. Weather normalization is accomplished by using monthly irradiance and temperature as independent variables in the regression analysis. We use the regression model to estimate the effects of ownership, incentive, and PV technology on degradation. • Equipment Degradation. In this analysis we further filter the data used by the system degradation model to minimize the effects of soiling and system availability.

  10. Data Preparation & Merging • PV Performance Data – Processed metered data • Panel & System characteristics – PowerClerk (CSI) and SGIP tracking data • Exclude tracking systems • Plane of Array (POA) irradiance from • SolarAnywhere (Clean Power Research) Global Horizontal & Direct Normal irradiance estimated from solar geometry & cloud cover • Panel azimuth & tilt and the Perez sky model • Rainfall and temperature from CIMIS • Only used for regression
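
To illustrate the POA derivation described above, the sketch below uses the open-source pvlib library to transpose global horizontal and direct normal irradiance onto a tilted plane with the Perez sky model. The location, tilt, azimuth, and the clear-sky irradiance used in place of the SolarAnywhere data are placeholders, not values from the evaluation.

```python
# Sketch: transpose GHI/DNI to plane-of-array (POA) irradiance with the
# Perez sky model using pvlib. Location, tilt, and azimuth are placeholders;
# in the evaluation, GHI/DNI came from SolarAnywhere and tilt/azimuth from
# PowerClerk / SGIP tracking data.
import pandas as pd
import pvlib

lat, lon, tz = 37.8, -122.3, "US/Pacific"   # hypothetical site
tilt, azimuth = 20, 180                     # degrees; 180 = south-facing

times = pd.date_range("2010-07-01", periods=24, freq="h", tz=tz)
solpos = pvlib.solarposition.get_solarposition(times, lat, lon)

# Stand-in for the satellite-derived irradiance: a clear-sky estimate.
site = pvlib.location.Location(lat, lon, tz=tz)
cs = site.get_clearsky(times)               # columns: ghi, dni, dhi

poa = pvlib.irradiance.get_total_irradiance(
    surface_tilt=tilt,
    surface_azimuth=azimuth,
    solar_zenith=solpos["apparent_zenith"],
    solar_azimuth=solpos["azimuth"],
    dni=cs["dni"], ghi=cs["ghi"], dhi=cs["dhi"],
    dni_extra=pvlib.irradiance.get_extra_radiation(times),
    airmass=pvlib.atmosphere.get_relative_airmass(solpos["apparent_zenith"]),
    model="perez",
)
print(poa["poa_global"].round(1))
```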

  11. Day substitution

  12. Steps in Day Substitution • Develop Statewide Monthly Reference Solar Resource • Weather Normalization via Day Substitution • Calculate Performance Changes • Estimate accuracy of performance change results
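
The slides do not spell out the pairing rules used for the substitution, so the following is only a simplified sketch of the idea: within one system-month, observed days are swapped until the month's total insolation approximates the statewide monthly reference.

```python
# Simplified sketch of weather normalization by day substitution (assumed
# greedy pairing; the evaluation's actual substitution rules are not given
# in the slides). 'daily' holds one system-month with hypothetical columns
# 'insolation' (kWh/m2/day) and 'energy' (kWh/day).
import pandas as pd

def day_substitution(daily: pd.DataFrame, reference_insolation: float,
                     tol: float = 0.01) -> pd.DataFrame:
    month = daily.copy()
    while True:
        gap = reference_insolation - month["insolation"].sum()
        if abs(gap) <= tol * reference_insolation:
            break                                  # close enough to reference
        if gap > 0:   # month too cloudy: overwrite cloudiest day with sunniest
            drop_day = month["insolation"].idxmin()
            copy_day = month["insolation"].idxmax()
        else:         # month too sunny: overwrite sunniest day with cloudiest
            drop_day = month["insolation"].idxmax()
            copy_day = month["insolation"].idxmin()
        delta = month.loc[copy_day, "insolation"] - month.loc[drop_day, "insolation"]
        if abs(gap - delta) >= abs(gap):
            break                                  # no swap improves the match
        month.loc[drop_day, ["insolation", "energy"]] = \
            month.loc[copy_day, ["insolation", "energy"]].values
    return month
```

Once every year of a system's data has been normalized to the same reference resource, the year-over-year percentage change in normalized energy gives the system degradation rate.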

  13. Reference Solar Resource - Statewide • Median of all actual values for individual systems

  14. Weather Normalization via Day Substitution • Example of actual May 2009 data for a CCSE PV system

  15. Example Site-Specific System Degradation Rates • Average Year 1 to Year 2 change: -2.8% • Many site-specific results deviate substantially from average

  16. Example Site-Specific Data: July Daily CF for Year 1 & Year 2 • Site-specific change: -39%

  17. Example Site-Specific Data: July Daily CF for Year 2 • Apparent temporary outage reduced CF by approximately 50%

  18. Average Annual System Degradation Rates using Day Substitution • Difference from zero significant for 2 groups • EPBB-TPO & SGIP-Host • Difference between groups significant in 2 cases • EPBB-TPO degrade faster than EPBB-Host • EPBB-TPO degrade faster than PBI-TPO
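
The slides summarize which differences are statistically significant but do not show the test itself; the sketch below is only a hypothetical illustration using two-sample t-tests on system-level annual changes, which may differ from the procedure actually used in the evaluation.

```python
# Hypothetical illustration of the significance comparisons on this slide:
# is a group's mean annual change different from zero, and do two groups
# differ from each other? The data here are synthetic placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
epbb_tpo = rng.normal(-2.0, 3.0, size=200)    # placeholder annual changes (%)
epbb_host = rng.normal(-0.5, 3.0, size=300)   # placeholder annual changes (%)

_, p_zero = stats.ttest_1samp(epbb_tpo, 0.0)                       # vs. zero
_, p_diff = stats.ttest_ind(epbb_tpo, epbb_host, equal_var=False)  # vs. each other
print(f"EPBB-TPO vs 0: p={p_zero:.3f}; EPBB-TPO vs EPBB-Host: p={p_diff:.3f}")
```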

  19. Key Findings • Many site-specific results deviate substantially from averages • Substantial quantities of data are required to produce accurate estimates of important differences • Example: 90% confidence that the EPBB-Host degradation rate is 0.X% (±10%) faster than the EPBB-TPO degradation rate • Caution is needed when comparing these averages to other references that are not subject to the same types of confounding factors • Module warranties • Controlled studies • Results of the Day Substitution and Regression analyses are aligned • The two sets of results are not significantly different when several years of data are available

  20. Linear Regression

  21. Data Filtering • System Degradation • POA 700 – 1300 W/m2 • 0 ≤ CFInterval ≤ 1.15 • Explicitly includes outages and potentially soiled panels • Equipment Degradation • POA 700 – 1300 W/m2 • 0 < CFInterval ≤ 1.15; total outages excluded • 7 days or less since 0.2 in. of rain in a day • Intent is to minimize effects of outages (system availability) and soiling
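
As a concrete (hypothetical) rendering of these two filter sets, the pandas sketch below assumes interval records with POA irradiance, an interval capacity factor, and the number of days since at least 0.2 inches of rain fell in a day; the column names are illustrative, not from the evaluation's dataset.

```python
import pandas as pd

def filter_intervals(df: pd.DataFrame) -> tuple[pd.DataFrame, pd.DataFrame]:
    """Apply the two filter sets from the slide.
    Hypothetical columns: 'poa_wm2', 'cf_interval', 'days_since_rain'
    (days since >= 0.2 inches of rain fell in a single day)."""
    # System degradation: high-sun intervals, bounded CF; outages kept.
    system = df[df["poa_wm2"].between(700, 1300) &
                df["cf_interval"].between(0, 1.15)]
    # Equipment degradation: additionally drop total outages (CF == 0)
    # and likely-soiled panels (more than 7 days since a 0.2-inch rain day).
    equipment = system[(system["cf_interval"] > 0) &
                       (system["days_since_rain"] <= 7)]
    return system, equipment
```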

  22. Data By Incentive Type and Age • Majority of CSI data is for systems less than two years old • 3 years is often held as the recommended bare minimum of data for degradation studies

  23. Monthly Mean Capacity Factors During High Sun • Only metered systems with 12 months or more of data • Age based on time from estimated date of operation

  24. Regression Model • Due to the presence of serial correlation, our analysis used an autoregressive error model (the AUTOREG procedure in SAS software) to correct for the correlated errors • Dependent variable is monthly capacity factor • Independent variables are the parameters of interest or expected influence • CFmonthly = a1 + a2·X2 + … + a38·X38, where the Xi are the independent variables
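
The model itself was fit with SAS AUTOREG; as a rough open-source analogue, the sketch below uses statsmodels' GLSAR with an assumed AR(1) error structure and a small, hypothetical subset of the 38 regression terms. It is not the evaluators' code.

```python
# Rough analogue of an autoregressive-error regression (SAS AUTOREG) using
# statsmodels GLSAR. Column names and the AR(1) error structure are
# illustrative assumptions; the actual model had 38 parameters.
import pandas as pd
import statsmodels.api as sm
from statsmodels.regression.linear_model import GLSAR
from statsmodels.stats.stattools import durbin_watson

df = pd.read_csv("monthly_cf.csv")      # hypothetical prepared dataset
y = df["cf_monthly"]
X = sm.add_constant(df[["poa_monthly", "temp_monthly", "age_months",
                        "pbi", "tpo", "thin_film"]])  # subset of the terms

ols = sm.OLS(y, X).fit()
print("Durbin-Watson before AR adjustment:", durbin_watson(ols.resid))

ar1 = GLSAR(y, X, rho=1)                # AR(1) error structure
res = ar1.iterative_fit(maxiter=10)     # alternate rho and coefficient fits
print("Durbin-Watson after AR adjustment:", durbin_watson(res.resid))
print(res.params)
```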

  25. Regression Parameters

  26. Parameter Estimates from System-Level Regression (these are monthly) • Before autocorrelation adjustment, Durbin-Watson = 0.56 • After autocorrelation adjustment • Durbin-Watson = 1.996 • R2 = 0.65 • 'Base' is SGIP, Host Owned, Polycrystalline • PBI and TPO have positive effects

  27. Calculating Annual Change from Parameter Estimates • Model has a mix of incentive types, ownership types, and module technologies • Annual rates of change for each grouping are a mix of incentive, ownership, and module technology age effects. For example: • EPBB participants that are host owned with thin-film technology: • DegRate EPBB-Host-ThinFilm = 0.000637 * (EPBB = 1) + -0.001443 * (ThinFilmPVTech = 1) Or: DegRate EPBB-Host-ThinFilm = a33 + a28 • EPBB Host degradation rates are a mix of different technologies' age effects, so the EPBB Host rate is weighted by the proportion of each module type
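
In code, the slide's example reduces to summing the relevant coefficients and then averaging over the module-technology mix; the coefficients below are the ones shown on the slide, while the weighting function is generic and the technology shares are not taken from the report.

```python
# The slide's worked example: the EPBB, host-owned, thin-film rate is the
# sum of the EPBB age coefficient (a33) and the thin-film age coefficient (a28).
a33_epbb_age = 0.000637        # from the slide (monthly)
a28_thinfilm_age = -0.001443   # from the slide (monthly)
deg_epbb_host_thinfilm = a33_epbb_age + a28_thinfilm_age   # = -0.000806

# Mixed groups (e.g. all EPBB host-owned systems) are weighted by the share
# of each module technology in the group; the shares are not given here,
# so this function is only a generic illustration.
def weighted_rate(rate_by_tech: dict, share_by_tech: dict) -> float:
    """Weight per-technology degradation rates by the technology mix."""
    return sum(share_by_tech[t] * rate_by_tech[t] for t in rate_by_tech)
```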

  28. Average Annual System Degradation Rates using Linear Regression • Statistically Significant Differences; • PBI > EPBB > SGIP • TPO has a positive effect • Crystalline modules appear more stable than thin film or hybrid

  29. Data Availability after Filtering for Equipment Degradation • Panels likely to see soiling in late summer/early fall • Including only 'clean' panels: 7 days or less since 0.2 in. of rain in a day • Excluding full outages

  30. Average Annual Equipment Degradation Rates using Linear Regression • EPBB now appears better than PBI (not significantly) • TPO still has a significant positive effect • Hybrid modules appear to perform worse than others using these filters

  31. System Degradation Method Comparison • The two methods do not agree exactly, but their results are not inconsistent

  32. Key Findings • Overall rates for the regression approach vary from • +0.71% for PBI TPO systems (third-party ownership is beneficial for long-term performance) to • -2.12% for SGIP host-owned systems • These are somewhat higher than the rates usually cited as degradation (~0.5%–1% per year), but since they encompass soiling & availability that is to be expected • Extrapolation beyond the time frame of available data could be troublesome • Third-party ownership appears to slow degradation in a statistically significant way for this sample • Incentive results at the system level: • PBI systems showed minimal degradation (~0.03% improvement, in fact) • EPBB systems showed a change of ~-1%/year • SGIP systems showed a change of ~-2%/year • Crystalline modules appear to degrade more slowly over time than thin film or hybrid • May be highly influenced by the first-year non-linear drop for thin films • Future analysis should allow a much finer and more accurate assessment, and care should be taken when extrapolating beyond the period of available data. Additionally, future years should allow the EPBB sample to be drawn from a truly random sample.

  33. CSI 2010 Impact Evaluation Addendum – Interval Billing Data

  34. Interval Data Analysis • Hourly profile for a bill with net exports:

  35. Interval Data Analysis, cont. • Hourly profile for a bill without net export:

  36. Value of Interval Data • On their own, interval data can show: • Likelihood of export • Grid usage • Amount of PV exported • Combined with generation data, interval data can also show: • PV usage • Share of PV exported • PV as share of total usage
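
A minimal pandas sketch of these quantities, assuming hourly interval columns for grid import, grid export, and (where available) metered PV generation. The column names and the simple net-metering arithmetic are assumptions for illustration, not the evaluation's code.

```python
import pandas as pd

def interval_metrics(df: pd.DataFrame) -> pd.Series:
    """Summarize hourly interval-billing data.
    Hypothetical columns, all in kWh per interval:
      'grid_import' - energy drawn from the grid
      'grid_export' - energy exported to the grid
      'pv_gen'      - metered PV generation (if available)."""
    out = pd.Series(dtype=float)
    # From interval data alone
    out["export_likelihood"] = (df["grid_export"] > 0).mean()
    out["grid_usage_kwh"] = df["grid_import"].sum()
    out["pv_export_kwh"] = df["grid_export"].sum()
    # With PV generation data as well
    if "pv_gen" in df:
        pv_used_onsite = df["pv_gen"] - df["grid_export"]
        total_usage = df["grid_import"] + pv_used_onsite
        out["pv_used_onsite_kwh"] = pv_used_onsite.sum()
        out["share_pv_exported"] = df["grid_export"].sum() / df["pv_gen"].sum()
        out["pv_share_of_usage"] = df["pv_gen"].sum() / total_usage.sum()
    return out
```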

  37. Interval Data Sites • Interval and PV Generation Data • Interval Data Only

  38. Residential Export Probability

  39. Commercial Export Probability

  40. Government/Non-Profit Export Probability

  41. SCE Residential Export Probability by Export Group

  42. CCSE Residential Export Probability by Export Group

  43. SCE Commercial Export Probability by Export Group

  44. CCSE Commercial Export Probability by Export Group

  45. Comparison of Residential Average Hourly Grid kWh

  46. Comparison of Commercial Average Hourly Grid kWh

  47. Comparison of Gov./Non-Profit Average Hourly Grid kWh

  48. Residential Export Magnitude

  49. Commercial Export Magnitude

  50. Government/Non-Profit Export Magnitude
