
MCP: Pitfalls & Common Mistakes


Presentation Transcript


  1. MCP: Pitfalls & Common Mistakes
   Dr Jeremy Bass (Senior Technical Manager) & RES Wind Analysis Teams (UK & USA)
   AWEA Wind Resource & Project Assessment Workshop, 30 September – 1 October 2009, Minneapolis, MN, USA

  2. OVERVIEW – What Do You Need for MCP?

  4. 1. TARGET SITE DATA: PITFALLS & COMMON PROBLEMS – HARDWARE
   Need to avoid instrumentation issues, including:
   • poor mast installation
   • poor mounting of instruments (IEC; stub mounting)
   • poor choice of instruments (anemometers, vanes etc)
   • poor choice of data logger and/or configuration
   • insufficient power!
   • poor/lack of maintenance & record keeping

  6. 1. TARGET SITE DATA: PITFALLS & COMMON PROBLEMS – QUALITY CONTROL - 1

  7. 1. TARGET SITE DATA: PITFALLS & COMMON PROBLEMS – QUALITY CONTROL - 2

  13. 1. TARGET SITE DATA: PITFALLS & COMMON PROBLEMS – UNDERSTANDING (HARD)

  17. 2. REFERENCE SITE DATA: PITFALLS & COMMON PROBLEMS – BACKGROUND
   • the fundamental principle of MCP is that site climatology, over a 20 – 25 year project life, is stationary, i.e. its statistics are consistent over time
   • reference site data must be consistent with this requirement – essential!
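
As a rough illustration of the stationarity requirement above, the sketch below (not from the presentation; the file and column names are hypothetical) compares annual mean wind speeds at a candidate reference site – a drifting or stepped annual mean is a warning sign of instrument or exposure changes.

```python
# Minimal sketch (assumed file/column names): flag possible non-stationarity
# in a candidate reference series by inspecting its annual mean wind speeds.
import pandas as pd

ref = pd.read_csv("reference_site.csv", parse_dates=["timestamp"], index_col="timestamp")

annual_mean = ref["speed"].groupby(ref.index.year).mean()   # one mean per calendar year
drift = annual_mean.max() - annual_mean.min()               # spread across years

print(annual_mean.round(2))
if drift > 0.05 * annual_mean.mean():                       # arbitrary 5 % screening threshold
    print("Annual means vary by more than 5 % - check for instrument or exposure changes.")
```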

  20. 2. REFERENCE SITE DATA: PITFALLS & COMMON PROBLEMS – FIXED MASTS
   Failure to:
   • examine site photos/visit the site
   • inspect site records
   • choose a site which reflects ‘regional’ winds
   • choose a site with similar climatology to the target site
   • choose a site with a good long-term mean
   • choose a site with a long enough concurrent period available
   • choose a site with a long enough historic period available
   The last requirement can create problems…

  21. 2. REFERENCE SITE DATA: PITFALLS & COMMON PROBLEMS – FIXED MASTS

  23. 2. REFERENCE SITE DATA: PITFALLS & COMMON PROBLEMS – FUTURE PROBLEM?
   However:
   • in the US, ASOS masts have recently been re-equipped with sonic anemometers
   • in the UK, the UKMO has installed consistent instrumentation at all stations
   The problem:
   • instrument changes may destroy continuity
   • can result in a limited number of reference sites being suitable
   • very sparse networks of low-quality meteorological stations in many areas
   Outcome: often few or no suitable reference sites are available!

  25. 2. REFERENCE SITE DATA: PITFALLS & COMMON PROBLEMS – MESO-SCALE DATA
   As an alternative, ‘virtual’ mast data may be appropriate:
   • NCEP/NCAR Re-Analysis 2 data (2.5 deg resolution; 6 hour time base)
   • meso-scale data
   • WorldWind Atlas
   • others…
   Don’t forget that such data:
   • must fulfill the same requirements as fixed mast data!
   • should be used with caution!
   • are a last resort option!!

  26. 3. MCP – PRE-PROCESSING 1
   Get to know your data:
   • create time series plots of target and reference site data
   • are the time series in phase?
   • do the time series display the same gross trends?
   In practice the process of identifying a good reference site is iterative!
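
A minimal sketch of this ‘get to know your data’ step, assuming hypothetical file and column names: overlay monthly means of the concurrent target and reference series to check that they are in phase and share the same gross trends.

```python
# Minimal sketch (assumed file/column names, not from the presentation):
# overlay monthly means of the target and reference series over their
# concurrent period to check phase and gross trends.
import pandas as pd
import matplotlib.pyplot as plt

target = pd.read_csv("target_mast.csv", parse_dates=["timestamp"], index_col="timestamp")
reference = pd.read_csv("reference_site.csv", parse_dates=["timestamp"], index_col="timestamp")

both = pd.DataFrame({
    "target": target["speed"],
    "reference": reference["speed"],
}).dropna()                                   # keep the concurrent period only

monthly = both.resample("MS").mean()          # monthly means smooth out short-term noise
monthly.plot(ylabel="mean wind speed (m/s)", title="Target vs reference: monthly means")
plt.show()
```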

  30. 3. MCP – PRE-PROCESSING 2
   • create scatter plots of target and reference site data
   • may give insight into the choice of a suitable MCP algorithm BUT...
   • scatter plots are often misleading and need a 3rd dimension (example)
   • generally need to ensure that the correlation coefficient, r, is ≥ 0.7
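
To make the r ≥ 0.7 screen concrete, here is a small sketch (file and column names assumed) that pre-averages the concurrent data to daily means and computes the Pearson correlation coefficient.

```python
# Minimal, self-contained sketch (assumed file/column names): compute the
# Pearson correlation coefficient r between concurrent target and reference
# wind speeds after pre-averaging to daily means, then apply the r >= 0.7 screen.
import pandas as pd

target = pd.read_csv("target_mast.csv", parse_dates=["timestamp"], index_col="timestamp")
reference = pd.read_csv("reference_site.csv", parse_dates=["timestamp"], index_col="timestamp")

daily = pd.DataFrame({
    "target": target["speed"],
    "reference": reference["speed"],
}).dropna().resample("D").mean().dropna()

r = daily["target"].corr(daily["reference"])   # Pearson r by default
print(f"correlation coefficient r = {r:.2f}")
if r < 0.7:
    print("r is below the usual 0.7 screening level - reconsider this reference site.")
```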

  32. 3. MCP – CHOICE OF ALGORITHM - 1
   Assuming we have in-phase, suitably pre-averaged, QC’d data, what MCP algorithm should we choose?
   • several classes of algorithm:
     • linear models (y = mx + c)
     • non-linear models
     • JPD-type models
     • neural network models
   • within each class, several choices available
   • all have strengths and weaknesses!
   The choice might depend on what you want to use the MCP results for!
   From Paul van Lieshout’s ‘Wind resource analysis based on the properties of wind or “SKM Weibull’s correlation methodology evaluated”’ paper at the All-Energy 2009 conference
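
For the simplest class listed above, a linear model y = mx + c, a least-squares fit can be sketched as follows. This illustrates the class only, not the presenter’s production method; the synthetic data are invented.

```python
# Minimal sketch of the simplest algorithm class: a linear model y = m*x + c
# fitted by least squares (target speed y vs reference speed x). Illustration
# only - not the presenter's (RES) production method.
import numpy as np

def fit_linear_mcp(ref_speed: np.ndarray, target_speed: np.ndarray):
    """Return slope m and offset c of the least-squares fit y = m*x + c."""
    m, c = np.polyfit(ref_speed, target_speed, deg=1)
    return m, c

def predict_target(ref_speed: np.ndarray, m: float, c: float) -> np.ndarray:
    """Apply the fitted relationship to (historic) reference speeds."""
    return np.clip(m * ref_speed + c, 0.0, None)   # wind speed cannot be negative

# Example with synthetic concurrent data:
rng = np.random.default_rng(0)
x = rng.weibull(2.0, 1000) * 8.0                   # reference speeds
y = 0.9 * x + 0.5 + rng.normal(0, 0.8, x.size)     # target speeds with scatter
m, c = fit_linear_mcp(x, y)
print(f"y = {m:.2f} x + {c:.2f}")
```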

  36. 3. MCP – CHOICE OF ALGORITHM – 2 • JPD methods, e.g. RES matrix method
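
The slides do not describe the RES matrix method itself, so the sketch below illustrates only the general JPD idea: bin concurrent reference and target speeds into a joint matrix and map historic reference speeds to the target via the conditional mean of each reference-speed bin (a full JPD method would preserve the whole conditional distribution rather than just its mean).

```python
# Rough illustration of the general joint-probability-distribution (JPD) idea,
# NOT the RES matrix method: bin concurrent reference/target speeds, then map
# historic reference speeds via the conditional mean of each reference bin.
import numpy as np

def build_jpd_map(ref: np.ndarray, tgt: np.ndarray, bin_width: float = 1.0):
    """Return bin edges and, per reference bin, the mean concurrent target speed."""
    edges = np.arange(0.0, max(ref.max(), tgt.max()) + bin_width, bin_width)
    idx = np.digitize(ref, edges) - 1
    cond_mean = np.full(len(edges) - 1, np.nan)
    for b in range(len(edges) - 1):
        in_bin = idx == b
        if in_bin.any():
            cond_mean[b] = tgt[in_bin].mean()
    return edges, cond_mean

def apply_jpd_map(historic_ref: np.ndarray, edges: np.ndarray, cond_mean: np.ndarray):
    """Map historic reference speeds to target-site estimates (NaN for unseen bins)."""
    idx = np.clip(np.digitize(historic_ref, edges) - 1, 0, len(cond_mean) - 1)
    return cond_mean[idx]
```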

  40. 3. MCP – DATA SUB-CATEGORISATION
   • typically data are decomposed into sub-categories prior to applying MCP
   • a typical sub-category is wind direction
   • 12 (or 16) directional sectors are common
   • not always a good choice – inspection of the wind rose may inform this
   • inspection of the diurnal trend may also help inform this choice (hourly)
   • if there is a pronounced trend, consider a number of ‘time of day’ sectors
   • trying to capture periods with similar atmospheric stability
   • see Andy Oliver & Kris Zarling’s paper at AWEA 2009!
   Regardless of sub-categorisation, you need enough data to populate all sectors
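
A minimal sketch of direction-sector sub-categorisation, assuming hypothetical column names: assign each record to one of 12 sectors of 30°, check that each sector is adequately populated, and fit a separate relationship per sector (a linear fit is used purely as a placeholder).

```python
# Minimal sketch (assumed column names): bin records into 12 direction sectors
# of 30 degrees centred on north, then fit one relationship per sector.
import numpy as np
import pandas as pd

def direction_sector(direction_deg: pd.Series, n_sectors: int = 12) -> pd.Series:
    """Sector 0 is centred on 0 deg (north); sectors increase clockwise."""
    width = 360.0 / n_sectors
    return (((direction_deg + width / 2.0) % 360.0) // width).astype(int)

data = pd.read_csv("concurrent_data.csv")     # hypothetical columns: ref_speed, tgt_speed, ref_dir
data["sector"] = direction_sector(data["ref_dir"], n_sectors=12)

for sector, grp in data.groupby("sector"):
    if len(grp) < 100:                        # arbitrary minimum to populate the sector
        print(f"sector {sector}: too few records ({len(grp)})")
        continue
    m, c = np.polyfit(grp["ref_speed"], grp["tgt_speed"], deg=1)
    print(f"sector {sector}: y = {m:.2f} x + {c:.2f} ({len(grp)} records)")
```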

  41. 3. MCP – PREDICTION APPROACH & UNCERTAINTY ANALYSIS
   [Diagram: timeline of target site and reference site data against time. The relationship (MCP) built over the concurrent period links the target site measurements (AAE) to the reference site; applying it to the historic reference measurements gives a Historic Estimate (HE), from which the Long Term Estimate (LTE) is derived.]
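
The prediction step in the diagram can be sketched as follows, under assumed names and with a linear relationship standing in for whichever algorithm was chosen: apply the concurrent-period fit to the full historic reference record and take the long-term mean of the resulting historic estimate.

```python
# Minimal sketch of the prediction step implied by the diagram above (assumed
# file/column names; a linear relationship stands in for the chosen algorithm):
# apply the concurrent-period fit to the historic reference record and take the
# long-term mean of the resulting historic estimate.
import pandas as pd

historic_ref = pd.read_csv("reference_site.csv", parse_dates=["timestamp"],
                           index_col="timestamp")["speed"]

m, c = 0.9, 0.5                                  # from the concurrent-period fit
historic_estimate = (m * historic_ref + c).clip(lower=0.0)

long_term_mean = historic_estimate.mean()
print(f"long-term mean wind speed estimate at target: {long_term_mean:.2f} m/s")
```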

  44. 3. MCP – SECOND STEP PREDICTIONS

  45. 3. MCP – SECOND STEP PREDICTION VS GAP FILLING

  46. 3. MCP – CONCURRENT PERIOD & SEASONALITY
   For most sites, the pattern of normal seasonal variation means that the precise choice of concurrent period will affect the prediction:
   • to avoid this, use only full integer periods (whole years) of data
   • not always practical!
   • if you have less than a year of data, try to avoid extremes (e.g. summer/winter)
   • can generally identify from the data whether this is significant
   • probably more significant for sites with thermally, rather than pressure, driven winds?
   A seasonally-corrected estimate is more stable and shows less spatial variability – the method shows promise!
   High variability initially, starting to converge after 24 months; in the first year, long-term predictions can be in error by ± 5 – 10 %!
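
A minimal sketch of trimming to whole years so that seasonal bias does not leak into the fit (file and column names assumed):

```python
# Minimal sketch (assumed names): trim a concurrent data set to whole 12-month
# blocks so that seasonal bias does not leak into the MCP fit.
import pandas as pd

concurrent = pd.read_csv("concurrent_data.csv", parse_dates=["timestamp"],
                         index_col="timestamp").dropna()

start = concurrent.index.min()
n_whole_years = (concurrent.index.max() - start).days // 365
if n_whole_years == 0:
    print("Less than one full year of concurrent data - expect +/- 5-10 % error in the long-term estimate.")
else:
    end = start + pd.DateOffset(years=n_whole_years)
    whole_years = concurrent.loc[start:end]
    print(f"Using {n_whole_years} whole year(s): {start.date()} to {end.date()}")
```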

  48. 4. MCP – IDEAL PREDICTION STRATEGY
   This might feature:
   • a rigorous appreciation of errors/uncertainty – crucial!
   • the use of ‘portfolio’ MCP predictions
   • predictions based on multiple reference sites, real and virtual
   • consideration of whether predictions are consistent with expectations
   • ‘Round Table’ discussions amongst colleagues
   • some iteration is inevitable!
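
The slides do not say how ‘portfolio’ predictions should be combined; as one possible heuristic (an assumption, not the presenter’s method), the sketch below weights each reference site’s long-term estimate by the square of its concurrent-period correlation r, and reports the spread between references as a crude consistency check. All numbers are invented.

```python
# Illustrative sketch only - combining rule and numbers are assumptions, not
# taken from the presentation. Each reference site's long-term estimate is
# weighted by r**2; the spread is a crude indicator of consistency.
predictions = {                       # reference name: (long-term estimate m/s, r)
    "airport_ASOS": (7.1, 0.82),
    "nearby_tall_mast": (7.4, 0.90),
    "reanalysis_node": (6.9, 0.75),
}

weights = {name: r ** 2 for name, (_, r) in predictions.items()}
total = sum(weights.values())
portfolio = sum(w * predictions[name][0] for name, w in weights.items()) / total

spread = max(v for v, _ in predictions.values()) - min(v for v, _ in predictions.values())
print(f"portfolio estimate: {portfolio:.2f} m/s (spread across references: {spread:.2f} m/s)")
```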

  49. CONCLUSIONS
   • if approached with diligence and care, MCP can be a vital tool
   • it requires attention to detail at every stage of the site assessment process, not just in MCP model building (a tiny part of the overall effort!)
   • you need to understand how to obtain:
     • high quality (target) site data
     • high quality, appropriately chosen, reference site data
   • you need to understand the application and limitations of MCP software
   • you need the skills, knowledge & experience to use it & interpret the results
   • experiments suggest that MCP success is far more to do with the choice of a high quality reference site than with the choice of MCP algorithm!
