
GEOSS Evaluation Task Analysis




Presentation Transcript


  1. GEOSS Evaluation Task Analysis John Adamec GEOSS ET Meeting, Geneva, Switzerland March 22-25, 2010

  2. The high-level analysis of all Overarching Tasks: Task/Target Matching

  3. Task/Target Matching
  • What we have:
    • Lars’ analysis of 10 of 14 strategic target areas
    • MLQ’s analysis of 3 strategic target areas
    • Coverage of 11 of 14 areas
    • Two somewhat different approaches
  • Previous discussion concluded that this is not satisfactory
  • No actionable suggestions for improvement

  4. Task/Target Matching
  • Investigated an alternative method: automated analysis
    • A text comparison tool scores the similarity of two documents
    • Produces a quantified index of overlap
    • Results are comparable between Transverse and Strategic Target Areas
  • Unsatisfactory
    • The tools we identified only discover word-for-word matches, with limited accounting for synonyms
    • No accounting for context or meaning
  • Could be viable if a team member has access to and experience with natural language analysis software (http://plagiarism.phys.virginia.edu/Wsoftware.html)
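To make the limitation concrete, here is a minimal sketch of the kind of word-overlap scoring such tools perform. This is an illustrative stand-in (not the actual software the team evaluated): it scores two documents by the cosine similarity of their word-frequency vectors, so only exact word matches contribute, and synonyms or paraphrases score zero, which is precisely the shortcoming noted above. The example task/target strings are hypothetical.

```python
# Illustrative sketch of word-for-word overlap scoring between two
# documents, as described on the slide. Not the actual tool used.
import math
import re
from collections import Counter

def tokenize(text):
    """Lowercase a document and split it into word tokens."""
    return re.findall(r"[a-z']+", text.lower())

def overlap_index(doc_a, doc_b):
    """Quantified index of overlap on a 0..1 scale, computed as the
    cosine similarity of word-frequency vectors. Only exact word
    matches count: 'forecast' vs 'prediction' contributes nothing,
    and context or meaning is never considered."""
    a, b = Counter(tokenize(doc_a)), Counter(tokenize(doc_b))
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# Hypothetical task description vs. strategic-target text:
task = "global interactive forecast system for weather"
target = "improved weather forecast information for all regions"
print(round(overlap_index(task, target), 2))
```

Because the index depends only on shared surface tokens, two texts about the same topic in different vocabulary score near zero, which is why the team judged this approach unsatisfactory without proper natural language analysis software.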

  5. Task/Target Matching
  • Proposal:
    • Lars completes his initial analysis.
    • If desired, 1 or 2 more team members replicate the analysis.
    • The replicators and Lars produce a set of consensus sheets.
    • Findings are summarized in written form (answering the Evaluation Framework questions) and incorporated into the report.

  6. Task/Target Matching Questions? Discussion? Suggestions? Agreement?

  7. Proposal for selection and analysis of a subset of GEOSS implementation activities: In-Depth Task Analysis

  8. Task Selection
  • Need: a list of tasks for in-depth analysis
    • Identified before conducting secretariat interviews
    • Covers all 14 SBAs and Transverse Areas
    • Representative of the breadth of GEOSS implementation
    • Yields a reasonable analysis workload for the team
  • Based on a variety of inputs: Lars’ and MLQ’s partial reviews, progress reports, the Cape Town priorities, and the diversity of GEOSS activity

  9. Task Selection
  • Proposal:
    • Select 14 tasks (1 from each target area) based on the partial analysis.
    • Narrow that list to 1 sub-task from each of the tasks.
    • Focus on the “primary” component of the task or an otherwise specifically identified sub-task.
  This maintains coverage across all of the target areas but cuts out the variation in size of Overarching Tasks and yields a more tractable workload.

  10. Task Selection
  Architecture: AR-09-04a GEONETCast
  This task is selected for in-depth analysis because it is global in scope and ties to multiple points in the Cape Town priorities. The GEO Portal/GCI activity was not chosen because we are aware of ongoing evaluations of that activity by the implementers, and because that activity indirectly encompasses the entirety of GEOSS and is perhaps better suited to a targeted in-depth evaluation of the Architecture Transverse Area.
  Data Management: DA-06-01 GEOSS Data Sharing Principles
  This task was identified by Michel L as a task for in-depth evaluation. The data sharing principles are one of the few tasks specifically mentioned in the Cape Town priorities and therefore a definite inclusion in our analysis. The principles, being a concept rather than a digital/physical infrastructure, are more conducive to the scope and capabilities of this review.
  Capacity Building: CB-09-05a Open Source Software
  This is a sub-task of the infrastructure capacity-building task. It is selected because it is directed toward improving end-user capacity to utilize data; it is also led from outside the US and Europe.

  11. Task Selection
  Science and Technology: ST-09-02 Promoting Awareness and Benefits of GEO in the Science and Technology Community
  Recommended by Lars for in-depth evaluation, this task has no sub-tasks. Based on initial feedback from the people we have spoken with, the title of this task indicates it is an important issue to be addressed.
  User Engagement: US-09-01a Identifying Synergies between Societal Benefit Areas
  This sub-task is recommended because it is a specific effort to formulate the core Earth Observation needs for a working GEOSS.
  Disasters: DI-09-03a Tsunami Early Warning System of Systems
  This sub-task was identified in Lars’ analysis as lacking activity. Additionally, there was great interest in it at the time of the formalization of GEOSS in 2005, and it remains a current topic.

  12. Task Selection
  Health: HE-09-01 Information Systems for Health
  This task has no sub-tasks and was identified in Lars’ analysis as a candidate for in-depth evaluation. It was chosen because establishing Earth Observation data needs and opportunities underlies progress in achieving health outcomes through GEOSS.
  Energy: EN-07-01 Management of Energy Sources
  There are no sub-tasks under Energy. All three tasks were recommended for in-depth evaluation by Lars’ analysis; this one was selected because it is relevant to multiple outcomes.
  Climate: CL-09-03a Integrated Global Carbon Observation (IGCO)
  This task was identified by Michel L as a candidate for in-depth analysis. It is the primary component of the overarching task.

  13. Task Selection
  Water: WA-06-07c Asia (Capacity Building for Water Resource Management)
  This specific sub-task is not overseen by a unifying “primary” component in the Water SBA; however, of the 3 regional sub-tasks, this one was identified by Lars as lacking in reported activity.
  Weather: WE-06-03 TIGGE and the Development of a Global Interactive Forecast System for Weather
  This task is chosen because, while both weather activities received equal ratings in Lars’ analysis, this one has only a single component (no sub-tasks) and a more concrete objective than the other weather task.

  14. Task Selection
  Ecosystems: EC-09-02a Impact of Tourism on Environmental and Socio-Economic Activities
  The sub-tasks under 09-02 are greatly varied, and Lars’ analysis showed a lack of connections to target outcomes. This sub-task is proposed simply because of its “a” status; if another team member has a reason (such as expertise) to prefer a different sub-task, we are open to that suggestion.
  Agriculture: AG-07-03a Global Agricultural Monitoring System
  This is chosen because it is the primary “a” component of AG-07-03, which Lars’ analysis indicated had high relevance to Cape Town.
  Biodiversity: BI-07-01a Biodiversity Observation Network
  There is a single Overarching Task for biodiversity. GEO-BON is the primary component of this task, and the other sub-tasks are clearly components of a comprehensive BON.

  15. Task Selection Questions? Discussion? Suggestions? Agreement?

  16. In-Depth Analysis
  Proposed assignment: 2 analyses per member
  • Develop a “highlight”: a short (less than 2 pages) write-up of the activity. Tell the “story” of the task: what happened? What has worked well? What hasn’t? What is needed to reach the 2015 goals?
  • These will appear in an appendix or as “boxes” in the report.
  • Develop a set of conclusions or generalized findings based on each task, to be synthesized in the report text.

  17. In-Depth Analysis
  Sources: you are encouraged to use any of the following to develop the analysis
  • Data from secretariat interview questions about the sub-tasks
  • Task sheets and progress reports
  • Communication with task leads and participants

  18. In-Depth Analysis
  • Additional notes:
    • We do not want to conduct multiple interviews with each secretariat member, so please ask about all selected sub-tasks in their areas even if you are not personally conducting the in-depth analysis of a given sub-task.
    • Old task sheets may have a different task number associated with the sub-task.
    • We are not judging the individual tasks; we are looking at them to help understand how success is achieved in the GEOSS framework.

  19. In-Depth Analysis Questions? Discussion? Suggestions? Agreement?
