
MPARWG Business & Disposition of Action Items from MPARWG October 2009. H. K. (Rama) Ramapriyan, NASA/GSFC. Metrics Planning and Reporting (MPAR) WG, 9th Earth Science Data Systems Working Group, New Orleans, LA, October 2010.

Presentation Transcript


  1. MPARWG Business & Disposition of Action Items from MPARWG October 2009. H. K. (Rama) Ramapriyan, NASA/GSFC. Metrics Planning and Reporting (MPAR) WG, 9th Earth Science Data Systems Working Group, New Orleans, LA, October 2010

  2. MPARWG Business Meeting
  • Disposition of Last Year’s Action Items
  • Summary of FY 2010 Metrics Reporting – by Projects and to NASA HQ
  • ACCESS Project Metrics and Reporting to NASA HQ (Steve Berrick)
  • Service Metrics Case (Robert Wolfe and Chris Doescher)
  • EMS Overview and Update (Lalit Wanchoo & Kevin Murphy)
  • Implementation of Citation Metrics
  • Action Items from this meeting

  3. MPAR Working Group – Disposition of 2009 Action Items
  • Action 2009-1: Greg Hunolt to develop a table of program-level aggregate statistics for Rama to review and post.
    • Aggregate statistics were included in the MPARWG chapter of the EDSWG 2009 Year-End Report.
    • Beginning in May 2010, aggregate metrics for the MEaSUREs program have been reported to Martha Maiden on a monthly basis.
  • Action 2009-2: Greg Hunolt to review the ACSI survey and suggest possibilities.
    • The intent was to see if the questions on the survey suggested possibilities for additional metrics.
    • There does not seem to be anything to add as program-level metrics, given the MPARWG’s work in progress on citation metrics and product quality metrics and its efforts to improve reporting to NASA HQ, e.g., the new format for impact metrics and the monthly MEaSUREs aggregate and highlight report.
    • The ACSI survey allowed users to enter free-text comments and recommendations, which the DAACs have found useful. A possibility to consider would be for projects to set up a means for their users to do the same.
  • Action 2009-3: Rama to finalize and distribute MEaSUREs telecon minutes.
    • This refers to the telecon held on October 6, 2009. MEaSUREs projects discussed the current metrics baseline and made no recommendations for changes to the existing metrics, but they did recommend that the MPARWG consider adding a citations metric. The telecon results were included in the October 2009 metrics baseline review presentation.

  4. MPAR Working Group – Disposition of 2009 Action Items
  • Action 2009-4: Greg Hunolt to advise projects that they have the option of breaking down the count of products delivered by product type, and illustrate how this can be done using the MCT; likewise for volume by product type and services provided by service type.
    • Email sent to all MEaSUREs projects on November 3, 2009. Greg will continue to follow up with projects.
  • Action 2009-5: Greg Hunolt to provide examples of service types and encourage MEaSUREs projects to make more use of service metrics 11 and 12 (services provided and service types available).
    • Email sent to all MEaSUREs projects on November 3, 2009. Greg will continue to follow up with projects.
  • These two action items raise an old question – how hard does the MPARWG want to push projects to implement voluntary metrics?

  5. BACK-UP

  6. MPAR Mission Statements
  • Mission Statement for the MPARWG
    • Review and recommend program-level performance metrics and collection tools that measure how well each data activity supports the NASA Science Mission Directorate’s Earth science, application, and education programs.
  • Metrics Mission Statement
    • To measure the success of each project in meeting its stated goals and objectives, to show the role and contribution of each project to the NASA science, application, and education programs, and to enable an overall assessment of the success of programs such as REASoN / ACCESS / MEaSUREs and their contribution to NASA’s goals.

  7. MPAR Working Group
  • Membership in WG
    • WG membership is open to the NASA data and service provider community (REASoN, ACCESS, and MEaSUREs projects, DAACs, SIPSs, etc.).
    • MEaSUREs project PIs and/or metrics points of contact are encouraged to join the MPARWG and participate in its activities.
    • We are open to suggestions for participation by others.
  • Scope of Work
    • WG provides ongoing MPAR review, evaluation, recommendations, and metrics evolution for the NASA Earth Science (ES) data and service provider community.
    • WG recommends additions, deletions, or modifications to the set of metrics. Recommendations may be approved, modified, or rejected by NASA HQ. If approved, NASA Science Mission Directorate (SMD) funded ES data and service providers will have to make the recommended changes in their reporting.
    • WG recommends metrics collection and reporting tools, changes / improvements to those tools, etc.

  8. MPAR-WG Recommendations Approved
  Two Recommendations Approved by HQ, Oct 2009:
  • MPARWG Recommendation #5: Citation Count Metrics
    • Objective: to obtain a better measure of user satisfaction with projects’ products and services and enable a better assessment of their contribution to the NASA science and applications programs and Earth science research in general.
    • Implementation to be discussed here (see Breakout Topics).
  • MPARWG Recommendation #6: Quad-Chart Format for Impact Metrics
    • Objective: to provide the impact metric information to NASA Headquarters in a form and format consistent with Headquarters practice, and by doing so to improve the communication by the projects of their significant accomplishments to NASA Headquarters.
    • Implementation to be discussed here (see Breakout Topics).

  9. Players and Roles (1 of 2)
  • MPARWG – Review and recommend program-level performance metrics and collection tools
    • H. K. (Rama) Ramapriyan (GSFC) and Clyde Brown (LaRC / SSAI) – Co-Chairs – set the agenda, conduct business, and gather and forward recommendations to Frank Lindsay for transmission to NASA HQ.
    • Frank Lindsay (GSFC) – Overall coordination of ES-DSWG activities; reporting to HQ.
    • Martha Maiden (HQ) – Review recommendations; coordinate with HQ Program Managers, Program Scientists, and Study Managers; approve, disapprove, or modify recommendations.
    • Steve Berrick (HQ) – Program Manager for ACCESS projects; sets policies and reporting requirements.
    • MPARWG members – Participate (vigorously) in discussions preceding recommendations.
    • Greg Hunolt (SGT) – Support the co-chairs in conducting business (e.g., keep minutes of meetings/telecons, “shepherd” recommendations through the MPARWG voting process, and prepare them for approval).

  10. Players and Roles (2 of 2)
  • Metrics Collection and Reporting:
    • H. K. (Rama) Ramapriyan (GSFC) – Lead – Ensure collection of metrics from all activities (MEaSUREs / REASoN / ACCESS) and development of the summary reports required by HQ.
    • Randy Barth (ADNET) – Maintain the web-based metrics collection tool (MCT) originally developed by UMD.
    • Kevin Murphy (GSFC), Natalie Pressley (SGT) – Support the MEaSUREs transition to metrics reporting via EMS (ESDIS Metrics System).
    • Frank Lindsay (GSFC) – Liaison with HQ for all ES-DSWG activities.
    • Martha Maiden (HQ) – Coordinate with HQ Program Managers, Program Scientists, and Study Managers; set policy, establish reporting requirements, and liaise with OMB to provide overall summary reports of all NASA Earth science data system activities.
    • Steve Berrick (HQ) – ACCESS Program Manager – Set policy for ACCESS projects and establish reporting requirements.
    • MEaSUREs / REASoN / ACCESS Projects – Enter reports using the metrics tools.
    • Greg Hunolt (SGT) – Support Rama in contacting projects, resolving any technical issues, ensuring consistency, generating summary reports, and reporting status.
    • Jody Garner (ADNET) – Maintain the operational web site used to collect metrics; provide technical help with the web site to reporting projects.

  11. MPAR WG – Recommendation Processes
  MPAR WG process to adopt a recommendation:
  • Majority vote of MPAR WG members to adopt the proposed recommendation as a WG draft;
  • One MPAR WG member appointed “shepherd”;
  • 30-day period of NASA ES activity review of the WG draft (not all ES activities are represented by MPAR WG members), coordinated by the shepherd;
  • Shepherd assembles comments, drafts revisions to the recommendation per activity feedback, and presents a summary of the feedback and the draft revisions to the full WG;
  • WG considers the revisions and the need for a ‘beta test’;
  • Majority vote of MPAR WG members to adopt the revised WG draft;
  • Shepherd coordinates the Impact Analysis, Rationale, and Justification;
  • Two-thirds vote of responding MPAR WG members to adopt the final recommendation package and send it to HQ / SMD.

  12. MPARWG Membership as of October 1, 2009
