This report covers the planning and development of product quality metrics, draft project and DAAC checklists, the review of FY2011 citation metrics, and plans for FY2012 reporting. It outlines next steps and future work for the Metrics Planning Group.
Metrics Planning Group (MPG) Report to Plenary
Clyde Brown
ESDSWG
Nov 3, 2011
Product Quality Metrics
• Overall objective: Given that the objective of the MEaSUREs program is to produce and deliver readily usable Earth Science Data Records (ESDRs) of high quality:
  • Define program-level metric(s) that permit assessment of the steps taken and progress made to ensure that high-quality products are provided by MEaSUREs projects and by the MEaSUREs program as a whole.
  • Develop a draft recommendation for Product Quality Metric(s) that would then go through the regular MPARWG review process.
• Recommendation from the October 2010 ESDSWG meeting: develop a checklist of a small number of questions that represent progress in the product quality area.
• We considered product quality to be a combination of scientific quality, the completeness of associated documentation and ancillary information, and the effectiveness of supporting services.
• Responsibility for product quality is shared between the projects generating the ESDRs and the DAACs that eventually archive and distribute them.
Product Quality Metrics
• Completed work on the questions and checklists; reached agreement on a first version to work with.
• Next steps (see the roll-up sketch after this list):
  • Projects and DAACs compile an initial set of checklists; PIs send them to Rama.
  • Rama creates a strawman set of project-level summary roll-ups and an aggregated program-level roll-up, and sends them back to the PIs.
  • Hold a telecon to discuss and modify the summary roll-ups.
  • Draft an MPARWG recommendation for product quality metrics (i.e., the agreed summary roll-ups).
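As a concrete illustration of the roll-up step, the sketch below shows one way project checklist responses could be aggregated into project-level and then program-level summaries. The question IDs, the yes/partial/no response scale, and the counting rule are illustrative assumptions, not the checklist format agreed by the MPG.

    # Hypothetical sketch: roll up checklist responses into project- and
    # program-level summaries. Question IDs, the yes/partial/no response
    # scale, and the counting rule are illustrative assumptions.
    from collections import Counter

    # Each project submits a checklist: question ID -> response.
    project_checklists = {
        "Project-A": {"Q1": "yes", "Q2": "yes", "Q3": "partial", "Q4": "no"},
        "Project-B": {"Q1": "yes", "Q2": "partial", "Q3": "yes", "Q4": "yes"},
    }

    def project_rollup(checklist):
        """Summarize one project's checklist as counts per response category."""
        return Counter(checklist.values())

    def program_rollup(checklists):
        """Aggregate every project roll-up into one program-level summary."""
        total = Counter()
        for checklist in checklists.values():
            total += project_rollup(checklist)
        return total

    for name, checklist in project_checklists.items():
        print(name, dict(project_rollup(checklist)))
    print("Program", dict(program_rollup(project_checklists)))

Under these assumptions the program-level view stays a simple sum of the project summaries, which matches the spirit of the "aggregated program level roll-up" step above.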
Draft Project Checklist
• Per a reviewer recommendation, responses to all questions could include comments and would use a format, such as Word, that facilitates commenting.
Draft DAAC Checklist
• Per a reviewer recommendation, responses to all questions could include comments and would use a format, such as Word, that facilitates commenting.
Citations Metrics
• A change to the baseline adding new Citations Metrics was recommended by the MPARWG in 2009 and approved by NASA HQ in October 2010 for the FY2011 reporting year.
• NASA HQ requested a report on the first year of citations metrics.
• The expectation was that MEaSUREs projects that are REASoN continuations would be in the best position to begin reporting Citations Metrics in 2011.
• By September 30, 14 projects reported on citations:
  • 6 of the 7 MEaSUREs projects that are REASoN continuations reported.
  • 8 new MEaSUREs projects reported.
  • 1,708 citations in peer-reviewed publications were reported (excluding ISCCP), along with 235 citations in non-peer-reviewed publications.
• The goal of this session was to examine the experience and lessons learned from the first-year effort and to chart a course for citations metrics reporting in 2012.
• The report to NASA HQ will reflect the results of the first year of citations metrics reporting and the way forward agreed to here.
Citations Metrics
• Reviewed citation metrics for FY2011 and the methods used by the projects, in order to identify best practices and assess the level of effort.
• Next steps:
  • Develop guidance for projects based on this year's experience and the results of our discussion.
  • Collect Citation Metrics for FY2012 by September 30 to allow for the annual report to NASA HQ (a minimal tally sketch follows this list).
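To make the reporting step concrete, the sketch below tallies hypothetical per-project citation counts into the program-level totals (peer-reviewed versus non-peer-reviewed) that would feed the annual report. The project names and counts are made up for illustration and are not FY2012 results.

    # Hypothetical sketch: compile per-project citation counts into
    # program-level totals for the annual report. Project names and
    # counts are illustrative only.
    fy2012_reports = [
        {"project": "Project-A", "peer_reviewed": 120, "non_peer_reviewed": 15},
        {"project": "Project-B", "peer_reviewed": 64, "non_peer_reviewed": 8},
        {"project": "Project-C", "peer_reviewed": 210, "non_peer_reviewed": 22},
    ]

    total_peer = sum(r["peer_reviewed"] for r in fy2012_reports)
    total_non_peer = sum(r["non_peer_reviewed"] for r in fy2012_reports)

    print(f"Projects reporting: {len(fy2012_reports)}")
    print(f"Peer-reviewed citations: {total_peer}")
    print(f"Non-peer-reviewed citations: {total_non_peer}")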
Future Work
• MPG will continue to function on an ad hoc basis to consider metrics issues as they arise, e.g.:
  • Metrics for distributed services.
  • Ensuring that data accesses by online services are accounted for (e.g., which data granule(s) were accessed to produce a plot); see the logging sketch below.
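One way to make granule-level accounting workable is for each online service to record which granules it reads for every request, so those accesses can later be rolled into distribution metrics. The sketch below is a minimal illustration under that assumption; the service name, granule IDs, and JSON-lines log format are not an agreed design.

    # Hypothetical sketch: record which granules an online service accessed
    # to satisfy a request (e.g., to produce a plot). Service name, granule
    # IDs, and the JSON-lines log format are illustrative assumptions.
    import json
    import time

    def log_granule_access(service, request_id, granule_ids,
                           log_path="access_metrics.jsonl"):
        """Append one record noting which granules a service read for a request."""
        record = {
            "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
            "service": service,
            "request_id": request_id,
            "granules": list(granule_ids),
        }
        with open(log_path, "a") as f:
            f.write(json.dumps(record) + "\n")

    # A plot built from two granules is counted as two granule accesses.
    log_granule_access(
        service="plot-service",
        request_id="req-0001",
        granule_ids=["granule_2011_10_01.nc", "granule_2011_10_02.nc"],
    )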