Metrics Baseline Review

Presentation Transcript


  1. Project Metrics Baseline Review
  H. K. (Rama) Ramapriyan, NASA / GSFC
  Greg Hunolt, SGT, Inc.
  ES-DSWG / MPAR-WG
  October 21, 2009

  2. Metrics Baseline Review
  Purpose:
  • To review the current metrics baseline (the set of metrics currently being collected and reported by MEaSUREs, REASoN, and ACCESS projects);
  • To determine whether changes, additions, or deletions need to be made to the baseline to meet the needs of the MEaSUREs projects;
  • To develop MPAR-WG recommendations for any needed changes.
  History:
  • The original metrics baseline was recommended by the MPAR-WG and approved by HQ for FY2004 for the REASoN projects;
  • Changes to the baseline were recommended by the MPAR-WG and approved by HQ in 2006 for the FY2007 reporting year, based on projects' experience;
  • The principal changes were the addition of voluntary Service Metrics and Project Defined Metrics to provide needed flexibility for REASoN and ACCESS projects, plus the 'Metrics Mission Statement' and clarification of some definitions.

  3. Metrics Baseline Review – Metrics Mission Statement
  To measure the success of each project in meeting its stated goals and objectives, to show the role and contribution of each project to the NASA science, application, and education programs, and to enable an overall assessment of the success of programs such as MEaSUREs / REASoN / ACCESS and their contribution to NASA's goals.
  • We will step through the metrics one by one, and determine if each one should be retained as is, modified, or deleted.
  • As we step through the metrics, points raised during the October 6 telecon will be presented to the MPAR-WG.
  • We will then consider whether any new metric(s) should be added.
  • We will end up with MPAR-WG recommendations as needed.

  4. Overview – Three Types of Metrics
  • Products and Services Metrics: Measure the number and types of products and/or services provided by a project, with data volumes as applicable.
    • Common Metrics: Reported by most if not all projects; these are overall measures with sufficient cross-project commonality to allow assessment of the MEaSUREs (etc.) program as a whole, and will not be used as comparative measures of project performance.
    • Project-Defined Metrics: Projects may add one to four project-defined (project-specific) metrics, defined by each project as the best measures of its performance against its objectives.
    • Reporting: Common Metrics will be reported monthly unless otherwise agreed between a project and its study manager. Project-Defined Metrics will be reported at an interval chosen by the project.
  • Programmatic Metrics: Characterize the role of the project within the NASA science, applications, and/or education programs by indicating the program areas the project supports. After the initial report, update as needed.
  • Impact Metrics: Specific success stories that provide an example (or examples) of how the project's products / services have directly benefited a user, organization, or activity it supports. Reported as opportunity allows.

  5. Introducing the Metrics Baseline Summary
  For each metric, the following is reported:
  • The metric's value.
  • A baseline comment - an explanation of how the metric is determined by the project. Enter a baseline comment once and then it will remain attached to the metric until updated. Projects may change baseline comments each month or let the current baseline comment stand as long as it applies.
  • A supplemental comment - an explanation qualifying or otherwise explaining a project's metric entry. Use of this comment box is optional every month and should be used to enter comments applicable only to that month's value of the metric.
  For each metric, the following will be presented:
  • Metric name, definition, and purpose in collecting this metric;
  • The metric question as it appears on the Metrics Collection Tool;
  • Examples of how the metric is being reported currently, including by three MEaSUREs projects:
    • 336 – ESDR of Small Scale Kinematics of Arctic Ocean Sea Ice
    • 364 – Creating a Unified Airborne Database
    • 370 – Distributed Information Services: Climate/Ocean Products and Visualizations for Earth Research
  • Any concerns raised during the Oct 6 MEaSUREs telecon.

  6. Metrics List
  Products and Services Metrics – Common Metrics:
  • Metric 1 – Distinct Users
  • Metric 2 – Distinct Users by Class
  • Metric 3 – Products Distributed (instances of types available, see Metric 4)
  • Metric 4 – Product Types Available
  • Metric 5 – Volume Distributed
  • Metric 6 – Volume Available
  • Metric 7 – no longer used
  • Metric 11 – Services Provided (instances of service types available)
  • Metric 12 – Services Available
  Project Defined (Project-Specific) Metrics (up to 4, defined by each project; metrics 100, 101, 102, 103)
  Programmatic Metrics:
  • Metric 8 – Science Focus Areas Supported
  • Metric 9 – Applications Areas Supported
  • Metric 10 – Education Categories Supported
  Impact Metrics

  7. Common Metric #1, Distinct Users
  • Purpose: To measure the size of the activity's user community, to be assessed in the context of its NASA program role.
  • Website Question: Please enter the count of individuals who, by any means, request and receive or in some other way use products, services, and/or other information during the reporting period.
  • MEaSUREs Telecon: No concerns raised.

  8. Common Metric #1, Distinct Users, Cont.
  Examples:
  • 336: Value: 23; Baseline Comment: The number of distinct users is the sum of those who downloaded data from the website and those who either called or e-mailed for technical support.
  • 364: Value: 5,936; Baseline Comment: Distinct users are determined based on unique IP address for data retrieved by FTP or HTTP. For requests by phone and e-mail, it is determined using the requestor's name or e-mail address.
  • 370: Value: 10,608; Baseline Comment: The number of unique individual users is calculated from FTP and WWW logs at RSS, and DataPool activity at UAH. The combined total represents unique website visitors as well as users who download data via FTP or the DataPool interface.
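  Projects 364 and 370 derive this count from server logs by counting unique client addresses. Below is a minimal sketch of that approach, assuming NCSA/Apache-style access logs in which each line begins with the client address; the log file names are hypothetical, not taken from any project's actual configuration.

```python
import re

# Hypothetical log file names; a project would substitute its own HTTP/FTP transfer logs.
LOG_FILES = ["http_access.log", "ftp_transfers.log"]

# NCSA/Apache combined-format lines begin with the client host or IP address.
CLIENT_RE = re.compile(r"^(\S+)\s")

def count_distinct_users(paths):
    """Approximate Metric 1 by counting unique client addresses across all logs."""
    clients = set()
    for path in paths:
        with open(path, encoding="utf-8", errors="replace") as log:
            for line in log:
                match = CLIENT_RE.match(line)
                if match:
                    clients.add(match.group(1))
    return len(clients)

if __name__ == "__main__":
    print("Distinct users this reporting period:", count_distinct_users(LOG_FILES))
```

  Phone and e-mail requests, as in project 364's baseline comment, would be tallied separately by requestor name or address and added to the same set before counting.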

  9. Common Metric #2, Distinct Users by Class
  • Purpose: To measure the types of users served by the activity, to be assessed in the context of its NASA program role.
  • Website Question: Please enter the number of users who obtain products and services from your project by the following classes.
  • MEaSUREs Telecon: No concerns raised.

  10. Common Metric #2, Distinct Users by Class, Cont.
  Examples:
  • 336: Counts by category as applicable; Baseline Comment: Based on Metric 1, we determine the characterization of distinct users using log files created by the web server.
  • 364: Counts by category as applicable; Baseline Comment: Characterization of users is determined by resolving the IP address to a domain name to determine whether it is a non-US domain, commercial, etc. For non-Internet requests we determine this by using the address we send data/information to. K-12 users are reported on line 2.13.
  • 370: Counts by category as applicable; Baseline Comment: Characterization of unique individual users is calculated from FTP logs at RSS, and DataPool activity at UAH. The combined totals represent users who download data via FTP or the DataPool interface. Web visitors are not included.
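  Project 364's approach of resolving each client IP to a domain name and inspecting its top-level domain can be sketched with the standard library's reverse-DNS lookup. The class labels below are illustrative only, not the official Metric 2 categories.

```python
import socket

def classify_by_domain(ip_address):
    """Roughly classify a client by the top-level domain of its reverse-DNS name."""
    try:
        hostname, _aliases, _addresses = socket.gethostbyaddr(ip_address)
    except (socket.herror, socket.gaierror):
        return "unresolved"
    tld = hostname.rsplit(".", 1)[-1].lower()
    if tld == "gov":
        return "US government"
    if tld == "edu":
        return "US education"
    if len(tld) == 2 and tld != "us":
        return "non-US"
    if tld in ("com", "net", "org"):
        return "commercial / other"
    return "other"

# Example: tally classes over the distinct client addresses collected for Metric 1.
# from collections import Counter
# class_counts = Counter(classify_by_domain(ip) for ip in clients)
```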

  11. Common Metric #3, Products Delivered
  • Purpose: To measure, in conjunction with items 4, 5, and 6, the data and information produced and distributed by the activity, to be assessed in the context of its NASA program role. A particular set of values for these metrics might be much smaller for one activity than another, but in each case could represent excellent performance, given the particular role of each activity. The count of products delivered is a useful measure given the user-oriented definition of a 'product' that is independent of how the product is constituted or how large it is.
  • Website Question: Please enter the number of products provided to users during the reporting period.
  • MEaSUREs Telecon: Should the number of products be broken down by type (Metric 4)?

  12. Common Metric #3, Products Delivered, Cont.
  MEaSUREs Telecon Discussion:
  • General agreement that the number of products delivered (and instances of services provided) was a very useful measure of a project's work – better than volume.
  • Capturing products delivered by product type would provide a better view of a project's work than an overall count.
    • Would be most practical if the number of product types is small.
    • Projects would have and use this level of information internally, and so would have it available to provide.
    • Would this level of information be useful to users of these metrics?
  • Option: Recommend adding 'broken down by product type (see Metric 4)' to the definition.
  • Note: This level of detail will be available from the EMS once (most) MEaSUREs projects have transitioned archive and distribution to a DAAC.

  13. Common Metric #3, Products Delivered, Cont.
  Examples:
  • 336: Value: 3,419; Baseline Comment: We calculate the number of delivered products using the log files created by the website server.
  • 364: Value: 386,542; Baseline Comment: A product is a prepared or on-demand file, DVD, CD, image, etc. that represents part of a dataset associated with an individual field campaign.
  • 370: Value: 409,108; Baseline Comment: The number of products delivered is calculated from FTP and WWW logs at RSS, and DataPool activity at UAH. The combined total represents the number of files downloaded via FTP or the DataPool interface, as well as page views of web visitors.

  14. Common Metric #4, Product Types
  • Purpose: The count of product types produced is a useful measure because of the effort required by the activity to develop and support each of its product types. A project's values for this metric are to be assessed in the context of its NASA program role.
  • Website Question: Please enter the number of product types made available to users during the reporting period.
  • MEaSUREs Telecon: No concerns raised.

  15. Common Metric #4, Product Types, Cont.
  Examples:
  • 336: Value: 13; Baseline Comment: We count the number of products available on the website: 1. Lagrangian Ice Motion product, 2. Backscatter Histogram product, 3. Ice Age and Thickness product, 4. Deformation product, 5. Ice Age, 6. Ice Thickness, 7. Backscatter Histogram, 8. Divergence, 9. Vorticity, 10. Shear, 11. Sheba Ice Motion product, 12. Melt Onset product, 13. Canadian Arctic Shelf Exchange Study (CASES 2003-2004) product.
  • 364: Value: 19; Baseline Comment: A product type is an atmospheric chemistry dataset associated with an individual field campaign.
  • 370: Value: 82; Baseline Comment: At RSS, products include 5 geophysical parameters (SST, wind, vapor, cloud, rain) at 4 timeframes (daily, 3-day, weekly, monthly), as well as 1 SST OI and 1 SST swath data set. At UAH, 60 product types may be discovered by browsing the DataPool interface.

  16. Common Metric #5, Volume of Data Distributed
  • Purpose: The volume distributed is a useful output measure, but one which depends heavily on the particular types of data an activity produces and distributes, and it must be assessed in the context of the activity's role and the data it works with. See the note in Metric 3.
  • Website Question: Please enter the volume of data and/or information provided as web downloads, on hard media, or otherwise distributed to users during the reporting period (in MB, GB, or TB as appropriate, to three significant digits of precision, e.g. "10.2 GB" as opposed to "10,186 MB"). If data are available but none were distributed, enter "0" in the entry box and explain in the comment box. If this project has none available, enter "na" in the entry box and 'not applicable' in the comment box.
  • MEaSUREs Telecon: Questioned the usefulness and meaningfulness of volume.

  17. Common Metric #5, Volume of Data Distributed, Cont.
  MEaSUREs Telecon Discussion: There were mixed views as to the usefulness of volume distributed as a project metric.
  Con:
  • Volume is ambiguous because of compression;
  • Even trend analysis for a given project may be impossible if the project changes compression, or if month-to-month data represent different mixes of large and small volume products;
  • A project's effort to improve its service may often result in a decrease in volume provided, which would not be a negative trend.
  Pro:
  • Even so, volume distributed is still a useful quantity to track as one component of a project's work and for internal planning;
  • Could track average product size;
  • Projects can document (in baseline comments) changes in practice that affect the volume distributed.
  Note: The discussion was very similar to the discussion of volume at the 2008 MPAR-WG – the decision then was to leave it be.

  18. Common Metric #5, Volume of Data Distributed, Cont.
  Examples:
  • 336: Value: 10,894 MB; Baseline Comment: We calculate the volume of delivered products using the log files created by the website server.
  • 364: Value: 38,652 MB; Baseline Comment: Volume distributed is the amount of data the project has distributed for the month via HTTP/FTP data files, CDs, and/or DVDs.
  • 370: Value: 1,551 GB; Baseline Comment: The volume of data distributed is calculated from FTP logs at RSS, and DataPool activity at UAH. The combined total represents the size of data downloaded via FTP or the DataPool interface.
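  Projects computing this metric from transfer logs are, in effect, summing the response-size field of each log line. Below is a minimal sketch under the same assumptions as the Metric 1 example (NCSA/Apache combined-format logs, hypothetical file names); the resulting total can then be scaled to MB, GB, or TB for reporting, as discussed in the next slide's note.

```python
import re

# Combined-format lines end the request string with a quote, followed by the
# status code and the bytes transferred ("-" when no body was sent).
BYTES_RE = re.compile(r'"\s+(\d{3})\s+(\d+|-)')

def volume_distributed_mb(paths):
    """Sum bytes transferred for successful (2xx) responses, returned in decimal MB."""
    total_bytes = 0
    for path in paths:
        with open(path, encoding="utf-8", errors="replace") as log:
            for line in log:
                match = BYTES_RE.search(line)
                if match and match.group(1).startswith("2") and match.group(2) != "-":
                    total_bytes += int(match.group(2))
    return total_bytes / 1_000_000  # decimal MB, matching the reported examples

# Hypothetical file names, matching the Metric 1 sketch.
print(f"{volume_distributed_mb(['http_access.log', 'ftp_transfers.log']):.1f} MB")
```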

  19. Common Metric #6, Data Volume Available
  • Purpose: The cumulative volume available for users provides a measure of the total resource for users that the activity creates. See the note in Metric 4.
  • Website Question: Please enter the total cumulative volume, as of the end of the reporting period, of data held by the project and available to researchers and other users (MB, GB, or TB to three significant digits). This number can include data that are not on-line but are available through other means. If paper products, enter as pages (pp) or publications (pb). If this project has no data available, enter "na" in the entry box and 'not applicable' in the comment box.
  • MEaSUREs Telecon: Similar concerns as expressed for Metric 5, Volume Distributed.

  20. Common Metric #6, Data Volume Available, Cont.
  Examples:
  • 336: Value: 20,516 MB; Baseline Comment: We calculate the total volume of data available to the users by summing the product file sizes staged on the server disks.
  • 364: Value: 2,033,219 MB; Baseline Comment: Data available includes original video available on DVD or CD and data files that can be accessed via FTP and/or HTTP.
  • 370: Value: 2,009 GB; Baseline Comment: The volume of data available represents the size of data products on-line and publicly available at RSS and UAH.
  • Note: Reporting MB, GB, or TB to three significant digits would be best, i.e. 20.5 GB instead of 20,516 MB and 2.03 TB instead of 2,033,219 MB (see the conversion sketch below).
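  The three-significant-digit convention in the note amounts to a simple unit-scaling step. A minimal sketch, assuming decimal (1000-based) units as in the slide's own examples:

```python
def format_volume(megabytes):
    """Report a volume given in MB as MB, GB, or TB to three significant digits,
    e.g. 20,516 MB -> '20.5 GB' and 2,033,219 MB -> '2.03 TB'."""
    value, unit = float(megabytes), "MB"
    for larger_unit in ("GB", "TB"):
        if value >= 1000:
            value /= 1000.0  # decimal units, matching the examples above
            unit = larger_unit
    return f"{value:.3g} {unit}"

print(format_volume(20516))    # 20.5 GB
print(format_volume(2033219))  # 2.03 TB
print(format_volume(10894))    # 10.9 GB (Metric 5 example for project 336)
```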

  21. Common Metric #11, Services Provided
  • Purpose: To measure, in conjunction with Metric 12, the services performed by the activity, to be assessed in the context of its ESE role. A particular set of values for these metrics might be much smaller for one activity than another, but in each case could represent excellent performance, given the particular ESE role of each activity.
  • Website Question: Please enter the number of services provided to users during the reporting period.
  • MEaSUREs Telecon: Consider breaking down the count by service type – see Metric 3.

  22. Common Metric #11, Services Provided, Cont.
  Examples:
  • Value: 1,128; Baseline Comment: Number of interactive maps served to outside users.
  • Value: 342; Baseline Comment: The number of plots created dynamically via our website.
  • Value: 294; Baseline Comment: Execution of trend analysis.

  23. Common Metric #12, Service Types
  • Purpose: The count of service types produced is a useful measure because of the effort required by the activity to develop and support each of its service types. A REASoN project's values for this metric are to be assessed in the context of its NASA program role.
  • Website Question: Please enter the number of service types made available to users during the reporting period.
  • MEaSUREs Telecon: No concerns raised.

  24. Common Metric #12, Service Types, Cont.
  Examples:
  • Value: 4; Baseline Comment: The four basic service types are: 1) searching for glacier data via interactive maps; 2) searching for glacier data via text fields (e.g. by name); 3) searching for ASTER imagery via interactive maps; 4) submitting data to the GLIMS project via a web form.
  • Value: 29; Baseline Comment: Number of distinct data sets currently available for the online plotting tool.
  • Value: 5; Baseline Comment: Our service types include customer support, merges of data sets, online plotting, statistical analysis, and reprojection and reformatting of data. The iteration of a service type generally results in products that we include in our products-provided metrics.
  • Value: 3; Baseline Comment: Services provided include a Web server, an OPeNDAP server, and the WIPE server where products can be accessed.
  • Value: 3; Baseline Comment: FTP download service; OPeNDAP; High-Efficiency File Transfer (HEFT) through Aspera.
  • Value: 1; Baseline Comment: Trend analysis is now available.

  25. Project Defined Metrics Examples
  To provide additional significant detail:
  • Value: 83,114; Baseline Comment: Total number of glacier snapshots in the glacier_dynamic system.
  • Value: 7,296,634; Baseline Comment: Total number of valid glacier vertices in the database.
  • Value: 226,556; Baseline Comment: Total number of ASTER footprints in the database.
  To measure an activity of importance to the project's goals:
  • Value: 6; Baseline Comment: Partners using DIAL technology; will grow as the project advances.
  • Value: 65; Baseline Comment: The number of countries served by IPYDIS is an important statistic for this internationally focused project.
  To highlight a significant ancillary activity:
  • Value: [list of papers, etc.]; Baseline Comment: Project publications and presentations.
  MEaSUREs Telecon: Consider adding citations as a project-defined metric.

  26. Citations as a Project Defined Metric
  MEaSUREs Telecon Discussion:
  • Citations are seen as a very good measure of the project's contribution to research, and of user satisfaction.
  • One project performs a search of publications' websites every six months to identify and count citations.
  • The use of 'digital object identifiers' (DOIs) was suggested as a means to track citations.
  • The difficulty DAACs have had with citations was noted.
  • The suggestion was made that all of the MEaSUREs projects press users to cite their products when they publish work that used them.
  • Citations may not be appropriate for all projects.
  Options:
  • Encourage projects to report citations as a project-defined metric; projects to share ideas on how best to collect them.
  • Create a new common metric, Citations, that would be voluntary – this would require an MPAR-WG recommendation.
  • In either case, allow the project to determine the reporting interval, e.g. six months.

  27. Programmatic Metric #8, Support for Science Focus Areas
  • Baseline Definition: The projects will identify the NASA Science Mission Directorate's Science Focus Areas that each project supports (may be multiple). The focus areas are: weather, climate change and variability, atmospheric composition, water and energy cycle, Earth surface and interior, and carbon cycle and ecosystems. (Simplified form.)
  • Purpose: To enable the ESE program office to determine which NASA Science Mission Directorate's Science Focus Areas are supported by the activity, and to assess how the data products provided by the activity relate to that support.
  • Website Question: Please identify the NASA Science Mission Directorate's Science Focus Areas that your project supports (may be multiple). Categories from the NASA Science Mission Directorate's Science Focus Areas: [list]
  • MEaSUREs Telecon: No concerns raised.
  • The definition shown above is simplified; it eliminates asking for a count of users for each area. Few projects ever even attempted to do this. Recommend adoption of this form.

  28. Programmatic Metric #9, Support for Applications Areas
  • Baseline Definition: The projects will identify the NASA Science Mission Directorate's Applications of National Importance that each project supports (may be multiple). The 12 applications areas are: agricultural efficiency, air quality, aviation safety, carbon management, coastal management, ecosystems, disaster preparedness, energy forecasting, homeland security, invasive species, public health, and water management.
  • Purpose: To enable the ESE program office to determine which NASA Science Mission Directorate's Applications of National Importance are supported by the activity, and to assess how the data products provided by the activity relate to that support.
  • Website Question: Please identify the NASA Science Mission Directorate's Applications of National Importance that your project supports (may be multiple). Categories from the NASA Science Mission Directorate's Applications of National Importance: [list]
  • MEaSUREs Telecon: No concerns raised.
  • The definition shown above is simplified; it eliminates asking for a count of users for each area. Few projects ever even attempted to do this. Recommend adoption of this form.

  29. Programmatic Metric #10, Support for Education Initiatives
  • Baseline Definition: The projects will identify the NASA education categories that each supports. These categories are Elementary and Secondary Education, Higher Education, Underrepresented and Underserved, e-Education, and Informal Education.
  • Purpose: To enable the ESE program office to assess the support provided by the activity to the NASA Science Mission Directorate's education initiatives, by indicating use of the activity's products and services by education user groups.
  • Website Question: Please identify the NASA education categories that your project supports (may be multiple). [list]
  • MEaSUREs Telecon: No concerns raised.
  • The definition shown above is simplified; it eliminates asking for a count of users for each area. Few projects ever even attempted to do this. Recommend adoption of this form.

  30. Impact Metrics – Success Stories
  Impact metrics include:
  • A one-page narrative describing the success story, i.e. how the project's products / services have directly benefited a user, organization, or activity the project supports. The narrative might include:
    • The name of the product and/or service, in the title if possible.
    • The direct benefit of using the product or service (the impact), presented in a prominent manner; in the title if possible.
    • Any collaboration with local groups, NGOs, businesses, or federal agencies.
    • How working with NASA helped make this happen.
  • A JPEG image or graphic that provides a visual complement to the narrative description. A large (pixel-wise) image can be accommodated on the MCT website – a small image will appear within the impact metric that the user can click on to see the full-resolution image.
  • MEaSUREs Telecon: Discussed audience and changes (see next slide).

  31. Impact Metrics – Discussion
  MEaSUREs Telecon Discussion:
  • Audience for impact metrics:
    • Rama selects some impact metrics and forwards them to Martha Maiden;
    • Martha sends some of those ahead to her management;
    • In some cases projects are asked for additional information;
    • One project reported seeing an impact metric appear in a NASA publication as a result of this process.
  • A change to the form / format of impact metrics is being considered:
    • Follows up on an action item from the 2008 MPAR-WG meeting and comments by Frank Lindsay (then of HQ).
  • Options being considered:
    • Leave them as is;
    • Adopt a quad-chart format as is used in E-Books;
    • Devise a new PowerPoint format.

  32. Impact Metric: DISCOVER REASoN, PI: Frank Wentz, May, 2008

  33. Impact Metric: DISCOVER MEaSUREs, PI: Frank Wentz, May, 2009

  34. Impact Metric: AMAPS ACCESS, PI: Amy Braverman, September, 2007 (Aerosol Management and Processing System)

  35. Impact Metric: DIAL ACCESS, PI: Bruce Carron, January, 2007

  36. Back Up

  37. Current Text – Metrics 8 and 9
  Metric 8 – Support for the ESE Science Focus Areas (when applicable)
  • The REASoN projects will include a quantitative summary of the data products supporting one or more of NASA's science focus areas, and report any changes at the next monthly metrics submission. The focus areas are: weather, climate change and variability, atmospheric composition, water and energy cycle, Earth surface and interior, and carbon cycle and ecosystems.
  Metric 9 – Support for the ESE Applications of National Importance (when applicable)
  • The REASoN projects will include a quantitative summary of the data products supporting one or more of NASA's Applications, and report any changes at the next monthly metrics submission. The 12 applications areas are: agricultural efficiency, air quality, aviation safety, carbon management, coastal management, ecosystems, disaster preparedness, energy forecasting, homeland security, invasive species, public health, and water management.
