
Presentation Transcript


  1. NASA Earth Observing System Data and Information System Customer Satisfaction Results November 29, 2011

  2. Today’s Discussion • Background • Overview of Key Results • Detailed Analysis • Summary

  3. Background

  4. Project Background: Objectives

• Measure customer satisfaction with the NASA Earth Observing System Data and Information System at a national level and for each Data Center:
  • Alaska Satellite Facility Distributed Active Archive Center
  • Crustal Dynamics Data Information System
  • Global Hydrology Resource Center
  • Goddard Earth Sciences Data and Information Services Center
  • Land Processes Distributed Active Archive Center
  • MODAPS Level-1 Atmospheres Archive and Distribution System
  • NASA Langley Atmospheric Science Data Center
  • National Snow and Ice Data Center Distributed Active Archive Center
  • Oak Ridge National Laboratory Distributed Active Archive Center
  • Ocean Biology Processing Group
  • Physical Oceanography Distributed Active Archive Center, Jet Propulsion Laboratory (JPL)
  • Socioeconomic Data and Applications Center
• Assess the trends in satisfaction with NASA EOSDIS, specifically in the following key areas:
  • Product Search
  • Product Selection and Order
  • Delivery
  • Product Quality
  • Product Documentation
  • Customer Support
• Identify the key areas that NASA can leverage across the Data Centers to continuously improve its service to its users

  5. Project Background: Measurement timetable

  6. Project Background: Data collection

Respondents:
• 3,996 responses were received
• 3,996 responses were used for modeling
• Respondents who answered for more than one data center: two: 103; three: 14; four: 2
• E-mail addresses from lists associated with some of the data centers were included to reach the large number of users who may have accessed data via anonymous FTP.

  7. Project Background: Respondent information

For which specific areas do you need or use Earth science data and services?
Demographics (when comparable) remain fairly consistent with 2010.
* Multi-select question; answer choices added in 2010 and 2011; question language was changed slightly in 2009; modeling was asked as a separate question prior to 2008

  8. Project Background: Respondent information

Demographics (when comparable) remain fairly consistent with 2010.
* Questionnaire was modified in 2009-2011. Prior to 2010, WIST also included EDG. WIST became available in 2005; EDG was decommissioned in February 2008, when all data could be accessed through WIST.

  9. Overview of Key Results

  10. NASA EOSDIS: Customer satisfaction remains steady

Year                   2004  2005  2006  2007  2008  2009  2010  2011
ACSI                    75    78    74    75    77    77    77    77
Margin (+/-)           0.9   0.7   0.5   0.6   0.5   0.4   0.4   0.4
Overall satisfaction    80    81    82    78    81    81    79    81
Expectations            74    74    73    71    73    73    74    73
Ideal                   76    71    72    73    75    75    75    75

Annual sample sizes across 2004-2011: N=1,016; 1,263; 2,291; 2,601; 2,857; 3,842; 4,390; and 3,996 in 2011.

Question wording:
• Overall satisfaction: How satisfied are you with the data products and services provided by [DAAC]?
• Expectations: To what extent have data products and services provided by [DAAC] fallen short of or exceeded expectations?
• Ideal: How close does [DAAC] come to the ideal organization?

  11. NASA EOSDIS Benchmarks: Strong performance continues …

NASA EOSDIS - Aggregate, 2011: 77
ACSI (Overall), Q2 2011: 76
News & Information Sites (Public Sector), 2011: 75
Federal Government (Overall), 2010: 65

ACSI (Overall) is updated on a quarterly basis, with specific industries/sectors measured annually. Federal Government (Overall) is updated on an annual basis, and data collection is done in Q3. Quarterly scores are based on a calendar timeframe: Q1 January through March; Q2 April through June; Q3 July through September; Q4 October through December.

  12. NASA EOSDIS Model: Product Search/Selection/Documentation most critical

Sample size: 3,996

Component                     Score  Impact on CSI
Product Search                 75        0.9
Product Selection and Order    77        1.1
Product Documentation          76        0.9
Product Quality                78        0.4
Delivery                       81        0.4
Customer Support               86        1.7

Customer Satisfaction Index: 77

Outcome      Score  Impact of CSI
Recommend     87        3.8
Future Use    89        3.2

Scores: the performance of each component on a 0 to 100 scale. Component scores are made up of the weighted average of the corresponding survey questions.
Impacts: the change in the target variable that results from a five-point change in a component score. For example, a 5-point gain in Product Search would yield a 0.9-point improvement in Satisfaction (see the sketch below).
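Since the deck defines impacts only verbally, here is a minimal sketch of how to read them, assuming the linear interpretation stated above; treating each impact as scalable to gains other than 5 points is our assumption:

```python
# Impact weights from slide 12: CSI change per 5-point component gain.
IMPACTS = {
    "Product Search": 0.9,
    "Product Selection and Order": 1.1,
    "Product Documentation": 0.9,
    "Product Quality": 0.4,
    "Delivery": 0.4,
    "Customer Support": 1.7,
}

def projected_csi_change(gains: dict[str, float]) -> float:
    """Project the CSI change implied by component-score gains,
    scaling each published impact by (gain / 5)."""
    return sum(IMPACTS[name] * gain / 5.0 for name, gain in gains.items())

# A 5-point gain in Product Search alone -> +0.9 CSI points (the slide's example).
print(projected_csi_change({"Product Search": 5}))    # 0.9
# A hypothetical 2-point gain in Customer Support -> 1.7 * 2/5 = +0.68.
print(projected_csi_change({"Customer Support": 2}))  # 0.68
```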

  13. NASA EOSDIS 2008-2011: Scores hold steady; no change of more than one point

Driver                        2011  2010  2009  2008  Margin (+/-)
Customer Satisfaction Index    77    77    77    77      0.4
Customer Support               86    86    85    84      0.9
Delivery                       81    80    81    81      0.5
Product Quality                78    77    77    74      0.6
Product Selection and Order    77    77    76    77      0.5
Product Documentation          76    76    77    75      0.5
Product Search                 75    76    75    75      0.5

(The original chart flagged statistically significant differences vs. 2010.)

  14. Areas of Opportunity for NASA EOSDIS: Remain consistent year over year

Top improvement priorities:
• Product Search (75)
• Product Selection and Order (77)
• Product Documentation (76)

  15. Detailed Analysis

  16. Score Comparison: Same CSI inside and outside the USA

• 71% of respondents are outside of the USA in 2011, vs. 73% in 2010.
• Respondents inside and outside the USA have the same satisfaction with EOSDIS (77).
• USA customers rated Delivery and Customer Support higher than those outside the USA.

  17. CSI by Data Center, 2008-2011: Three data centers show significant score changes

CDDIS (83) and PO.DAAC-JPL (82) have the highest satisfaction.

Data Center          2011  2010  2009  2008  Margin (+/-)
ASDC - LaRC           77    75    76    77      1.7
ASF SAR DAAC          78    74    75    75      2.3
CDDIS                 83    79    80    88      2.4
GES DISC              80    80    77    77      2.9
GHRC                  80    79    79    78      3.2
LP DAAC               76    76    75    76      0.6
MODAPS/LAADS          78    77    77    75      1.1
NSIDC DAAC            76    77    77    76      1.2
OBPG/Ocean Color      81    82    81    80      1.9
ORNL DAAC/FLUXNET     75    78    77    75      1.7
PO.DAAC - JPL         82    80    78    79      2.3
SEDAC                 71    69    70    70      2.6

(The original chart flagged statistically significant differences vs. 2010.)

  18. Product Search: Remains a key driver of satisfaction and is top priority (Impact = 0.9)

• 60% used a data center’s or data-specific specialized search, online holdings, or datapool (49% in 2010)
• 14% used WIST to search for data and products (17% in 2010)
• 15% selected Internet search tool (16% in 2010)

Attribute                                    2011  2010  2009  2008
Product Search (overall)                      75    76    75    75
How well the search results met your needs    77    78    78    77
Ease of finding data                          75    75    74    74
Ease of using search capability               74    74    74    74

(The original chart flagged statistically significant differences vs. 2010.)

  19. Product Search Score Comparison: By method for most recent search

How did you search for the data products or services you were seeking?

Method (share of respondents)                                                          2011  2010  2009  2008  Margin (+/-)
Data center’s or data-specific specialized search, online holdings or datapool (60%)   76    78    78    76      0.6
Direct interaction with user services personnel (3%)                                   76    77    77    75      3.0
Global Change Master Directory (1%)                                                    73    74    74    70      5.6
Internet search tool (15%)                                                             69    69    70    68      1.4
Reverb/Warehouse Inventory Search Tool (WIST) (14%)                                    75    76    75    75      1.3
Other (3%)                                                                             77    72    77    76      3.1

  20. Product Search Scores by Data Center: Variation in the trends

GES DISC (81) and GHRC (80) rate Product Search highest.

Data Center          2011  2010  2009  2008  Margin (+/-)
ASDC - LaRC           76    76    77    77      2.2
ASF SAR DAAC          75    74    76    73      2.9
CDDIS                 77    75    78    85      4.5
GES DISC              81    79    71    78      3.2
GHRC                  80    76    77    77      3.3
LP DAAC               74    75    74    75      0.7
MODAPS/LAADS          78    77    77    75      1.4
NSIDC DAAC            71    75    75    74      1.6
OBPG/Ocean Color      79    81    80    80      2.2
ORNL DAAC/FLUXNET     74    77    75    72      2.0
PO.DAAC - JPL         76    77    78    75      2.6
SEDAC                 69    69    67    66      3.2

(The original chart flagged statistically significant differences vs. 2010.)

  21. Product Selection and Order: Also a top opportunity for improvement (Impact = 1.1)

• 93% of respondents said that they are finding what they want in terms of type, format, time series, etc. (94% in 2010)
• Did you use a sub-setting tool? No: 32%; Yes, by geographic area: 45%; Yes, by geophysical parameter: 3%; Yes, by both geographic area and geophysical parameter: 17%; Yes, by band: 3%; Yes, by channel: 1%

Attribute                                      2011  2010  2009  2008
Product Selection and Order (overall)           77    77    76    77
Ease of requesting or ordering data products    78    78    77    78
Ease of selecting data products                 77    77    75    76
Description of data products                    75    75    75    75

  22. Product Selection and Order Scores by Data Center

CDDIS (83) and GHRC (82) rate Product Selection and Order highest.

Data Center          2011  2010  2009  2008  Margin (+/-)
ASDC - LaRC           77    75    76    76      2.0
ASF SAR DAAC          76    75    76    72      3.0
CDDIS                 83    77    76    84      3.0
GES DISC              79    80    73    79      3.0
GHRC                  82    82    77    76      3.3
LP DAAC               76    76    74    76      0.7
MODAPS/LAADS          78    78    77    76      1.3
NSIDC DAAC            74    76    78    75      1.5
OBPG/Ocean Color      80    81    81    81      2.1
ORNL DAAC/FLUXNET     76    79    76    75      2.0
PO.DAAC - JPL         81    80    78    79      2.5
SEDAC                 71    70    71    67      3.2

(The original chart flagged statistically significant differences vs. 2010.)

  23. Product Documentation: Data product description remains most sought after (Impact = 0.9)

• CSI for those who could not find documentation is 68, vs. 79 for those who got it delivered with the data and 78 for those who found it online.
• Was the documentation… delivered with the data (17% vs. 18% in ’10), available online (76% vs. 75% in ’10), or not found (7% vs. 7% in ’10)?
• What documentation did you use or were you looking for? Data product description: 78%; Product format: 66%; Science algorithm: 45%; Instrument specifications: 42%; Tools: 37%; Science applications: 28%; Production code: 10%

Attribute                                     2011  2010  2009  2008
Product Documentation (overall)                76    76    77    75
Data documentation helped you use the data     76    76    77    75
Overall quality of the document                76    76    76    74

  24. Product Documentation Scores by Data Center

Five data centers rate Product Documentation 78 or 79.

Data Center          2011  2010  2009  2008  Margin (+/-)
ASDC - LaRC           75    75    78    76      2.3
ASF SAR DAAC          75    74    77    72      2.9
CDDIS                 79    79    82    86      5.2
GES DISC              78    78    75    76      3.3
GHRC                  78    80    80    77      3.9
LP DAAC               77    76    76    74      0.8
MODAPS/LAADS          75    75    76    72      1.6
NSIDC DAAC            74    76    76    72      1.6
OBPG/Ocean Color      78    80    77    77      2.6
ORNL DAAC/FLUXNET     75    79    76    71      2.0
PO.DAAC - JPL         78    80    81    79      3.1
SEDAC                 76    72    72    73      3.4

(The original chart flagged statistically significant differences vs. 2010.)

  25. Customer Support: Maintaining great performance (Impact = 1.7)

• 91% (88% in 2010) were able to get help on first request. These respondents continue to have a significantly higher CSI (81) than those who did not (66).
• Did you request assistance from the Data Center’s user services staff during the past year? No: 76%. Of those who said yes, 80% used e-mail, 2% used the phone, and 10% used both phone and e-mail.

Attribute                                     2011  2010  2009  2008
Customer Support (overall)                     86    86    85    84
Professionalism                                88    87    87    86
Technical knowledge                            87    87    86    84
Accuracy of information provided               87    86    86    84
Helpfulness in selecting data or products      86    85    85    83
Timeliness of response                         85    84    83    83
Helpfulness in correcting a problem            85    84    83    82

  26. Product Quality: Preferences somewhat in line with what is provided

• In 2010, 57% said products were provided in HDF-EOS and HDF, and 42% said those were their preferred formats.
• GeoTIFF is the most preferred format, while HDF-EOS/HDF is the format in which products were most often provided.
• Only 8% of products were provided in GIS formats, although nearly one-quarter of respondents prefer that format.
(Multiple responses allowed.)

  27. Product Quality: One-point gain from last year (Impact = 0.4)

Attribute                                                2011  2010  2009  2008
Product Quality (overall)                                 78    77    77    74
Ease of using the data product in the delivered format    78    77    77    74

(The original chart flagged statistically significant differences vs. 2010.)

  28. Delivery: Timeliness and Delivery up one point (Impact = 0.4)

• Over half said their data came from MODIS (same as 2010); 32% said ASTER (28% in 2010)
• MODIS (Atmosphere): 22%; MODIS (Cryosphere): 8%; MODIS (Land): 52%; MODIS (Ocean): 18%
(Question is multi-select.)

Attribute                         2011  2010  2009  2008
Delivery (overall)                 81    80    81    81
Convenience of delivery method     82    82    82    83
Timeliness of delivery method      80    79    79    79

(The original chart flagged statistically significant differences vs. 2010.)

  29. Delivery: Methods for receiving data

• FTP immediate retrieval from online holdings is the most preferred method, but FTP retrieval after an order is the most used.
• 67% said FTP was their preferred method in 2010.

How long did it take to receive your data products?
Immediate retrieval:  23%  (CSI = 81)
Less than 1 hour:     23%  (CSI = 78)
Less than a day:      26%  (CSI = 76)
1-3 days:             22%  (CSI = 77)
4-7 days:              4%  (CSI = 74)
More than 7 days:      2%  (CSI = 66)

  30. Summary

  31. Summary

• Satisfaction with NASA EOSDIS has held at 77 for four years. NASA continues to meet data users’ needs.
• As would be expected with consistent satisfaction scores, there were very few changes in driver scores. Half of the drivers had no change, and no driver moved by more than one point.
  • Delivery and Product Quality improved one point, while Product Search was down one point.
  • There was no change in Customer Support, Product Selection and Order, or Product Documentation.
  • However, due to the large sample size, a one-point change is statistically significant at a 90% confidence level (see the sketch below).
• While scores are solid, there are opportunities to improve. Product Search, Selection and Order, and Documentation continue to be the top priorities.
  • Work to refine and improve the search capabilities and functionality.
  • Continue to clarify descriptions of data products, and keep the language easy to understand.
  • As some respondents still have difficulty locating documentation, continue working to provide clear documentation that is readily available to users.
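The significance claim above can be sanity-checked with a back-of-the-envelope margin-of-error calculation; the respondent-level standard deviation of roughly 15 points is our assumption, chosen because it reproduces the (+/-) 0.4 margin reported on slide 10:

```python
import math

def margin_of_error_90(std_dev: float, n: int) -> float:
    """Half-width of a two-sided 90% confidence interval for a mean."""
    z_90 = 1.645  # z-score for 90% confidence
    return z_90 * std_dev / math.sqrt(n)

# With n = 3,996 responses and an assumed SD of ~15 points (0-100 scale),
# the 90% margin is about 0.39, so a one-point change clears it.
print(round(margin_of_error_90(15.0, 3996), 2))  # 0.39
```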

  32. Summary

• Customer Support remains the top-scoring area. As it is also the highest-impact area, it is important to maintain the great level of service and support already being provided.
  • Share the importance and impact of customer support with those providing it, to help boost awareness.
  • Look for best practices among top-performing data centers, to ensure all centers are providing high levels of service.
• GeoTIFF and GIS formats appear to be preferred by customers more often than they are provided. NASA should explore offering more data products in these formats.

  33. Appendix

  34. Customers over multiple years: Those who answered the survey in multiple years

For those who answered the survey over the last four years, no significant differences were seen between 2010 and 2011. For those answering the survey over multiple years, score movement is mixed. (Difference refers to 2011 vs. 2010.)

  35. Customers over the past three years: Those who answered the survey in 2009, 2010, and 2011

For those answering the survey in 2009, 2010, and 2011, there are no statistically significant score differences. (Difference refers to 2011 vs. 2010.)

  36. Customers over the past two years: Those who answered the survey in 2010 and 2011

For those answering the survey in 2010 and 2011, there are a number of statistically significant positive score differences. (Difference refers to 2011 vs. 2010.)

  37. The Math Behind the Numbers

A discussion for a later date…or following this presentation for those who are interested.

[Path diagram: two exogenous latent variables ξ1 and ξ2 with indicators x1-x6 and loadings λx1-λx6; one endogenous latent variable η1 with indicators y1-y3 and loadings λy1-λy3; structural coefficients β1 and β2.]

x_i = λ_x,i ξ_t + δ_i,  for i = 1, 2, 3 and t = 1, 2   (measurement model, exogenous)
y_j = λ_y,j η_1 + ε_j,  for j = 1, 2, 3                (measurement model, endogenous)
η_1 = β_1 ξ_1 + β_2 ξ_2 + ζ_1                          (structural model)

  38. A Note About Score Calculation

• Attributes (questions on the survey) are typically answered on a 1-10 scale
  • Social science research shows 7-10 response categories are optimal
  • Customers are familiar with a 10-point scale
• Before being reported, scores are transformed from the 1-10 scale to a 0-100 scale
  • The transformation is strictly algebraic: score = (mean - 1) / 9 × 100, so a 1 maps to 0 and a 10 maps to 100 (see the sketch below)
• The 0-100 scale simplifies reporting:
  • Often no need to report many, if any, decimal places
  • A 0-100 scale is useful as a management tool
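A minimal sketch of the rescaling just described, assuming the linear map sending 1 to 0 and 10 to 100 (the slide names the transformation, but the original formula graphic was lost):

```python
def rescale(mean_1_to_10: float) -> float:
    """Map a 1-10 survey mean onto the reported 0-100 scale:
    1 -> 0 and 10 -> 100, linearly in between."""
    return (mean_1_to_10 - 1.0) / 9.0 * 100.0

# For example, a survey mean of 7.93 reports as a score of 77.
print(round(rescale(7.93)))  # 77
```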

  39. Deriving Impacts

• Remember high school algebra? The general formula for a line is y = mx + b.
• The basic idea is that x is a “cause” and y is an “effect”, and m, the slope of the line, summarizes the relationship between x and y.
• CFI Group uses a sophisticated variation of an advanced statistical tool, Partial Least Squares (PLS) regression, to determine impacts when many different causes (i.e., quality components) simultaneously affect an outcome (e.g., Customer Satisfaction). A rough illustration with ordinary PLS follows.
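CFI Group’s exact method is proprietary, so the following is only a sketch of the general idea using scikit-learn’s ordinary PLSRegression on simulated data; the sample size, component layout, and weights are made up for illustration:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n = 500  # hypothetical number of respondents

# Simulated respondent-level scores (0-100) for six quality components
# (search, selection/order, documentation, quality, delivery, support),
# plus a satisfaction outcome driven by hidden weights and noise.
X = rng.normal(77, 15, size=(n, 6))
hidden_weights = np.array([0.18, 0.22, 0.18, 0.08, 0.08, 0.34])
y = X @ hidden_weights + rng.normal(0, 5, size=n)

# Fit PLS regression: it handles many correlated "causes" at once.
pls = PLSRegression(n_components=3)
pls.fit(X, y)

# Convert regression coefficients to "impacts" as defined on slide 12:
# predicted satisfaction change per 5-point change in each component.
impacts = pls.coef_.ravel() * 5
print(np.round(impacts, 2))
```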
