
Now You've Built It...Does It Work?

Presentation Transcript


  1. Now You've Built It...Does It Work? Susan Fariss, National Library of Medicine Web Management Team Cindy Love, National Library of Medicine Specialized Information Services

  2. National Library of Medicine • World’s largest medical library • One of the Institutes of NIH • Over 20 major Web resources

  3. NLM Web sites using the ACSI (American Customer Satisfaction Index)

  4. How it all began • FY 2004 - NIH evaluation set-aside funds awarded to survey 5 NLM sites using the ACSI • FY 2005 - We’re going trans-NIH with 60 NIH Web sites to be surveyed

  5. Why is the MedlinePlus score so high? • NLM evaluates its Web sites a lot! • Content meets very high selection standards • Look, feel, navigation, and search are consistent • Site is constantly updated and maintained AND • We believe people are grateful for quality health info

  6. Why NLM studies its Web sites: To better understand our users and our position in the health information arena • Why do users come to NLM Web sites? • What do they do with the info they find? • How satisfied and likely to return are they? • How can we do better?

  7. Multi-dimensional approach • Multi-pronged approach, using several evaluation methods • All methods have relative strengths and limitations; Web evaluation is still both art and science

  8. Types of evaluation • Focus groups • Heuristic review • Usability lab testing • Randomized online user surveys • Online “external” user panel surveys • Internet audience measurement • Unsolicited user feedback • Web log data analysis • Search log analysis

  9. Focus groups • A small group of users (typically 6 to 10) provides feedback about a Web site, with a moderator following a prepared script of questions about the site. • A session lasts 45 minutes to an hour and can be held in person or via online chat or telephone. Typical cost: $10,000 for an in-person focus group; $5,000 online.

  10. When and why • During site development or after release • Assess the need for new or modified features and functions • Evaluate the impact of Web site usage on user knowledge and behavior

  11. Heuristic review • A Web usability expert reviews your Web site, compares it against generally accepted Web design and functionality principles and standards, and suggests design improvements. • The review may cover site layout and structure, navigation tools, search function, and fonts and colors. Typical cost: $5,000 to $10,000.

  12. When and why • During development and to improve site • Improve layout, navigation, and search functions (usability) • Assure compliance with Web usability and accessibility best practices

  13. Usability lab testing • Structured testing of a Web site. • Users perform a series of tasks using the Web site, and test facilitators monitor and record user behavior and cognitive processes (think-aloud testing). Typical cost: $10,000 to $25,000.

  14. When and why • During development and to improve site • Improve layout, navigation, and search functions (usability) • Gauge the success and relative importance of new features and functions • Assess the need for new or modified features and functions

  15. Randomized online user surveys • A random selection of Web site users is offered a pop-up survey of 5 to 20 questions when they visit the site. • Collect a sample of 2,000 to 3,000 responses per year. Typical cost: $20,000 to $30,000.
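A minimal sketch of the random-selection idea behind such pop-up surveys, assuming a simple per-visit random draw; the 2% sampling rate, one-invitation-per-session rule, and visit count below are illustrative assumptions, not NLM's actual implementation.

```python
import random

SURVEY_SAMPLING_RATE = 0.02  # assumption: invite roughly 2% of visitors


def should_invite_to_survey(already_invited: bool) -> bool:
    """Decide whether this visitor sees the pop-up survey invitation.

    Each visitor gets at most one invitation per session, and the
    decision is an independent random draw, so the invited visitors
    form a random sample of site traffic.
    """
    if already_invited:
        return False
    return random.random() < SURVEY_SAMPLING_RATE


# Hypothetical volume check: ~150,000 visits a year at a 2% rate yields
# roughly 3,000 invitations, in line with a 2,000-3,000 sample target.
visits = 150_000
invited = sum(should_invite_to_survey(False) for _ in range(visits))
print(f"Invited {invited} of {visits:,} visitors ({invited / visits:.1%})")
```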

  16. When and why • Improve site • Gauge the success and relative importance of new features and functions • Assess the need for new or modified features and functions • Identify under-served or under-represented user groups • Evaluate the impact of outreach and promotional activities • Compare Web site usage and impact against agency mission and performance goals • Evaluate the impact of Web site usage on user knowledge and behavior • Diagnose and improve technical performance and download times

  17. Internet audience measurement • Private companies collect usage data from large panels of Web users whose Web surfing is monitored continuously. Panel sizes range from 50,000 to 1.5 million and cover usage in US homes and, in some cases, offices, schools, and international sites. Companies extrapolate usage estimates from panel demographics and census data, providing comparative data on usage of your Web site versus competing Web sites in a defined market. • Data collection and extrapolation methods vary by company. Differences in panel composition and methodology mean that usage data from different companies are not strictly comparable; these results should be treated as estimates, not precise measures. Typical cost: $35,000 to $40,000 per year.
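An illustrative back-of-the-envelope calculation of how a panel vendor might project raw panel counts to population-level unique-visitor estimates; the panel size, visitor count, and online-population figure are invented assumptions, and real vendors also apply demographic weights before projecting.

```python
# All figures below are invented for illustration; real panels weight
# panelists by demographics before projecting to the census universe.
panel_size = 50_000                  # metered panelists
panel_visitors_this_month = 750      # panelists who visited the site
us_online_population = 180_000_000   # assumed universe estimate

reach = panel_visitors_this_month / panel_size        # share of panel reached
projected_unique_visitors = reach * us_online_population

print(f"Panel reach: {reach:.2%}")                                     # 1.50%
print(f"Projected unique visitors: {projected_unique_visitors:,.0f}")  # 2,700,000
```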

  18. When and why • Ongoing • Gauge the success and relative importance of new features and functions • Assess the need for new or modified features and functions • Identify under-served or under-represented user groups • Evaluate the impact of outreach and promotional activities • Compare Web site usage and impact against agency mission and performance goals • Evaluate the impact of Web site usage on user knowledge and behavior • Diagnose and improve technical performance and download times • NLM has used comScore/MediaMetrix and Nielsen/NetRatings private Internet user panels

  19. NIH.gov in the health information arena Top 5 general health information Web sites: • WebMD - 3.0 M unique visitors • NIH.gov - 2.7 M (inclusive of NLM) • AOL Health (powered by WebMD) - 1.8 M • MSN Health with WebMD - 1.6 M • Yahoo! Health - 1.3 M Based on Nielsen/NetRatings, Sept. 2003, US Home market, domain reporting

  20. Unsolicited user feedback • User feedback via an e-mail box, a link on the homepage, or a help desk phone number. • Categorize and analyze the data. Typical cost: Staff time, which can be extensive.
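One way to categorize and analyze such feedback is simple keyword tagging ahead of manual review, sketched below; the category scheme and keywords are purely hypothetical, not an NLM taxonomy.

```python
from collections import Counter

# Hypothetical category scheme; a real one would come from staff reading
# actual messages and refining the buckets over time.
CATEGORY_KEYWORDS = {
    "praise": ["thank", "great", "helpful"],
    "broken link": ["404", "broken link", "dead link"],
    "search problem": ["can't find", "no results", "search"],
}


def categorize(message: str) -> str:
    """Assign a message to the first category whose keywords it contains."""
    text = message.lower()
    for category, keywords in CATEGORY_KEYWORDS.items():
        if any(keyword in text for keyword in keywords):
            return category
    return "other"


feedback = [
    "Thank you, this site was very helpful!",
    "The link to the drug information page is broken (404).",
    "I can't find anything when I search for my condition.",
]
print(Counter(categorize(message) for message in feedback))
# Counter({'praise': 1, 'broken link': 1, 'search problem': 1})
```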

  21. When and why • Ongoing • Fill needs of intended audience more effectively • Gauge the success and relative importance of new features and functions • Assess the need for new or modified features and functions • Identify under-served or under-represented user groups • Evaluate the impact of outreach and promotional activities • Compare Web site usage and impact against agency mission and performance goals • Evaluate the impact of Web site usage on user knowledge and behavior • Diagnose and improve technical performance and download times

  22. Web log analysis • Analyze Web usage and search log data using log analysis software installed on the Web server. Typical cost: $500 to $100,000 for a software site license.
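A minimal sketch of what log-analysis software does at its core, assuming server logs in the Apache Common Log Format; the file name is hypothetical, and commercial packages add sessionization, trend reporting, and search-term extraction on top of counts like these.

```python
import re
from collections import Counter

# Matches the request path and status code from a Common Log Format line, e.g.:
# 10.0.0.1 - - [01/Sep/2003:10:00:00 -0400] "GET /medlineplus/ HTTP/1.1" 200 5120
LOG_LINE = re.compile(r'\S+ \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) (\S+)[^"]*" (\d{3}) \S+')


def top_pages(log_path: str, n: int = 10):
    """Return the n most-requested paths that were served successfully."""
    pages = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = LOG_LINE.match(line)
            if match and match.group(2) == "200":
                pages[match.group(1)] += 1
    return pages.most_common(n)


# Hypothetical usage:
# for path, hits in top_pages("access.log"):
#     print(f"{hits:>8}  {path}")
```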

  23. When and why • Ongoing • Gauge the success and relative importance of new features and functions • Evaluate the impact of outreach and promotional activities • Diagnose and improve technical performance and download times

  24. History of online user surveys at NLM • First generation - posted surveys, non-random, no benchmarking • Second generation - snapshot surveys at a single point in time, randomized sample, 2,000 to 4,000 respondents, custom surveys, not able to benchmark • Third generation - ACSI

  25. Why ACSI? • Ability to survey year round to track how changes to the site or other factors affect satisfaction scores • Ability to benchmark using model questions

  26. ACSI as a part of the big picture • Use data to focus on areas needing improvement • Choose appropriate evaluation methods to improve these areas, such as usability testing

  27. Contact us: • Susan Fariss • 301-435-7102 • susan_fariss@nlm.nih.gov • Cindy Love • 301-496-5306 • cindy_love@nlm.nih.gov Reference: Wood, Siegel, LaCroix, Lyon, Benson, Cid, and Fariss. A Practical Approach to E-Government Web Evaluation. IT Pro, May/June 2003, pp. 22-28.

  28. NLM Web Sites

  29. http://toxnet.nlm.nih.gov

  30. http://tox.nlm.nih.gov

  31. New look?

  32. http://aidsinfo.nlm.nih.gov

  33. New look?

  34. Previous look

  35. http://www.nlm.nih.gov/

  36. http://medlineplus.gov/

  37. http://medlineplus.gov/spanish/
