
Benchmarking Your Web Site


Presentation Transcript


  1. UKOLN is supported by: Benchmarking Your Web Site Brian Kelly UKOLN University of Bath Bath, BA2 7AY Email B.Kelly@ukoln.ac.uk URL http://www.ukoln.ac.uk/

  2. Contents • Introduction • Why Benchmark? • What Is Benchmarking? • Tools For Testing Web Sites • From Testing To Benchmarking • Case Studies: WebWatch Surveys Of UK University Sector • Commercial Approaches • Discussion

  3. What Do You Think Benchmarking Is? • Could you turn to your neighbour and ask: • What do you think benchmarking is? • How do you think benchmarking can help you?

  4. Benchmarking: A Definition • Benchmarking is about identifying and measuring best practice processes that work elsewhere and then emulating them. • The aim is to reduce duplication by learning from others who have already found the solution. • It is about: • Understanding your weaknesses • Comparison with your peers • Note that best practices are constantly evolving.

  5. Aims Of This Talk • By the end of the session you should: • Be able to benchmark your Web site in relation to other sites in your community • Have seen examples of use of auditing and evaluating tools • Have considered other types of benchmarking activity available • Be in a position to decide whether to adopt this methodology in your organisation

  6. Approaches To Benchmarking • There are a number of approaches to benchmarking of Web sites: • Manual Benchmarking • Use of manual techniques such as questionnaires, usability studies, etc. • Automated Benchmarking • Automated benchmarking makes use of software tools to support benchmarking • Approaches can include: • Dedicated benchmarking software products, typically running on desktop PC • Use of benchmarking services available on the Web

  7. Benchmarking Approaches • Manual: • End users involved in process • Can receive feedback which cannot be obtained using software • Can be time-consuming and expensive • Automated: • Can be less expensive • Can scale to many thousands of resources • Restricted to aspects which software can process This talk deals mainly with automated benchmarking, using Web-based tools

  8. Web Testing Services • Many Web-based services are available for reporting on various aspects of Web sites, including: • HTML compliance • Browser compatibility • Broken Links • Load Time • Accessibility • Link popularity • …
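
Many of the aspects listed above can also be checked with very small scripts. As an illustration, here is a minimal, hedged sketch of one of them, a broken-link check for a single page, written in Python using only the standard library; the target URL is purely illustrative and this is not how any of the named services work internally.

# Sketch: report the HTTP status of every absolute link found on one page.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

class LinkCollector(HTMLParser):
    """Collect href values from <a> elements."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def check_links(page_url):
    html = urlopen(page_url).read().decode("utf-8", errors="replace")
    collector = LinkCollector()
    collector.feed(html)
    for link in collector.links:
        target = urljoin(page_url, link)
        if not target.startswith("http"):
            continue  # skip mailto:, javascript:, etc.
        try:
            status = urlopen(Request(target, method="HEAD")).status
        except HTTPError as err:
            status = err.code          # e.g. 404 for a broken link
        except URLError:
            status = "unreachable"
        print(status, target)

if __name__ == "__main__":
    check_links("http://www.example.ac.uk/")  # hypothetical entry point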

  9. Example - NetMechanic • NetMechanic: • A Web-based service for checking Web sites • Various functions available for free • Additional functions such as more comprehensive testing are licensed http://www.netmechanic.com/

  10. Example – Doctor HTML http://www2.imagiware.com/RxHTML/ • Doctor HTML: • A Web page analysis tool • Free for testing individual pages • A licensed version can be installed locally for checking entire Web sites

  11. Example - Bobby • Bobby: • A Web-based accessibility checker • Can test individual pages • A licensed downloadable version can check entire Web sites http://bobby.watchfire.com/

  12. Example – Link Popularity • Link popularity: • An indication of the popularity of a Web site • Can be obtained by analysing search engines such as AltaVista and Google http://www.linkpopularity.com/
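
For illustration only: a hedged sketch of the idea behind such counts. At the time, engines such as AltaVista and Google accepted a "link:" query listing pages that link to a given site. The Python snippet below only constructs the query URLs; the query syntax and result pages are assumptions about those services, and scraping the result counts is omitted.

# Sketch: build "link:" queries for a site whose inbound links we want to estimate.
from urllib.parse import quote

SITE = "www.ukoln.ac.uk"  # illustrative site

queries = {
    "AltaVista": "http://www.altavista.com/web/results?q=" + quote("link:" + SITE),
    "Google": "http://www.google.com/search?q=" + quote("link:" + SITE),
}

for engine, url in queries.items():
    print(engine, url)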

  13. Example – Server Analysis • Netcraft’s server analysis provides details of: • Web server software • Operating system environment • Server availability (limited) • Nos. of servers in domain http://www.netcraft.com/

  14. Example – Server Analysis • The University of Dundee provides an HTTP analysis tool: • Analyses HTTP headers http://www.somis.dundee.ac.uk/general/wizards/fetchhead.html
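
A minimal sketch of the kind of information such a header-analysis tool reports, using the Python standard library to issue a HEAD request and print the response headers. The target URL is illustrative; this is not the Dundee tool itself.

# Sketch: fetch and display the HTTP response headers for a URL.
from urllib.request import Request, urlopen

def fetch_headers(url):
    response = urlopen(Request(url, method="HEAD"))
    return response.status, list(response.getheaders())

if __name__ == "__main__":
    status, headers = fetch_headers("http://www.example.ac.uk/")
    print("HTTP status:", status)
    for name, value in headers:
        print(f"{name}: {value}")  # e.g. Server, Content-Type, Last-Modified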

  15. Usability http://zing.ncsl.nist.gov/WebTools/WebSAT/operation.html • Usability normally requires manual testing • However, automated support tools such as WebSAT are also available

  16. From Testing To Benchmarking • These tools are typically used for testing individual pages on one’s own Web site • However, applying the tools across other Web sites allows: • Comparisons to be made with competitors and collaborators • Examples of best practices to be found and lessons to be learnt from them • Examples of problems to be found and mistakes to be avoided • Trends to be monitored by repeat surveys • Limitations of the tools to be found by testing across a wide range of Web site environments

  17. UK HE Case Studies • Every 3 months a WebWatch survey is published in the Ariadne e-journal: • Surveys include: • Accessibility of entry points • Nos. of Web servers • Nos. of links to organisation • Size of entry points • Web server software • Relationships • Together with manual surveys of: • Search engine software • 404 error pages • See: <http://www.ukoln.ac.uk/web-focus/webwatch/articles/#latest>

  18. Accessibility • In September 2002 Bobby was used to analyse the entry points of UK University Web sites • Findings: • Only 4 pages appeared to comply with Bobby AA guidelines • Further analysis revealed that only 3 complied http://www.ariadne.ac.uk/issue33/web-watch/

  19. Size Of Home Page • The size of UK University entry points was analysed in 1998 and the survey was repeated in June 2001 • The reasons for the large entry points were reviewed http://www.ariadne.ac.uk/issue28/web-watch/

  20. Numbers Of Web Servers • A survey of the numbers of Web servers was carried out in 2000 and repeated in 2002 • Most institutions have a small number of Web servers but a few have over 100 http://www.ariadne.ac.uk/issue31/web-watch/

  21. What’s Related? • Netscape’s What’s Related tool was used to record: • Popularity • Nos. of pages indexed • Nos. of links to site http://www.ariadne.ac.uk/issue27/web-watch/

  22. Links To University Web Sites http://www.ariadne.ac.uk/issue23/web-watch/ • A survey of the number of links to UK University Web sites was published in 2000 • The survey used the AltaVista and Infoseek search engines

  23. Search Engines • A manual survey of search engines used on UK University Web sites was carried out in 1997 and has been repeated every 6 months in order to monitor trends http://www.ariadne.ac.uk/issue30/web-watch/

  24. 404 Error Pages • A (manual) survey of 404 error pages on UK University Web sites was carried out in 1999 and repeated in 2002 • The original survey and article helped to raise awareness of well-designed 404 pages as an important navigation feature • WebWatch: 404s - What's Missing?, June 1999, <http://www.ariadne.ac.uk/issue20/404/> • Revisiting 404 Error Pages In UK University Web Sites, June 2002, <http://www.ariadne.ac.uk/issue32/web-watch/>

  25. 404 Error Pages • Example of a default error page: “Apache/1.3.6 Server at www.shef.ac.uk Port 80” • Significant changes have been made since the findings of the first survey were published

  26. Limitations Of This Approach • What limitations do you think this approach may have?

  27. Limitations • Can’t handle Intranets • Reliance on third party tools • Inconsistencies across tools • Unusual aspects of Web sites • Inadequacies of automated tools • Personalisation, cookies, etc. • Performance implications • Legal and ethical issues

  28. Reliance On Third Party Tools • This approach relies on use of third party software: • Company may go out of business • Company may introduce charging or change conditions of use • Company may change the format of its output • Company may change algorithms (possibly without notification) • Example: • The Bobby accessibility checker withdrew a summary of the file size of resources • CAST sold Bobby: the new company introduced limitations to the use of Bobby

  29. Intranets, etc. • Use of public third party Web sites for testing: • May not work with Intranets or Web sites which require a username and password to access • Possible Solution • Some testing services allow you to give a username and password • If you do this, you are trusting the service not to steal the username and password!

  30. Inconsistencies • Different tools may have inconsistent approaches • Example • NetMechanic and Bobby (previous version) reported on the file size of analysed pages • Bobby only analysed the HTML page and inline images • NetMechanic also included external JavaScript and stylesheet files • NetMechanic respected the Standard for Robot Exclusion (SRE) and would not analyse images if the SRE banned access • Bobby ignored the SRE
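
To make the robots-exclusion difference concrete, here is a small sketch of how a checker that behaves like NetMechanic could consult robots.txt before fetching a resource, using Python's standard urllib.robotparser module. The user-agent name and URLs are illustrative.

# Sketch: honour the robots exclusion standard before fetching an image.
from urllib.robotparser import RobotFileParser

robots = RobotFileParser()
robots.set_url("http://www.example.ac.uk/robots.txt")  # illustrative site
robots.read()

image_url = "http://www.example.ac.uk/images/logo.gif"
if robots.can_fetch("MyBenchmarkBot", image_url):
    print("Allowed to fetch", image_url)
else:
    print("Robots exclusion bans", image_url, "- skipping")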

  31. Inadequacies • Accessibility Testing: automated tools such as Bobby can report a missing ALT attribute: <img src="important-graph"> • However automated tools cannot report whether a meaningful ALT attribute has been given: <img src="important-graph" alt=""> • Automated tools: • Are not suitable for testing all aspects of a Web site • Need to be complemented by manual testing • Reliance on automated results without warning notices can cause confusion
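
A sketch of the kind of check the automated tools can make, i.e. whether an ALT attribute is present at all, using Python's standard HTML parser. Judging whether the text given is meaningful still requires a human reviewer.

# Sketch: flag <img> elements with missing or empty alt attributes.
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        if "alt" not in attrs:
            print("Missing alt:", attrs.get("src"))
        elif not attrs["alt"].strip():
            print("Empty alt (may be fine for decorative images):", attrs.get("src"))

checker = AltChecker()
checker.feed('<img src="important-graph">'
             '<img src="important-graph" alt="">')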

  32. Performance Implications • Automated tools: • Could degrade the performance of Web sites if poorly designed • Case Study: An HTML validation tool was used to check a Web site. Shortly afterwards it was found to be repeatedly sending HTTP requests to the site, which the site regarded as a denial-of-service attack.
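
The obvious safeguard, sketched below in Python, is to rate-limit requests so that a survey does not resemble a denial-of-service attack. The one-second delay and the site list are assumptions; choose whatever the surveyed sites can reasonably tolerate.

# Sketch: pause between requests when surveying many sites.
import time
from urllib.request import urlopen

sites = ["http://www.example1.ac.uk/", "http://www.example2.ac.uk/"]  # illustrative

for site in sites:
    urlopen(site)      # or submit the site to a checking service
    time.sleep(1.0)    # be polite: rate-limit requests to third-party servers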

  33. Legal And Ethical Issues • If your survey findings: • Give a negative impression of a Web site • Are flawed, and give a mistaken negative impression of a Web site • Could you be sued? • Case Study: WebWatch surveys seek to highlight examples of best practices. Care is taken in the language used when problems are reported.

  34. Personalisation • How should testing tools behave if Web sites provide personalised interfaces: • Make use of username details to personalise context • Personalise the interface based on the user’s browser • Personalise the interface based on other environment factors (e.g. time, referer page, language setting, etc.)

  35. Unusual Aspects • How should testing tools deal with other unusual aspects of a Web site such as: • Web page redirects • Splash screens • Pop-up windows • Frames • etc. • Example: When you enter a URL, the page is redirected to another URL. The new page displays a splash screen for 5 seconds and then moves on to a new page which contains frames. In addition a pop-up window is displayed. How many pages are there?
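
One of these cases, redirection, is easy to detect automatically; the rest generally need HTML inspection or a scriptable browser. A small Python sketch (illustrative URL) showing how a tool might record that the requested and final URLs differ:

# Sketch: detect a redirect by comparing the requested URL with the final one.
from urllib.request import urlopen

requested = "http://www.example.ac.uk/"
response = urlopen(requested)   # urllib follows redirects automatically
final = response.geturl()

if final != requested:
    print("Redirected:", requested, "->", final)
else:
    print("No redirect for", requested)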

  36. Benchmarking And QA • The benchmarking approach may be used to: • Ensure that Web sites comply with standards and best practices • May be of interest to funding bodies • UKOLN is involved in work in this area to ensure that projects comply with standards and best practices in order that they will be interoperable See <http://www.ukoln.ac.uk/qa-focus/surveys/>

  37. Who Else Is Doing This? • Who else may be carrying out benchmarking surveys? • We will use a Google search for: • “accessibility surveys”, “Web site benchmark”, “HTML compliance surveys”, etc. • In order to explore other approaches including: • Commercial approaches • Non-commercial approaches

  38. US Accessibility Surveys http://library.uwsp.edu/aschmetz/Accessible/websurveys.htm • Axel Schmetzke has carried out surveys of the accessibility of selected US University Web sites

  39. Try It For Yourself • The methodology which has been described can be applied across your own community • Benefits: • You will get an idea of how you compare with your peers • For national bodies, funders, etc. you can gain a profile of your community • There may be opportunities for describing your community at conferences, etc.

  40. Implementing A Benchmark Survey • To implement your own benchmark across a community you can simply examine the WebWatch articles and adapt the HTML for your own use • Further details at <http://www.ariadne.ac.uk/issue29/web-watch/> • Technique Used • Use the Web service on a single site • Determine the structure of the resulting URL • Copy the URL into a template • Use it as the basis for the other URLs to be surveyed, e.g. http://bobby.cast.org/bobby/bobbyServlet?URL=http%3A%2F%2Fwww2.brent.gov.uk%2F&output=Submit&gl=wcag1-aaa
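
A hedged sketch of the template technique in Python: take the query URL the checking service uses (the Bobby URL shown on the slide) and substitute each site to be surveyed into it. Only the list of sites is an assumption.

# Sketch: generate one checking-service URL per surveyed site.
from urllib.parse import quote

TEMPLATE = ("http://bobby.cast.org/bobby/bobbyServlet"
            "?URL={url}&output=Submit&gl=wcag1-aaa")

sites = ["http://www.example1.ac.uk/", "http://www.example2.ac.uk/"]  # your community

for site in sites:
    print(TEMPLATE.format(url=quote(site, safe="")))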

  41. Implementation (2) • Simple technique: • Copy URLs into a template: <a href="tool?url">Try it</a> … • Better technique: • Create the HTML file using a server-side script: <!-- query_string=http://…/tool.cgi?URL=$website --> For each website: <a href="query_string?$website">Try it</a> … • Even better technique: • Use a backend database so resources can be more easily managed
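
A sketch of the server-side script approach: generate the benchmarking page, a table of sites each with a "Try it" link into the checking service, from a simple list rather than hand-editing HTML. The template, output file name and site list are illustrative.

# Sketch: write an HTML table of "Try it" links for each surveyed site.
from urllib.parse import quote

TOOL = ("http://bobby.cast.org/bobby/bobbyServlet"
        "?URL={url}&output=Submit&gl=wcag1-aaa")
sites = {"Example 1": "http://www.example1.ac.uk/",
         "Example 2": "http://www.example2.ac.uk/"}

rows = []
for name, url in sites.items():
    link = TOOL.format(url=quote(url, safe=""))
    rows.append(f'<tr><td>{name}</td><td><a href="{link}">Try it</a></td></tr>')

html = "<table>\n" + "\n".join(rows) + "\n</table>"
with open("benchmark.html", "w") as out:
    out.write(html)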

  42. Next Generation Tools • We can expect to see further development in testing tools: • Why? • Compliance with, say, e-Government guidelines • Ensure Web sites work e.g. e-commerce • How: • Tools which provide richer functionality (e.g. dealing with personalised Web sites) • Development of “Web services” for testing • Agreement on standards e.g. what is a Web “page” • Development of XML standards for interchange of results e.g. EARL

  43. Resources For You To Use • A series of exercises on Web site benchmarking is available, which contains details of a number of benchmarking tools http://www.ukoln.ac.uk/web-focus/events/conferences/ucisa-tlig-2002/benchmarking/

  44. Questions • Any questions?
