Benchmarking of library web sites



Benchmarking Of Library Web Sites

Penny Garrod

Public Library Networking Focus

UKOLN

University of Bath

Email: [email protected]

Brian Kelly

UK Web Focus

UKOLN

University of Bath

Email: [email protected]

UKOLN is supported by:



Contents

  • Introduction

  • Background to Benchmarking at UKOLN

  • Benchmarking UK Public Library Web Sites for Accessibility and Usability

  • Survey Methodologies

  • Limitations of Approach

  • Where to from here?

BK



UKOLN

  • UKOLN:

    • National focus of expertise in digital information management

    • Based at University of Bath

    • Funded by JISC (HE and FE sector) and Resource: The Council for Museums, Archives and Libraries, together with project funding (e.g. EU and JISC)

    • About 27 FTEs

    • Carries out applied research (e.g. in metadata), software development and provides policy and advisory services

BK



UK Web Focus

  • UK Web Focus:

    • Funded by JISC to provide advice on Web developments to UK Higher & Further Education

  • Public Library Networking Focus:

    • Funded by Resource and JISC to provide advice on networking issues to UK Public Library Sector

  • Synergies

    • The Focus posts will increasingly work together to maximise benefits to the two sectors and to support the development of community working across them

BK



WebWatch Project

  • WebWatch project:

    • Initially funded for 1 year in 1997 by BLRIC to develop and use automated robot software to analyse Web developments across various UK communities

    • Once funding finished, the work continued using (mainly) freely available Web services to analyse various features of Web site communities

    • Supports community-building work across UK HE/FE Web managers (sharing, not flaming)

    • See <http://www.ukoln.ac.uk/web-focus/webwatch/>

BK



WebWatch Surveys

  • Search Engines Used To Index UK HE Web Sites:

    • ht://Dig is the most popular and is growing in popularity, followed by a Microsoft solution

    • Interest in licensed Ultraseek/Inktomi solution

    • Interest in externally-hosted indexers (e.g. Google)

    • Surprising number of institutions with no search facility

    • See <http://www.ukoln.ac.uk/web-focus/surveys/uk-he-search-engines/>

  • Nos. of Links (a minimal link-counting sketch follows this list):

    • Cambridge has most (231,000 links to all servers)

    • Sheffield has the most to a single server (46,000)

    • See <http://www.ariadne.ac.uk/issue23/web-watch/>

  • Nos. Of Web Servers:

    • Cambridge has most (200+)

    • See <http://www.ariadne.ac.uk/issue25/web-watch/>
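Link counts like those above can be reproduced without dedicated robot software. Below is a minimal sketch in Python (illustrative only; the WebWatch robot was a separate tool) that tallies the outbound links on one page by target host, using only the standard library. The entry-point URL is hypothetical.

```python
from collections import Counter
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    """Collect the href value of every <a> element on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def links_per_host(url):
    """Return a Counter mapping target host -> number of links to it."""
    html = urlopen(url).read().decode("utf-8", errors="replace")
    collector = LinkCollector()
    collector.feed(html)
    hosts = Counter()
    for href in collector.links:
        # Relative links resolve to this site's own host
        host = urlparse(urljoin(url, href)).netloc
        if host:
            hosts[host] += 1
    return hosts

if __name__ == "__main__":
    # Hypothetical entry point; the real surveys crawled whole sites
    for host, n in links_per_host("http://www.example.ac.uk/").most_common(10):
        print(f"{n:5d}  {host}")
```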

BK



    Update On Search Engines

    • Search engines used to index UK HE Web sites (two snapshots):

      Engine               Sept 1999   Jan 2002

      ht://Dig                 25          48
      Microsoft                12          17
      Ultraseek/Inktomi         7          12
      Google                    -          11
      Excite                   19           5
      Webinator                 -           5
      Harvest                   8           -
      SWISH                     5           -
      Other(s)                 23          22
      None                     59          29

    NOTE

    The growth in popularity of ht://Dig, the unexpected appearance of the externally-hosted Google service and the move away from SWISH and Harvest would not have been noticed without these snapshots. Discussion of the surveys informed decision-making.

    BK



    Benchmarking

    • WebWatch approach of monitoring UK HE Web sites can be extended into a benchmarking exercise:

      • Making comparisons with peers

      • Checking compliance with standards

      • Checking compliance with community or funders' guidelines (e.g. e-Government guidelines)

    • This has advantages for organisations:

      • Observing best practices and learning from them

      • Ditto for bad practices

      • Community building

    • and some potential disadvantages:

      • Establishment of league tables

      • Inappropriate comparisons

      • Penalty clauses for failure to comply with standards

    BK



    Benchmarking Library Web Sites

    • The WebWatch approach has been applied to a small number of UK Public Library Web sites

    • Small selection chosen in order to:

      • Keep resource requirements to a minimum

      • Validate methodology

      • Gauge interest in this approach

    • Survey sample:

      • Focus on Public Library Web sites

      • Survey undertaken in February 2002

    Details of the survey are available from <http://www.ukoln.ac.uk/web-focus/events/conferences/ili-2002/benchmarking/>

    PG



    Benchmarking Public Library Web Sites

    • Choosing the sample:

    • Web sites nominated for the EARL* ‘Best on the Web Awards’ competition 1999

      • 16 Public Library Web sites nominated from across the UK

      • Judging criteria for the award are available via the 'Wayback Machine': http://web.archive.org/

      • Criteria include: good Web site design and planning; information content; interactive features; Internet resources

    *EARL ceased to operate in Sept 2001

    PG



    Survey Methodology

    • Analysis of domain names

    • Analysis of 404 error pages

    • WAVE analysis (accessibility tool)

    • BOBBY (accessibility tool)

    • Analysis of search facilities

    • Small-scale survey to compare the accessibility of home pages plus the existence of basic usability functions

    PG



    1. Domain Names

    • Findings

      • The survey looks at each site's entry point, i.e. its domain name

      • The majority of Public Libraries currently use a .gov.uk domain

    • Discussion

      • Do the domains have a short, memorable URL?

      • Are a variety of top-level domains in use that will confuse the end user?

    Note on naming conventions: "local authorities may generally use the format area.gov.uk unless there is the possibility of confusion with another authority (e.g. city and county)"

    • From: "Modernising government: framework for information age government websites" at <http://www.e-envoy.gov.uk/publications/guidelines/webguidelines>
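This sort of domain-name analysis is easy to automate. A minimal sketch; the sample list below is illustrative, not the actual survey data.

```python
from collections import Counter
from urllib.parse import urlparse

# Illustrative entry points -- not the actual survey sample
ENTRY_POINTS = [
    "http://www.anytown.gov.uk/libraries/",
    "http://www.anytown-libraries.org.uk/",
    "http://libraries.somewhere.gov.uk/",
]

def registrable_suffix(url):
    """Return the last two labels of the host, e.g. 'gov.uk' or 'org.uk'."""
    host = urlparse(url).netloc.lower()
    return ".".join(host.split(".")[-2:])

counts = Counter(registrable_suffix(u) for u in ENTRY_POINTS)
for suffix, n in counts.most_common():
    print(f"{suffix}: {n}")
```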



    2. 404 Error Page

    • The survey recorded information on each site's 404 error page:

    • Findings

      • How many sites use a default 404 error message?

      • How many sites use a lightly branded error message?

      • How many sites provide rich functionality?

    • Issues

      • The 404 error page is (sadly) likely to be widely accessed

      • It is desirable that it:

        • Reflects the Web site's 'look-and-feel'

        • Provides functionality to assist a user who is ‘lost’:

          • Provides access to a search facility / site map

          • Provides contact details

      • The 404 page can also be context-sensitive (e.g. different pages for users following a local link / remote link / no link)
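A simple automated probe can classify 404 behaviour along these lines: request a page that should not exist and inspect the response. A minimal sketch; the probe path and the branding heuristics are illustrative assumptions, not the survey's actual method.

```python
from urllib.error import HTTPError
from urllib.parse import urljoin
from urllib.request import urlopen

def probe_404(site, probe="this-page-should-not-exist"):
    """Fetch a deliberately missing page and report how the site responds."""
    url = urljoin(site, probe)
    try:
        urlopen(url)
        return "no 404: missing pages do not return an error status"
    except HTTPError as err:
        if err.code != 404:
            return f"unexpected status {err.code}"
        body = err.read().decode("utf-8", errors="replace").lower()
        # Crude heuristics: a rich 404 page usually offers a search or site map
        if "search" in body or "site map" in body:
            return "404 with rich functionality (search/site map offered)"
        if "<img" in body or "logo" in body:
            return "lightly branded 404 page"
        return "default 404 page"

# Hypothetical entry point
print(probe_404("http://www.anytown.gov.uk/"))
```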

    PG



    3. Accessibility

    • Entry points were examined for compliance with W3C WAI (Web Accessibility Initiative) Accessibility Guidelines

    • Web-based tools used:

      [1] the WAVE 2.01

      http://www.temple.edu/inst_disabilities/piat/wave

      • Pennsylvania’s Initiative on Assistive Technology (PIAT)

      • Does not tell you if a page is accessible - no tool does this

      • Adds icons and text to the page to help you judge if it's accessible - a downloadable tutorial is available

      • Requires exercise of judgment and provides information to help you make that judgment

    PG



    4. Accessibility continued

    • Web-based tools used:

      [2] Bobby: http://www.cast.org/bobby/

      You need to select the guidelines to use:

      • The World Wide Web Consortium's (W3C) Web Accessibility Initiative (WAI) Web Content Accessibility Guidelines

      • Section 508 guidelines developed by the U.S. Federal Government.

        Select the WAI option
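No tool can declare a page accessible, but individual checkpoints can be screened mechanically. A minimal sketch for one WAI Priority 1 checkpoint (text equivalents for images): flag <img> elements with no alt attribute. This is a crude screen, not a substitute for WAVE or Bobby, and the entry point shown is hypothetical.

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class AltChecker(HTMLParser):
    """Count <img> elements, and those lacking an alt attribute
    (WAI Web Content Accessibility Guidelines, checkpoint 1.1)."""
    def __init__(self):
        super().__init__()
        self.images = 0
        self.missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            self.images += 1
            if "alt" not in dict(attrs):
                self.missing_alt += 1

def check_alt_text(url):
    html = urlopen(url).read().decode("utf-8", errors="replace")
    checker = AltChecker()
    checker.feed(html)
    return checker.images, checker.missing_alt

images, missing = check_alt_text("http://www.anytown.gov.uk/libraries/")
print(f"{missing} of {images} images lack an alt attribute")
```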

    PG



    5. Search Facility

    • The survey recorded information on search facilities:

    • Findings

      • Number of sites with a search facility: [68% of sample]

      • Is the search facility working? [two were so slow we gave up; one was not available at the time]

    • Issues

      • User expectations: many users head straight for the search facility because they know what they're looking for

      • It can take < 30 minutes (and little technical expertise) to make an externally hosted search engine available - suitable for simple static Web sites (not many people know this)
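Checking whether a search facility responds, and how slowly, can also be scripted. A minimal sketch; the query-string format and the 30-second cut-off are assumptions, since every site's search form differs.

```python
import time
from urllib.error import URLError
from urllib.request import urlopen

def time_search(results_url, timeout=30):
    """Fetch a search-results URL and report how long it takes.

    results_url must already contain the query string; parameter
    names vary from site to site, so this is per-site configuration.
    """
    start = time.monotonic()
    try:
        urlopen(results_url, timeout=timeout).read()
    except TimeoutError:
        return f"gave up after {timeout}s"
    except URLError as err:
        return f"search unavailable: {err.reason}"
    return f"responded in {time.monotonic() - start:.1f}s"

# Hypothetical query-string format
print(time_search("http://www.anytown.gov.uk/search?q=opening+hours"))
```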

    PG



    Evaluating The Results

    • Accessibility issues

      • How many sites have no WAI Priority 1 errors?

      • Are the WAVE and Bobby results consistent, or are there glaring differences?

    • Issues

      • Compliance with accessibility standards is important for ensuring access to resources for people with a range of disabilities (e.g. dyslexia)

      • Compliance with accessibility standards may be an organisational requirement and a legal requirement: Disability Discrimination Act 1995 & the Human Rights Act 1998.

      • Compliance benefits everyone - not just those with disabilities - it improves general usability.

      • Meeting the UK Government agenda: delivering e-government; social inclusion; lifelong learning etc.

    PG


    Benchmarking Of Library Web Sites

    PG



    Limitations Of Survey

    • Limitations of this type of benchmarking approach include:

      • Lack of standards

      • Limitations of the tools

      • Resources needed to carry out surveys

      • Scoping of library sites and invalid comparisons

      • Automated approach fails to address content issues which require a manual approach

      • Results of automated tools (e.g. Bobby/WAVE) often require interpretation by humans

    BK



    Limitations - Standards

    • There is a lack of standards to support benchmarking work (or conflicting standards). For example:

    • Size of a page

      How do you measure the size of the library’s entry point? You need this in order to make comparisons and if, say, you have guidelines on the maximum file size.

    • Problems

      • What do you measure (HTML file, inline images, external CSS and JavaScript files, …)?

      • Changes in file content (e.g. user-agent negotiation, news content, frames and refresh elements, etc.)

      • How do you handle the robot exclusion protocol (REP)?

    NOTE: Bobby and NetMechanic work differently: the former only measures HTML and images; the latter obeys the REP
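One way to avoid this definitional trap is to state explicitly what is summed. A minimal sketch that measures the HTML document plus its inline images (roughly the Bobby-style definition noted above); external CSS and JavaScript are deliberately excluded, which is itself a definitional choice that should be recorded with the results. The entry point is hypothetical.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class ImageLister(HTMLParser):
    """Collect the src of every <img> element."""
    def __init__(self):
        super().__init__()
        self.srcs = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src")
            if src:
                self.srcs.append(src)

def page_size(url):
    """Bytes of the HTML document plus its inline images.

    External CSS/JavaScript are excluded -- a definitional choice
    that should be stated alongside any published figures.
    """
    html = urlopen(url).read()
    lister = ImageLister()
    lister.feed(html.decode("utf-8", errors="replace"))
    total = len(html)
    for src in set(lister.srcs):   # count each image once
        total += len(urlopen(urljoin(url, src)).read())
    return total

print(f"{page_size('http://www.anytown.gov.uk/libraries/') / 1024:.1f} KB")
```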

    BK



    Limitations - Tools

    • Definitions:

      • Auditing tools tend to make implicit definitions (e.g. when measuring page size). Different results may be obtained if different tools are used (or if a vendor changes its definition)

    • Use of Web-based auditing services

      • Talk has described use of (mainly free) Web-based services

      • The providers may change their policy

      • Use of the URL interface to pass parameters (rather than direct use of the form on the Web page) may not be allowed

    • Use of desktop auditing tools

      • Use of desktop tools avoids the problems of change control of Web-based services

      • It may be difficult for others to reproduce findings

    BK



    Limitations - Resources

    • It can be time-consuming to:

      • Maintain the URLs of entry points to library Web sites (need to have close links with the provider of a central portal)

      • Manage the input to the variety of Web-based services

      • Process the output from the Web-based services (currently one must initiate an inquiry, wait for the results and manually copy and paste them)

    BK



    Limitations – Scope of Web Site

    • Scope

      • What is a Library Web site?

      • What is not part of a Library Web site?

      • It can be difficult to answer these questions.

      • There are no standard ways to define a "Web site" other than by use of domain names and directory structures (a pragmatic scoping rule is sketched after this list)

      • Even directory structures can be inadequate if they are not used correctly

    • Comparisons

      • It may not be sensible to make comparisons between libraries of different types and sizes
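As promised above, one pragmatic scoping rule is to treat a library Web site as a host plus a directory prefix. A minimal sketch; the host and prefix values are hypothetical.

```python
from urllib.parse import urlparse

def in_scope(url, host, path_prefix="/"):
    """Treat a URL as part of the library site if it is on the given
    host and under the given directory prefix."""
    parts = urlparse(url)
    return (parts.netloc.lower() == host.lower()
            and parts.path.startswith(path_prefix))

# Hypothetical scoping rule: the library site is everything under /libraries/
print(in_scope("http://www.anytown.gov.uk/libraries/opening.html",
               "www.anytown.gov.uk", "/libraries/"))   # True
print(in_scope("http://www.anytown.gov.uk/council-tax/",
               "www.anytown.gov.uk", "/libraries/"))   # False
```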

    BK



    Limitations – Automated Only

    • Use of an automated approach:

      • Would not (easily) address content issues

      • Has been supplemented with manual observations (e.g. home page, 404 page & search engine page)

    • However:

      • An automated approach can be more objective and reproducible

      • An automated approach should be less resource-intensive (once software has been set up to maintain links to resources, surveys sites and process results)

      • An automated approach could be used in conjunction with a manual survey (of a representative sample set of resources)

    BK



    Beyond A Pilot

    • Despite the limitations which have been described, would a comprehensive and systematic benchmark of, say, UK Library Web sites be of benefit?

      • Can we address the resource issues?

      • Is the lack of standards being addressed?

      • Can we find someone to do the work?

      • Should the focus be developmental?

      • Can the work be extended to provide notification of problems (e.g. search engine not working)?

    What may happen if we don’t do this?

    Might we find that funders set up inappropriate or flawed performance indicators?

    BK



    A Model For Implementation

    • The benchmarking process could be made less time-consuming if a more flexible model for managing the data were used

    At present we seem to have an HTML page with links to library Web sites

    Unfortunately HTML pages are difficult to repurpose

    A better model is to store links in a neutral database, and to generate pages for viewing by end users and for input into benchmarking Web services

    The database could also be reused for other purposes e.g. checking links and email notifications of problems

    [Diagram: a central database generates both a page for viewing by end users and a page for input to Web services]
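A minimal sketch of this model using SQLite: a single neutral table of entry points, from which both an end-user HTML page and a plain list for feeding benchmarking Web services are generated. The schema, file names and sample data are illustrative.

```python
import sqlite3

# Neutral store of entry points (illustrative schema and data)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sites (name TEXT NOT NULL, entry_point TEXT NOT NULL)")
conn.executemany("INSERT INTO sites VALUES (?, ?)", [
    ("Anytown Libraries", "http://www.anytown.gov.uk/libraries/"),
    ("Somewhere Libraries", "http://www.somewhere-libraries.org.uk/"),
])

rows = conn.execute("SELECT name, entry_point FROM sites ORDER BY name").fetchall()

# Page for viewing by end users
with open("sites.html", "w") as f:
    f.write("<ul>\n")
    for name, url in rows:
        f.write(f'  <li><a href="{url}">{name}</a></li>\n')
    f.write("</ul>\n")

# Plain list of URLs for input into benchmarking Web services
with open("sites.txt", "w") as f:
    for _, url in rows:
        f.write(url + "\n")
```

The same database could drive link checking and email notification of problems, as the slide suggests, without duplicating the list of sites.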

    BK



    Towards “Web Services”

    • Background

      • The Web was initially implemented for the provision of information

      • CGI allowed users to input data and provided integration with backend applications

      • The techniques described use a URL as input to an auditing service. However, this provides limited functionality and is susceptible to the vagaries of the marketplace

    • Future

      • "Web Services" will support machine integration by providing a standard messaging infrastructure which uses the HTTP protocol

      • XML output (e.g. EARL) will provide a neutral format for benchmarking output, and can describe the benchmarking environment (EARL is RDF-based)
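As a flavour of machine-readable benchmark output, a minimal sketch that serialises one result as simple XML. The element names are illustrative only; a real system would emit proper EARL, which is an RDF vocabulary, so that tools can exchange results.

```python
import xml.etree.ElementTree as ET

def benchmark_record(url, test, outcome):
    """Serialise one benchmark result as a small XML record.

    Illustrative element names only; a production system would emit
    EARL (an RDF vocabulary) rather than this ad-hoc format.
    """
    root = ET.Element("assertion")
    ET.SubElement(root, "subject").text = url
    ET.SubElement(root, "test").text = test
    ET.SubElement(root, "outcome").text = outcome
    return ET.tostring(root, encoding="unicode")

print(benchmark_record("http://www.anytown.gov.uk/libraries/",
                       "WAI-Priority-1", "passed"))
```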

    BK



    Need For Standard Definitions


      • There is a need for standard definitions of terminology such as Web page, visit, unique visit, session, etc. in order to ensure that meaningful and objective comparisons can be made

      • The marketplace is addressing current deficiencies within the Web advertising and Web auditing communities (and there are financial incentives for this to be solved)

      • With the growth in e-government internationally, and governments setting targets (X% of government work to be carried out electronically by 2005), the pressure for standard definitions will only increase

    BK



    Doing The Work

    • If there is further interest, who should do the work?

    Who?

      • Funding body

      • Researcher

      • Student project

      • Project partners

      • Auditing body

      • Single Regional Agency

      • Other(s)

    Why?

      • Current or new remit

      • Research interest

      • Benefits the community

      • Best Value Performance Indicators (e.g. BV157 - electronic interactions)

    What? (the benchmarking work)

      • Maintain central database

      • Software development

      • Producing reports

      • Dissemination

    PG



    What Next?

    • To summarise:

      • An approach to the automated benchmarking of a small set of Public Library Web sites has been shown

      • Implications of the findings have been discussed

      • There are limitations of the methodology

    • It is suggested that:

      • Despite the limitations of benchmarking, the approach can aid:

        • Community building

        • Learning from successes and mistakes

        • Performance Measurement/Best Value Review

      • Are there advantages in carrying out this work on a regional/local basis, or with existing partners?

    PG



    Questions

    • Any questions?

    PG



    Useful Resources

    • How People with Disabilities Use the Web: W3C Working Draft, 4 January 2001 (Human Computer Interaction) http://www.w3.org/WAI/EO/Drafts/PWD-Use-Web/20010104.html

    • Bobby: http://www.cast.org/bobby/

    • WAVE: http://www.temple.edu/inst_disabilities/piat/wave/

    PG

