
BibEval – A framework for usability evaluations of online library services

BibEval – A framework for usability evaluations of online library services. Thomas Weinhold, Bernard Bekavac, Sonja Hamann* — Swiss Institute for Information Research (SII), Namics AG*. Libraries in the Digital Age (LIDA) 2014, 16-20 June 2014, Zadar (Croatia).


  1. BibEval – A framework for usability evaluations of online library services. Thomas Weinhold, Bernard Bekavac, Sonja Hamann* — Swiss Institute for Information Research (SII), Namics AG*. Libraries in the Digital Age (LIDA) 2014, 16-20 June 2014, Zadar (Croatia)

  2. e-lib.ch – Swiss electronic library [Slide diagram: map of e-lib.ch sub-projects across Switzerland, including e-rara.ch, marketing e-lib.ch, DOI desk, Kartenportal.ch, the e-lib.ch web portal, metadata servers, swissbib, best practices, E-Depot, retro.seals.ch, Infoclio.ch, long-term preservation, e-codices, ElibEval, Infonet Economy, information literacy, ACCEPT, search skills, RODIN and Multivio, located in Basel, Bern, Zürich, Chur, Fribourg, Genève and Martigny] • Innovation and cooperation initiative with 20 sub-projects • Vision: creation of a national portal to improve access to and retrieval of scientific information (www.e-lib.ch)

  3. Sub-project "ElibEval" • Usability evaluations of web sites and applications developed in the context of e-lib.ch • Conception and creation of analytical tools to support information providers in carrying out their own evaluations

  4. Situation of libraries • Changes in environment and increasing competition • Mission: "[..] encourage the library and information sector to work with partners and users to maximize the potential of digital technology to deliver services that enable seamless and open access by users to cultural and information resources." (IFLA Strategic Plan 2010-15, http://www.ifla.org/files/hq/gb/strategic-plan/2010-2015.pdf) Offer the same ease of use, robustness and performance as internet search engines combined with the quality, trust and relevance traditionally associated with libraries

  5. Challenges for libraries [Slide diagram: the traditional library (physical collection, books/journals, library catalogue, loan, staff, support, printer, copier, accommodation/premises) contrasted with the digital library (website, digital collection, databases, indexing, download, archiving, additional information, management)] • Merging of heterogeneous information • Organizing the interaction of various systems so that users can pursue their objectives without hindrance ("don't burden users with library internals")

  6. User-perceived quality of library online services (Tsakonas & Papatheodorou, 2006) “The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use.” (ISO 9241-11: Guidance on usability)

  7. Usability evaluation methods • Two main criteria to categorize usability evaluation methods: • When (formative vs. summative evaluation) • Who (user-oriented/empirical vs. expert-oriented/analytical methods)

  8. Usability evaluation of online library services • As in our own project, libraries generally use a wide spectrum of methods for usability evaluations • Kupersmith (2012) provides a good overview • According to this literature review, the most commonly used method is user observation / usability testing: it captures real user behaviour, but is time-consuming and expensive • Heuristic evaluation is a widely used alternative (cheaper, quicker)

  9. Heuristic evaluation • Experts examine whether an interface is compliant with established usability principles (the "heuristics") • Nielsen's heuristics (1994): visibility of system status; match between system and real world; user control and freedom; consistency and standards; error prevention; recognition rather than recall; flexibility and efficiency of use; aesthetic and minimalist design; help users recognize, diagnose and recover from errors; help and documentation (http://www.nngroup.com/articles/ten-usability-heuristics/)

  10. Motivation for the development of library-specific heuristics • Most studies limit themselves to common heuristics, e.g. Nielsen's 10 heuristics • Lack of library-specific recommendations (e.g. Clyde 1996, Clausen 1999, Raward 2001) • Problems of common heuristics: too generic for an in-depth analysis; extensive knowledge in the field of user interface design is needed • Our goal: develop more detailed heuristics which • are suited to the specific requirements of library websites • are easy to use and allow a judgement even by non-experts • assist developers in building user-friendly library websites

  11. Methodical approach • Three cornerstones: • literature review “usability evaluations in libraries” • best-practice-analysis of library websites • focus group (to discuss and further refine our concept) • Result: • modular, hierarchically structured list of evaluation criteria • all website elements and evaluation criteria classified into mandatory and optional This concept aims at maximizing the applicability of the heuristics for libraries of different size and type.

  12. BibEval – Structure • 4 sectors: information & communication; search & explore the collection(s); personalization & customization; user participation • Each sector is divided into sub-sectors (e.g. search & exploration: simple search, advanced search, presentation & access) • Different components in each sub-sector • Questions/evaluation criteria for each hierarchy level
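The hierarchy described on slides 11 and 12 (sectors → sub-sectors → components, with criteria at each level and a mandatory/optional classification) can be sketched as a small data model. This is an illustrative reconstruction, not BibEval's actual implementation; all class and field names are assumptions.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch of BibEval's hierarchical criteria catalogue:
# sectors contain sub-sectors, which contain components; evaluation
# criteria can be attached at each hierarchy level, and components
# and criteria carry a mandatory/optional flag.

@dataclass
class Criterion:
    question: str
    mandatory: bool = True

@dataclass
class Component:
    name: str
    mandatory: bool = True
    criteria: List[Criterion] = field(default_factory=list)

@dataclass
class SubSector:
    name: str
    components: List[Component] = field(default_factory=list)
    criteria: List[Criterion] = field(default_factory=list)

@dataclass
class Sector:
    name: str
    sub_sectors: List[SubSector] = field(default_factory=list)

# One of the four sectors, populated with examples from the slide
catalogue = [
    Sector("search & exploration", [
        SubSector("simple search", [
            Component("search field", criteria=[
                Criterion("Is the search field easy to locate?"),
            ]),
        ]),
        SubSector("advanced search"),
        SubSector("presentation & access"),
    ]),
]

def applicable(components):
    """Tailor the catalogue: a small library keeps only mandatory components."""
    return [c for c in components if c.mandatory]
```

The optional flag is what makes the catalogue usable for libraries of different size and type: evaluators deselect components their site simply does not offer.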

  13. Usage of BibEval – Selection of sectors and components http://www.cheval-lab.ch/en/usability-of-library-online-services/criteria-catalogue-bibeval/

  14. BibEval – Severity rating and comments
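Slide 14 shows the rating step: for each criterion, evaluators record a severity judgement plus a free-text comment. A minimal sketch of that step, assuming the common 0-4 severity scale used in heuristic evaluation (0 = no problem … 4 = usability catastrophe); the function and field names are hypothetical, not BibEval's actual data format.

```python
from collections import Counter

# Severity labels from the common heuristic-evaluation rating scale
SEVERITY_LABELS = {
    0: "no usability problem",
    1: "cosmetic problem",
    2: "minor problem",
    3: "major problem",
    4: "usability catastrophe",
}

def record_finding(findings, criterion, severity, comment=""):
    """Attach a severity rating and an optional comment to one criterion."""
    if severity not in SEVERITY_LABELS:
        raise ValueError(f"severity must be 0-4, got {severity}")
    findings.append({"criterion": criterion,
                     "severity": severity,
                     "comment": comment})

def summary(findings):
    """Count findings per severity label, as a basis for a simple report."""
    counts = Counter(f["severity"] for f in findings)
    return {SEVERITY_LABELS[s]: n for s, n in sorted(counts.items())}

findings = []
record_finding(findings, "Is the search field easy to locate?", 3,
               "Search box hidden below the fold on the start page")
record_finding(findings, "Are error messages understandable?", 1)
```

A per-severity summary like this is also the natural input for the report/export functions shown on the following slide.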

  15. BibEval – Reports / Export functions

  16. BibEval – Project Administration

  17. Project Administration

  18. Conclusions and further work • One criticism levelled against heuristic evaluation: in-depth knowledge of HCI is required in order to apply this method correctly (Blandford et al. 2007; Warren 2001) • In formulating our evaluation criteria, we focused on end-user perspectives • Continuous improvement of our criteria catalogue to keep it up to date • Extension of our web application (e.g. deleting questions, adding own criteria) • Enable libraries to conduct self-evaluations • Further development through the community

  19. Your Questions? thomas.weinhold@htwchur.ch bernard.bekavac@htwchur.ch sonja.hamann@namics.com http://www.cheval-lab.ch http://www.e-lib.ch
