
User needs. Vipavc Brvar Irena, ADP, Slovenia. IASSIST/IFDO 2005, 24-27 May 2005



Presentation Transcript


  1. User needs Vipavc Brvar Irena, ADP, Slovenia. IASSIST/IFDO 2005, 24-27 May 2005

  2. Purpose – EVALUATION The most general definition of evaluation is that it is a process of assessing the value of a certain activity or object. In the case of libraries, evaluation asks how well the library satisfies the needs of its users and what the basic limitations and weaknesses in the library’s operation are. The standard ISO 11620 (1998) defines evaluation as a process of assessing the effectiveness, efficiency, usefulness and relevancy of certain services, equipment or facilities.

  3. Lancaster (1977) states that services (in our case also data archive services) can be evaluated from different viewpoints, and that in the process we establish: • How well a service realises the goals set, which usually means how well it satisfies the needs of its users. • How efficiently a service or system realises the goals set (operating costs). • Whether the data archive justifies its existence, that is, what the usefulness of the data archive service is in light of operating costs (measuring cost-performance-benefits). Evaluation processes are important for decision making.

  4. Evaluation is part of the strategic planning process and has practical significance. In some cases it is also necessary for political reasons (to justify the expenditure of public funding and to show that it is spent sensibly). -> The future of data archives does not lie in supporting generic, broadly useful services such as information access to large collections, but in supporting “customization by community”, i.e. the development of services tailored to support the specific, and real, practices of different user constituencies.

  5. In the data archive area we have neither indicators that would measure our performance nor standard questionnaires/questions that would measure the satisfaction of our users with our services. There have been some attempts to measure user satisfaction (UK Data Archive, ICPSR, University of Tennessee, etc.). -> Changes in IT have brought us new expectations from users. We are experiencing changes of practice in data distribution and collection, and changes in user behaviour and usability. Users want codebooks and data files to be easily available.

  6. A decision was made to conduct a survey and evaluate users’ needs and their satisfaction with the services offered. We want responses from different groups of users (researchers, users and potential users): • Banner on faculties’ web pages • Registered Slovene researchers in the field of social science • Known users in our user database, contacted with a cover letter by email

  7. How to measure usability • Analytically: counters, log analyzers, tracing user paths, web site usability evaluation, etc. • Empirically: surveys...

  8. Information from web statistics (log analysis): • Number of visits and number of unique visitors • Visit duration and last visits • Days of week and rush hours • Domains/countries of visiting hosts (pages, hits, KB; 269 domains/countries detected, GeoIP detection) • Hosts list, last visits and list of unresolved IP addresses • Most viewed, entry and exit pages • Operating systems used • Browsers used • Search engines, key phrases and keywords used to find your site • Number of times your site is "added to favourites bookmarks" • Screen size • ...
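Several of the statistics above (hits, unique visitors, most-viewed pages) fall out of a simple pass over the web server's access log. The following is a minimal sketch assuming Apache combined-format log lines; the sample lines, hosts and paths are illustrative, not ADP's actual log.

```python
import re
from collections import Counter

# Match host, requested path and status from a combined-format log line.
LOG_RE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[[^\]]+\] "GET (?P<path>\S+) [^"]+" (?P<status>\d{3})'
)

def summarise(lines):
    hosts = Counter()   # hits per visiting host
    pages = Counter()   # views per page
    for line in lines:
        m = LOG_RE.match(line)
        if m and m.group("status") == "200":
            hosts[m.group("host")] += 1
            pages[m.group("path")] += 1
    return {
        "unique_visitors": len(hosts),
        "hits": sum(hosts.values()),
        "top_pages": pages.most_common(3),
    }

sample = [
    '193.2.1.5 - - [24/May/2005:10:00:01 +0200] "GET /index.html HTTP/1.1" 200 1024',
    '193.2.1.5 - - [24/May/2005:10:00:09 +0200] "GET /studies.html HTTP/1.1" 200 2048',
    '10.0.0.7 - - [24/May/2005:10:01:00 +0200] "GET /index.html HTTP/1.1" 200 1024',
]
stats = summarise(sample)
```

Dedicated log analyzers add the rest (GeoIP lookups, browser and OS detection, visit-duration heuristics), but the core counting is no more than this.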

  9. Analysis of the log file from Nesstar • Analysing user paths [Diagram: example click paths through the site – collecting study info, event news, a particular study (EN), ADP list of studies (SLO), price list, mail contact with depositors, order form]
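The path analysis shown in the diagram amounts to grouping requests by visitor and ordering them in time. A sketch, with made-up hit tuples standing in for parsed Nesstar log entries:

```python
from collections import defaultdict

# (host, ISO timestamp, requested page) triples as they might come
# out of a parsed log file; the values here are invented examples.
hits = [
    ("193.2.1.5", "2005-05-24T10:00:01", "/adp/en/studies"),
    ("10.0.0.7",  "2005-05-24T10:00:05", "/adp/si/pricelist"),
    ("193.2.1.5", "2005-05-24T10:00:30", "/adp/en/study/xyz"),
    ("193.2.1.5", "2005-05-24T10:01:10", "/adp/en/orderform"),
]

def user_paths(hits):
    # Group by host, then sort each host's hits by timestamp
    # (ISO timestamps sort correctly as strings).
    sessions = defaultdict(list)
    for host, ts, path in hits:
        sessions[host].append((ts, path))
    return {h: [p for _, p in sorted(v)] for h, v in sessions.items()}

paths = user_paths(hits)
```

A real analysis would also split one host's hits into separate sessions after a period of inactivity, but the grouping idea is the same.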

  10. Survey on user satisfaction • Pretest (current stage) • Define sampling frame and contact design • Decide on the data collecting method: a web-based survey. Separate URL addresses are created for each email address given, so demands for anonymity are satisfied.
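The per-respondent URLs can be generated as unguessable tokens, so that responses are keyed by token rather than by email address. A minimal sketch; the base URL, parameter name and token length are assumptions for illustration, not the survey's actual setup:

```python
import secrets

def invitation_urls(emails, base="https://survey.example.org/index.php"):
    # One random token per address; only the token appears in the URL,
    # so stored answers need never carry the email itself (anonymity).
    return {email: f"{base}?token={secrets.token_urlsafe(8)}" for email in emails}

urls = invitation_urls(["a@example.org", "b@example.org"])
```

The mapping from email to token is kept only for sending reminders and is discarded or stored separately from the answers.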

  11. Web survey with PHPSurveyor PHPSurveyor is a multi-question surveying tool that allows you to develop, publish and manage surveys. The PHPSurveyor script can display surveys as single questions, group by group, or all on one page, or use a data-entry system for administering paper-based versions of the survey. PHPSurveyor can produce "branching" surveys (set conditions on whether individual questions will display), can vary the look and feel of your survey through a templating system, and can provide basic statistical analysis of your survey results. A conversion to an SPSS data file was written.
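One way such a conversion can work is to export the responses as CSV and emit a matching SPSS GET DATA syntax block that reads it in. This is a hedged sketch of the idea only, not the conversion ADP actually wrote; the variable names and numeric format are invented:

```python
import csv
import io

# Invented example responses; real ones would come from the survey database.
responses = [
    {"id": 1, "overall_sat": 4, "web_sat": 5},
    {"id": 2, "overall_sat": 3, "web_sat": 2},
]

def to_csv(rows):
    # Write the responses as CSV with a header row.
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

def spss_syntax(rows, csv_path="userneeds.csv"):
    # Emit SPSS syntax that loads the CSV; every variable is declared
    # as a plain numeric (F4.0) for simplicity.
    vars_ = " ".join(f"{name} F4.0" for name in rows[0])
    return (f"GET DATA /TYPE=TXT /FILE='{csv_path}'\n"
            f'  /DELIMITERS="," /FIRSTCASE=2\n'
            f"  /VARIABLES={vars_}.\nEXECUTE.")

data_csv = to_csv(responses)
syntax = spss_syntax(responses)
```

Running the generated syntax in SPSS then yields a native data file that analysts can save as .sav.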

  12. Most current user satisfaction surveys ask users how satisfied they are and how important each particular topic is (5-point scales). To shorten the questionnaire we decided to do it a little differently: we inquire about overall satisfaction (in general, and with the web page) and ask users to evaluate particular services. Discriminant analysis!! -> create indexes that measure satisfaction and importance
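In the simplest form, such indexes are just means of the 5-point item scores per respondent. A sketch under that assumption; the item names below are illustrative, not the questionnaire's actual variables:

```python
def index(scores):
    # Mean of the answered items; None if nothing was answered.
    answered = [s for s in scores if s is not None]
    return sum(answered) / len(answered) if answered else None

# One respondent's (invented) 5-point answers.
respondent = {
    "sat_codebooks": 4, "sat_download": 5, "sat_search": 3,  # satisfaction items
    "imp_codebooks": 5, "imp_download": 4, "imp_search": 4,  # importance items
}
sat_index = index([v for k, v in respondent.items() if k.startswith("sat_")])
imp_index = index([v for k, v in respondent.items() if k.startswith("imp_")])
```

Discriminant analysis would then be used on the full data to check which items actually separate satisfied from unsatisfied users, rather than weighting all items equally as this sketch does.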

  13. Relation between satisfaction with and importance of a particular element [Chart: satisfaction plotted against importance; slide note: "This needs to be changed!!!!"]
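A common way to read a satisfaction-against-importance chart like this is by quadrant: services that score high on importance but low on satisfaction are the ones to improve first. A sketch of that reading; the quadrant labels, cut-off and the service scores are my assumptions, not content from the slide:

```python
def quadrant(sat, imp, cut=3.0):
    # Classify a service by which quadrant of the satisfaction/importance
    # plane it falls in, using `cut` as the midpoint of the 5-point scale.
    if imp >= cut:
        return "keep up" if sat >= cut else "improve first"
    return "possible overkill" if sat >= cut else "low priority"

# Invented (satisfaction, importance) index pairs per service.
services = {"codebooks": (2.5, 4.5), "news": (4.2, 2.0), "download": (4.4, 4.8)}
labels = {name: quadrant(s, i) for name, (s, i) in services.items()}
```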

  14. Questionnaire available at http://www.adp.fdv.uni-lj.si/publikacije/userneeds.html

  15. Literature • AMBROŽIČ, Melita (2003): A few countries measure impact and outcomes – most want to measure at least something. In: Performance Measurement and Metrics, Vol. 4, No. 2, pp. 64-78. • BARNES, Stuart J. and VIDGEN, Richard (2001): An Evaluation of Cyber-Bookshops: The WebQual Method. In: International Journal of Electronic Commerce, Fall 2001, Vol. 6, No. 1, pp. 11-30. • BOLLEN, Johan and LUCE, Rick (2002): Evaluation of Digital Library Impact and User Communities by Analysis of Usage Patterns. In: D-Lib Magazine, Vol. 8, No. 6. • COLEMAN, Anita and SUMNER, Tamara (2004): Digital Libraries and User Needs: Negotiating the Future. In: Journal of Digital Information, Vol. 5, Issue 3. • HERNON, Peter and ALTMAN, Ellen (1998): Assessing Service Quality: Satisfying the Expectations of Library Customers. Chicago, London: American Library Association. • NORLIN, Elaina (2002): Usability Testing for Library Web Sites: A Hands-on Guide. Chicago, London: American Library Association. • RUBIN, Jeffrey (1994): Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests. USA: John Wiley & Sons.
