
The Library Impact Data Project: hit, miss or maybe


Presentation Transcript


  1. 9th Northumbria International Conference 22-25 August 2011 University of York The Library Impact Data Project: hit, miss or maybe Graham Stone Information Resources Manager http://eprints.hud.ac.uk/10210/ #lidp #jiscad This work is licensed under a Creative Commons Attribution 3.0 Unported License

  2. Using Usage Data since 2005… • to improve existing services • to gain insights into user behaviour • to measure the impact of the library: do the students who use the library the most get the highest grades?

  3. Discussing Open Data: JISC TILE Project meeting (Jun 2008)

  4. Non/Low Use Project: digging deeper into data

  5. Measuring Library Impact: 2008/9 honours graduates. Analysis of the results consistently revealed a correlation between e-resource use, book borrowing and student attainment. This appears to be the case across all disciplines.

  6. Results • Not a cause and effect relationship • Never proven statistically significant • Potential for collaboration on future projects http://www.flickr.com/photos/atoach/3344411469/

  7. JISC Activity Data Call • Obtained funding from the JISC Activity Data Call • 6 month project (Feb-Jul 2011)

  8. Defining Using Usage: JISC Activity Data Programme • User activity data: a record of a user’s actions on a website, software system or other relevant institutional service • Attention data: a record of what a user has viewed on a website, software system or other relevant institutional service http://bit.ly/g6S5wH

  9. Library Impact Data Project

  10. To prove the hypothesis that… “There is a statistically significant correlation across a number of universities between library activity data and student attainment”

  11. Data requirements • For each student who graduated in a given year, the following data was required: • Final grade achieved • Number of books borrowed • Number of times e-resources were accessed • Number of times each student entered the library, e.g. via a turnstile system that requires identity card access • School/Faculty
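As an illustration only (the field names are assumptions, not the project's actual schema), the anonymised per-student record described on slide 11 might look something like this in Python:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class StudentUsageRecord:
    """One anonymised row per graduating student (hypothetical field names)."""
    final_grade: str                 # e.g. "1st", "2:1", "2:2", "3rd"
    books_borrowed: int              # total loans over the course of study
    eresource_accesses: int          # e.g. Athens/MetaLib logins
    library_entries: Optional[int]   # turnstile counts; None where not collected
    school: str                      # School/Faculty, for per-discipline analysis

example = StudentUsageRecord("2:1", 63, 118, 240, "Applied Sciences")
```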

  12. Legal issues • Consultation with JISC Legal, the University legal officer and data protection officer • Ensured that any identifying information is excluded before the data is handled for analysis • Excluded any small courses to prevent identification of individuals, e.g. where a course has fewer than 35 students and/or fewer than 5 students at a specific degree level • Received guidance from the Using OpenURL Activity Data project
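A minimal sketch of how the small-course exclusion might be applied before the data are released, assuming simple per-student dicts and treating the thresholds on the slide (35 students per course, 5 per degree classification) as given; the function and field names are hypothetical:

```python
from collections import Counter

def exclude_small_courses(records, min_course_size=35, min_per_grade=5):
    """Drop whole courses that are too small to release safely.

    A course is excluded if it has fewer than `min_course_size` students
    or fewer than `min_per_grade` students awarded any one classification
    (one hypothetical reading of the slide's "and/or" rule).
    `records` are dicts with at least 'course' and 'final_grade' keys.
    """
    by_course = {}
    for rec in records:
        by_course.setdefault(rec["course"], []).append(rec)

    safe = []
    for recs in by_course.values():
        grade_counts = Counter(r["final_grade"] for r in recs)
        if len(recs) >= min_course_size and min(grade_counts.values()) >= min_per_grade:
            safe.extend(recs)
    return safe
```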

  13. Data issues • Anticipated that there might be problems in getting enough data to make the project viable • Potential partners were asked to confirm that they could provide at least 2 of the 3 measures of usage as well as student grades • Huddersfield provided definitions of the data required and the form in which the data could be accepted

  14. Library Impact Data Project: book loans (2009/10)

  15. Library Impact Data Project: book loans & Athens (2009/10)

  16. Library Impact Data Project: library PC logins & visits (2009/10)

  17. Can we prove the hypothesis? Hit, miss or maybe? • Because the attainment data (degree classifications) are not continuous, a standard correlation coefficient cannot be calculated http://www.flickr.com/photos/jepoirrier/2043728206/

  18. Further statistical tests (1) • Running a Kruskal-Wallis test • to indicate whether there is a difference between groups, e.g. between levels of e-resource usage across degree results • THEN we analyse the data visually to check which variables to compare
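A hedged sketch (not the project's own code) of how such a test could be run with SciPy, using purely illustrative e-resource usage counts grouped by degree result:

```python
from scipy.stats import kruskal

# Hypothetical e-resource access counts per student, grouped by degree result
# (illustrative numbers only, not project data).
usage_first  = [210, 185, 340, 95, 260]
usage_upper2 = [120, 90, 150, 60, 200]
usage_third  = [15, 0, 40, 22, 8]

# Kruskal-Wallis H-test: is there a difference somewhere between the groups?
# It does not say which groups differ, hence the visual check mentioned on
# the slide before any pairwise comparison.
h_stat, p_value = kruskal(usage_first, usage_upper2, usage_third)
print(f"H = {h_stat:.2f}, p = {p_value:.4f}")
```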

  19. Further statistical tests (2) • Running the Mann-Whitney U test to see whether there is a significant difference between the variables tested • Matching these findings with previous tests using ANOVA and Student's t-test
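And a matching sketch for the pairwise follow-up, again with illustrative numbers only:

```python
from scipy.stats import mannwhitneyu

# Hypothetical book-loan counts for two degree-result groups singled out
# by the visual check (illustrative numbers only).
loans_first = [48, 60, 35, 72, 55, 41]
loans_third = [5, 12, 0, 9, 20, 3]

# Two-sided Mann-Whitney U test: do the two distributions differ significantly?
u_stat, p_value = mannwhitneyu(loans_first, loans_third, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.4f}")
```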

  20. Can we prove the hypothesis? Hit, miss or maybe? • The relationship and variance mean that you can believe what you see • And you can believe it across a range of data, e.g. subjects • So library usage does impact on students' attainment • But this is not a cause and effect relationship http://www.flickr.com/photos/biggaypat/10431691/

  21. Linking back to non/low usage • Our research shows that for book and e-resource usage there appears to be a statistically significant relationship across all partner libraries • If we know that there is a link between usage and attainment • We can link this back to non/low usage

  22. Measuring Library Impact: 2008/9 – library visits. 15.5% of students who gained a 1st never visited the library; 34% of students who gained a 3rd never visited the library

  23. Measuring Library Impact: 2008/9 – MetaLib usage. 10.5% of students who gained a 1st logged in more than 180 times; 70% of those who gained a 3rd logged in to e-resources 20 times or less over 3 years

  24. Measuring Library Impact: 2008/9 – book loans. 15% of students who gained a 1st never borrowed a book; 34% of students who gained a 3rd never borrowed a book

  25. Profiling non/low users • Flesh out themes from the focus groups • to advise on areas to work on • Check the amount and type of contact subject teams have had with the specific courses • to compare library teaching hours to attainment • Baseline questionnaire or exercise for new students • To establish the level of information literacy skills for new students • Target our users by concentrating staff resources at the right point

  26. Focus Group coding

  27. Further work • Gender, socio-economic background and nationality? • What types of data are needed by library directors, e.g. for the scenario ‘if budget cuts result in fewer resources, does attainment fall’? • We are keen to see if we add value as a library through better use of resources…

  28. Scenarios • Does a student who comes in with high grades leave with high grades? If so, why? What do they use that makes them so successful? • What if a student comes in with lower grades but achieves a higher grade on graduation after using library resources? What did they do to show this improvement? • Quite often students who look to be heading for a 2nd drop to a 3rd in the final part of their course; why is this? • What about high achievers who don’t use our resources? What are they doing in order to be successful, and should we be adopting what they do in our resources/literacy skills sessions?

  29. Acknowledgements Dave Pattern and Bryony Ramsden Phil Adams, Leo Appleton, Iain Baird, Polly Dawes, Regina Ferguson, Pia Krogh, Marie Letzgus, Dominic Marsh, Habby Matharoo, Kate Newell, Sarah Robbins, Paul Stainthorp

  30. Thank you Graham Stone g.stone@hud.ac.uk @Graham_Stone This work is licensed under a Creative Commons Attribution 3.0 Unported License • http://library.hud.ac.uk/blogs/projects/lidp/ • http://eprints.hud.ac.uk/10210/
