
Project SAILS: Facing the Challenges of Information Literacy Assessment

Julie Gedeon, Carolyn Radcliff, Rick Wiggins – Kent State University. EDUCAUSE 2004 Conference, Denver, Colorado.


Presentation Transcript


  1. Project SAILS: Facing the Challenges of Information Literacy Assessment • Julie Gedeon, Carolyn Radcliff, Rick Wiggins • Kent State University • EDUCAUSE 2004 Conference, Denver, Colorado

  2. What is information literacy? • The ability to locate, access, use, and evaluate information efficiently and effectively. • Guiding document: “Information Literacy Competency Standards for Higher Education” – Association of College & Research Libraries (http://www.ala.org/ala/acrl/acrlstandards/informationliteracycompetency.htm)

  3. Our questions • Does information literacy make a difference to student success? • Does the library contribute to information literacy? • How do we know if a student is information literate?

  4. The Idea of SAILS • Perceived need – no existing tool • Project goal – build a tool that: • Supports program evaluation • Is valid • Is reliable • Enables cross-institutional comparison • Is easy to administer for wide delivery • Is acceptable to university administrators

  5. Project parameters • Test • Systems design approach • Measurement model – Item Response Theory • Tests cohorts of students (not individuals) • A name • Standardized Assessment of Information Literacy Skills

  6. The project structure • Kent State team • Ohio Board of Regents collaborative grant with Bowling Green State University (part for SAILS) • IMLS National Leadership Grant • Association of Research Libraries partnership • www.projectsails.org

  7. Technical components • Environment • Item builder • Survey builder • Survey generator • Report generation • Challenges

  8. Environment • Linux (Red Hat) • Apache • MySQL • PHP

  9. Survey process • Create survey questions (items) • Create survey for this phase • Add schools for this phase • Schools create web front-end • Collect data

  10. Item Builder

  11. Item maintenance

  12. Survey Builder

  13. Survey items

  14. Random selection of items
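The slides don't show how items are drawn from the pool, but random selection per test-taker can be sketched as a simple sample-without-replacement, shown here in Python (the actual system was PHP; the pool size of 126 matches slide 26, while the 45-item draw is an illustrative assumption):

```python
import random

def select_items(item_pool, n_items, seed=None):
    """Draw a random subset of survey items for one test-taker.

    item_pool: list of item IDs; n_items: how many to present.
    A seed can be supplied to make a draw reproducible.
    """
    rng = random.Random(seed)
    return rng.sample(item_pool, n_items)  # sample without replacement

# Example: a pool of 126 items, 45 presented to one student.
pool = list(range(1, 127))
selected = select_items(pool, 45, seed=1)
print(len(selected))  # 45
```

Because IRT scores cohorts rather than individuals, different students can safely see different random subsets of the pool.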

  15. School information

  16. SAILS front-end

  17. Redirection to SAILS web site Parameters passed: • Unique student identifier • School code • Authorization code
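The school-to-SAILS hand-off above amounts to building a redirect URL that carries the three parameters. A minimal Python sketch (parameter names here are hypothetical; the slides list only what is passed, not how it is encoded):

```python
from urllib.parse import urlencode, urlparse, parse_qs

def build_redirect_url(base_url, student_id, school_code, auth_code):
    """Append the three hand-off parameters to the SAILS survey URL."""
    params = urlencode({
        "student_id": student_id,  # unique student identifier
        "school": school_code,     # school code
        "auth": auth_code,         # authorization code
    })
    return f"{base_url}?{params}"

url = build_redirect_url(
    "https://survey.example.edu/sails", "S12345", "KSU", "a1b2c3")
print(url)
```

The authorization code lets the SAILS server reject requests that did not originate from a participating school's front-end.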

  18. Link test

  19. Demographic data

  20. Survey questions

  21. Report process • Send schools unique identifiers • Upload demographics • Scan & upload paper surveys • Generate entire dataset file • Offline IRT analysis • Upload IRT results • Generate reports
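The "generate entire dataset file" step above joins each school's uploaded demographics to the collected responses on the unique student identifier. A minimal sketch of that join (field names are illustrative, not taken from the actual SAILS schema):

```python
def merge_dataset(demographics, responses):
    """Join demographic rows to response rows on the unique student id,
    producing one flat record per student for offline IRT analysis."""
    demo_by_id = {row["student_id"]: row for row in demographics}
    merged = []
    for resp in responses:
        record = dict(demo_by_id.get(resp["student_id"], {}))
        record.update(resp)  # response fields override on key collision
        merged.append(record)
    return merged

demographics = [{"student_id": "S1", "class_year": "freshman"}]
responses = [{"student_id": "S1", "item_7": "B"}]
dataset = merge_dataset(demographics, responses)
# dataset[0] now holds both demographic and response fields
```

Keeping the identifier opaque (issued by SAILS, mapped to real students only at the school) lets the analysis stay anonymous while still supporting cohort breakdowns by demographic group.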

  22. Sample report text

  23. Sample report graph

  24. Technical challenges • Creation of the front-end • Customizations for schools • Automating the data analysis • Supporting different languages

  25. Data analysis • Item Response Theory • Measures ability levels • Looks at patterns of responses • For test-takers • For items (questions) • Based on standards and skill sets • Shows areas of strength and areas of weakness
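As a concrete illustration of the Item Response Theory approach named above: in the common two-parameter logistic (2PL) form, the probability of a correct response depends on the test-taker's ability and the item's difficulty and discrimination. The slides don't specify which IRT model SAILS used, so this is the general form, not the project's exact analysis:

```python
import math

def p_correct(theta, a, b):
    """Two-parameter logistic IRT model: probability that a test-taker
    of ability theta answers correctly an item with discrimination a
    and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# A test-taker whose ability equals the item's difficulty has a 50%
# chance of answering correctly, whatever the discrimination.
print(p_correct(0.0, 1.2, 0.0))  # 0.5
```

Fitting such a model to the whole response matrix is what lets the analysis characterize both items and cohorts, and hence report areas of strength and weakness against the standards and skill sets.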

  26. Status • Instrument • 126 items developed, tested, and in use • Web-based and paper-based administration • Grant Project - IMLS • Phase I complete - 6 institutions • Phase II complete - 34 institutions • Phase III began June 2004 - 77 institutions

  27. Next steps for SAILS • Analyze data and other input • Administrative challenges • Self-reported demographic data • Testing environment • Report generation • Does the instrument measure what we want it to? • Are institutions getting what they need?

  28. Summary • Vision: • Standardized, cross-institutional instrument that measures what we think it does • To answer the questions: • Do students gain information literacy skills? • Does information literacy make a difference to student success?

  29. For more information • www.projectsails.org • sails@kent.edu • Julie Gedeon, jgedeon@kent.edu • Carolyn Radcliff, radcliff@kent.edu • Rick Wiggins, rwiggins@lms.kent.edu • Mary Thompson, project coordinator mthomps1@kent.edu; 330-672-1658
