Assessing Information Literacy Skills: A Look at Project SAILS


Presentation Transcript


  1. Assessing Information Literacy Skills: A Look at Project SAILS • Joseph A. Salem, Jr., Kent State University • ARL New Measures Initiatives • CREPUQ, February 11, 2005

  2. Context • Explosion of interest in information literacy • Accountability • Assessment • Formative: for planning and improvement • Summative: for evidence and documentation

  3. What Is an Information Literate Person? • Information Power – American Association of School Librarians • 9 standards • Big6 • Information Literacy Competency Standards for Higher Education – ACRL • 5 standards, 22 performance indicators, 87 outcomes, 138 objectives

  4. Our Questions • Does information literacy make a difference to student success? • Does the library contribute to information literacy? • How do we know if a student is information literate?

  5. The Idea of SAILS • Perceived need – No tool available • Project goal – Make a tool: • Programmatic evaluation • Valid • Reliable • Cross-institutional comparison • Easy to administer for wide delivery • Acceptable to university administrators

  6. The Project Structure • Kent State team • Librarians, programmer, measurement expert • Association of Research Libraries partnership • Ohio Board of Regents collaborative grant with Bowling Green State University (a portion of which funded SAILS) • IMLS National Leadership Grant • Working with many institutions • www.projectsails.org

  7. Project Parameters • Test based on the ACRL standards document • Test development model: systems design approach • Measurement model: Item Response Theory • Tests cohorts of students (not individuals) – programmatic assessment • The name: Standardized Assessment of Information Literacy Skills

  8. Test Development Systems design approach: 1. Determine instructional goal 2. Analyze instructional goal 3. Analyze learners and contexts 4. Write performance objectives 5. Develop assessment instrument • Source: Walter Dick, Lou Carey, and James O. Carey, The Systematic Design of Instruction, 6th ed. (Boston: Pearson/Allyn and Bacon, 2005)

  9. ACRL Standards – Significant Challenges • Breadth and depth • Objectives • Multi-part • Multi-level • Habits/behaviors versus knowledge

  10. Consider Skill Sets • Regrouping the ACRL objectives (and some outcomes) • 12 sets of skills organized around activities/concepts • More closely mirrors instructional efforts?

  11. Item Development Process Review competencies and draft some items: • "How can I know that a student has achieved this competency?" • Formulate a question and answers. This may take several iterations. • Develop additional responses that are incorrect, yet plausible. Aim for five answers total.
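
As a rough illustration of what a finished item looks like under this process, here is a minimal sketch; the structure and the sample question are hypothetical, not drawn from the actual SAILS item bank:

from dataclasses import dataclass
from typing import List

@dataclass
class TestItem:
    # One multiple-choice item: a stem, one keyed answer, and plausible distractors
    competency: str         # ACRL objective or skill set the item targets
    stem: str               # the question presented to the student
    key: str                # the correct response
    distractors: List[str]  # incorrect but plausible responses (aim for four)

item = TestItem(
    competency="Identifies keywords and synonyms for a research topic",  # hypothetical
    stem="Which search statement best retrieves articles on global warming?",
    key='"climate change" OR "global warming"',
    distractors=[
        "the AND global AND warming",
        "warming",
        "global warming is a serious problem",
        "weather",
    ],
)
assert len(item.distractors) + 1 == 5  # five answer choices in total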

  12. Testing the Test Items • Conduct one-on-one trials • Meet with individual students, talk through test items • Conduct small group trials • Administer set of items to group, engage in discussion after • Conduct field trials • Administer set of items to 500+ students, analyze data
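
A field trial of that size yields enough data for basic item statistics. A minimal sketch, assuming responses have been scored 0/1 per student per item (classical difficulty and point-biserial discrimination; not the project's actual analysis code):

import numpy as np

rng = np.random.default_rng(0)
# rows = students (500+ in a field trial), columns = items, 1 = correct; simulated here
responses = (rng.random((500, 8)) < np.linspace(0.85, 0.35, 8)).astype(int)

total = responses.sum(axis=1)        # each student's total score
difficulty = responses.mean(axis=0)  # proportion answering each item correctly
# point-biserial discrimination: correlation of item score with total score
discrimination = [np.corrcoef(responses[:, j], total)[0, 1] for j in range(responses.shape[1])]

for j, (p, r) in enumerate(zip(difficulty, discrimination), start=1):
    print(f"Q{j}: difficulty={p:.2f}, discrimination={r:.2f}")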

  13. Measurement Model • Item Response Theory • Also called Latent Trait Theory • Measures ability levels • Looks at patterns of responses • For test-takers • For items • Rasch measurement using software program “Winsteps” (www.rasch.org)
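
The Rasch model behind this is compact: the probability of a correct response depends only on the difference between a person's ability and an item's difficulty, both on the same logit scale. A minimal sketch for illustration (the project itself used the Winsteps software rather than hand-rolled code):

import math

def rasch_probability(ability, difficulty):
    # P(correct) = exp(ability - difficulty) / (1 + exp(ability - difficulty))
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

print(rasch_probability(0.0, 0.0))   # 0.50: ability matches difficulty
print(rasch_probability(1.0, -1.0))  # ~0.88: able person, easy item
print(rasch_probability(-1.0, 1.0))  # ~0.12: struggling person, hard item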

  14. Response Pattern Example • [Table: test-takers Person A–D plotted against questions Q1 (easier) through Q8 (harder); C = gave correct answer. Person D answered all eight questions correctly; Persons A, B, and C each answered five.]
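
To show how a response pattern like these turns into an ability estimate, here is a hedged sketch that grid-searches the maximum-likelihood Rasch ability for one pattern; the item difficulties and the five-correct pattern are hypothetical stand-ins for the slide's example:

import numpy as np

def estimate_ability(pattern, difficulties):
    # Grid-search maximum-likelihood Rasch ability estimate for one 0/1 response pattern
    abilities = np.linspace(-4, 4, 801)
    p = 1 / (1 + np.exp(-(abilities[:, None] - difficulties[None, :])))
    log_lik = (pattern * np.log(p) + (1 - pattern) * np.log(1 - p)).sum(axis=1)
    return abilities[np.argmax(log_lik)]

difficulties = np.linspace(-2, 2, 8)           # Q1 easiest ... Q8 hardest (hypothetical)
person_a = np.array([1, 1, 1, 1, 0, 1, 0, 0])  # five correct, mostly the easier items
person_d = np.ones(8, dtype=int)               # answered everything correctly

print(estimate_ability(person_a, difficulties))  # moderate ability estimate
print(estimate_ability(person_d, difficulties))  # perfect score: estimate pins at the top of the grid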

  15. Data Reports • Based on standards and skill sets • Looking at cohorts, not individuals • Show areas of strength and areas of weakness

  16. The Person-Item Map • Plots items according to difficulty level • Plots test-takers according to their patterns of responses • Can mark average score for cohorts • Cross-institutional average • Specific institution average
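
Because persons and items end up on the same logit scale, a person-item map can be drawn by plotting both along one axis. A rough, hypothetical sketch of the idea (the actual SAILS maps are produced by the project's own reporting tools):

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
item_difficulty = np.sort(rng.normal(0.0, 1.0, 20))  # hypothetical item calibrations (logits)
person_ability = rng.normal(0.3, 0.8, 200)           # hypothetical cohort abilities (logits)

fig, ax = plt.subplots(figsize=(4, 6))
# persons in the left column, items in the right, sharing one vertical logit scale
ax.plot(np.zeros_like(person_ability), person_ability, "o", alpha=0.2, label="persons")
ax.plot(np.ones_like(item_difficulty), item_difficulty, "s", color="black", label="items")
ax.axhline(person_ability.mean(), linestyle="--", label="cohort mean")
ax.set_xticks([0, 1])
ax.set_xticklabels(["persons", "items"])
ax.set_ylabel("logits (person ability / item difficulty)")
ax.legend()
plt.show()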

  17. The Bar Chart • Another representation of the information • Group averages • Major, class standing, etc. • Which groups are important to measure? • How do you know which differences in means are important?
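
On the question of which differences in means matter, one common approach is to pair a significance test with an effect size. A minimal sketch with hypothetical group scores (not the project's reporting method):

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
seniors = rng.normal(0.6, 0.9, 120)   # hypothetical cohort scores on the logit scale
freshmen = rng.normal(0.2, 0.9, 150)

t, p = stats.ttest_ind(seniors, freshmen, equal_var=False)  # Welch's t-test
pooled_sd = np.sqrt((seniors.var(ddof=1) + freshmen.var(ddof=1)) / 2)
cohens_d = (seniors.mean() - freshmen.mean()) / pooled_sd   # practical size of the gap

print(f"t = {t:.2f}, p = {p:.3f}, Cohen's d = {cohens_d:.2f}")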

  18. Current Instrument Status • 158 items developed, tested, and in use • Most ACRL learning outcomes covered • Not Standard 4: Uses information effectively to accomplish a specific purpose. • 12 skill sets developed based on ACRL document

  19. IMLS Grant Status • Phase I complete • 6 institutions participated • Feedback from institutions • Phase II underway • 36 institutions participated • Phase III started June 2004 • About 70 institutions participating • Wrap-up summer 2005

  20. Project Highlights • Discipline-specific modules • Canadian version of the instrument • Automated survey generation • Automated report generation

  21. Next Steps for SAILS • IMLS grant period ends on September 30, 2005 • Stop administering SAILS to allow analysis of the instrument • Does the instrument measure what we want it to? • Are institutions getting what they need?

  22. Next Steps for SAILS • Analyze data and input from institutions • Validate the instrument • Factor analysis and skill sets • Outside criterion testing through performance testing • Test-taker characteristics • Sex, ethnicity, class standing, GPA • Test administration methods
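
One way to probe whether the items statistically group into the intended skill sets is an exploratory factor analysis of scored responses. A loose sketch on simulated 0/1 data (a fuller validation would use methods better suited to dichotomous items; this is not the project's validation code):

import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(3)
scores = rng.integers(0, 2, size=(500, 30)).astype(float)  # 500 students x 30 scored items (simulated)

fa = FactorAnalysis(n_components=5, random_state=0)  # e.g. one factor per hypothesized skill grouping
fa.fit(scores)
loadings = fa.components_  # shape (5, 30): how strongly each item loads on each factor
print(np.round(loadings[:, :5], 2))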

  23. Next Steps for SAILS • Re-think how results can be presented or used • Scoring for the individual • Pre- and post-testing • Cut scores • Administrative challenges • Automate data analysis • Re-engineer administrative tools • Create customer interface • Test development • Develop new items

  24. Summary • Vision: • Standardized, cross-institutional instrument that measures what we think it does • To answer the question: • Does information literacy make a difference to student success?

  25. For More Information • www.projectsails.org • sails@kent.edu • Joseph Salem, jsalem@lms.kent.edu • Mary Thompson, project coordinator, mthomps1@kent.edu; 330-672-1658

  26. Questions?
