
Practice Makes Perfect: applying and adapting best practices in information literacy

Practice Makes Perfect: applying and adapting best practices in information literacy. Sheril Hook, Instruction Coordinator; Esther Atkinson, Liaison Librarian; Andrew Nicholson, GIS/Data Librarian; University of Toronto Mississauga. WILU Conference, May 18, 2007.


Presentation Transcript


  1. Practice Makes Perfect: applying and adapting best practices in information literacy Sheril Hook, Instruction Coordinator; Esther Atkinson, Liaison Librarian; Andrew Nicholson, GIS/Data Librarian University of Toronto Mississauga WILU Conference, May 18, 2007

  2. Agenda • IL Program Development (Sheril) • Category 5: articulation with the curriculum • Examples of BP Category 5 (Andrew) • research-based learning • IL learning outcomes • IL Program Development (Sheril) • Category 10: Assessment/Evaluation • Examples of BP Category 10 (Esther) • data and its impact on instruction and planning

  3. ALA/ACRL Characteristics of Programs of Information Literacy that Illustrate Best Practices Category 5: Articulation with the Curriculum Articulation with the curriculum for an information literacy program: • is formalized and widely disseminated; • emphasizes student-centered learning; • uses local governance structures to ensure institution-wide integration into academic or vocational programs; • identifies the scope (i.e., depth and complexity) of competencies to be acquired on a disciplinary level as well as at the course level; • sequences and integrates competencies throughout a student’s academic career, progressing in sophistication; and • specifies programs and courses charged with implementation. http://www.ala.org/ala/acrl/acrlstandards/characteristics.htm

  4. IL Program Development: Planning Part 1 • ACRL Best Practices Document • environmental scan • internal scan & internal development • external scan & external development • current state & next steps • Shared Philosophical Framework • training & development • informing our pedagogical practices • developing expertise as shared responsibility • use of IL Standards and terminology

  5. Environmental Scan • Core curricula • (horizontal/vertical integration in Part 2) • Departmental goals • Required courses for baseline expectations • Representation on curriculum committees • Movements in teaching/learning • student engagement

  6. Environmental Scan • Student Engagement • NSSE http://nsse.iub.edu/ • Peer learning, aka peer assisted learning, supplemental instruction • http://www.peerlearning.ac.uk/ • http://www.umkc.edu/cad/SI/index.htm • Re-invention Center http://www.sunysb.edu/Reinventioncenter/ • Inquiry-based, discovery, problem-based, or research-based learning

  7. http://www.reinventioncenter.miami.edu/BoyerSurvey/index.html

  8. http://www.reinventioncenter.miami.edu/pdfs/2001BoyerSurvey.pdf

  9. Student Engagement • research-based learning • problem-based learning • inquiry-based learning • discovery learning • knowledge building (Scardamalia, M., & Bereiter, C., 2003)

  10. Shared Philosophical Framework • information literacy as concept • tool-based vs. concept-based teaching • other literacies, e.g., technology, media, spatial, data • inventory of current practices and outreach activities • articles & workshops that help develop framework • Learning theory • Bloom’s taxonomy • SOLO Taxonomy (Biggs) • development & use of assessment tools

  11. What is embedded IL? Embedded • Assignment(s) collaboratively developed with instructor. IL learning outcomes stated in instructor's course materials. Session by librarian may or may not have been delivered during class time (e.g., series of walk-in workshops). Integrated • Session content tailored to course assignment in consultation with instructor. Session may or may not have been delivered during class time (e.g., series of open workshops available to students). Session may or may not have been optional. Supplemental • Generic information literacy instruction, not tied directly to course outcomes or an assignment. Session may or may not have been optional for students. Session may or may not have been delivered during class time. ANZIIL Framework, 2004, p. 6; ACRL, 2007; Learning Commons, University of Guelph, n.d.

  12. IL Standards Standard One The information literate student determines the nature and extent of the information needed Performance Indicator 2. The information literate student identifies a variety of types and formats of potential sources for information. Outcomes include • Knows how information is formally and informally produced, organized, and disseminated • Recognizes that knowledge can be organized into disciplines that influence the way information is accessed • Identifies the value and differences of potential resources in a variety of formats (e.g., multimedia, database, website, data set, audio/visual, book) • Differentiates between primary and secondary sources, recognizing how their use and importance vary with each discipline • Realizes that information may need to be constructed with raw data from primary sources "Information Literacy Competency Standards for Higher Education." American Library Association. 2006. http://www.ala.org/acrl/ilcomstan.html (accessed 15 May 2007)

  13. Examples of IL Standards tailored and embedded into course curricula

  14. U of T Mississauga Library When we collaborate with our instructors on designing a class assignment, we emphasize • the Library Vision, "Leading for Learning" • the availability of thousands of research and information resources through the U of T Libraries: • as of May 15, 2007: 395,184 e-holdings including e-books, journals, newspapers, etc. • the key role of these resources in enhancing student engagement with their learning.

  15. U of T Mississauga Library We also stress to instructors that our electronic resources can be utilized • to enhance their instructional content. • to foster an active learning environment in the course. Students will begin to think both conceptually and critically about the material. • to develop information literacy competencies among the students, such as retrieving and critically evaluating information in any format. More details about information literacy can be found at the Association of College & Research Libraries (ACRL) website. http://www.ala.org/ala/acrl/acrlstandards/informationliteracycompetency.htm Many disciplines are now releasing their own information literacy standards, based on the ACRL model.

  16. Examples from • Social Sciences • Sciences • Humanities

  17. Assignment: Changes in Canadian Society • Outcomes • identify and locate statistics needed • evaluate statistics for use (do they cover the correct geography, time period, etc.?) • analyze statistics • communicate the results in a term paper and presentation • acknowledge the use of information Social Sciences 1.

  18. Research Question • By examining census data related to occupation, how have women's working lives changed over a 100-year period? Social Sciences 2.

  19. Outcomes • identify and locate statistics needed. • Students recognize that the Census collects statistics on occupation Social Sciences 3.

  20. Outcomes • evaluate statistics for use. • Students differentiate between census years and census geographies available. • Students identify value and differences of resources in a variety of formats. Social Sciences 4.

  21. Outcomes • analyze statistics • Students recognize the occupation categories being used Social Sciences 5.

  22. Outcomes • analyze statistics • Students create a cross tabulation table between Occupation and Sex 1901 Census of Canada: Occupation by Sex Social Sciences 6.
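
The cross tabulation step can be sketched in plain Python. This is an illustrative sketch only: the records and category labels below are hypothetical, not actual census figures; a real analysis would read occupation and sex fields from the census public-use microdata file.

```python
from collections import Counter

# Hypothetical microdata records, one per respondent, as (occupation, sex)
# pairs. These values are made up for illustration.
records = [
    ("Clerical", "F"), ("Clerical", "F"), ("Clerical", "M"),
    ("Agriculture", "M"), ("Agriculture", "M"), ("Manufacturing", "F"),
]

# The cross tabulation counts respondents in each (occupation, sex) cell.
crosstab = Counter(records)

for (occupation, sex), count in sorted(crosstab.items()):
    print(f"{occupation:<15}{sex}  {count}")
```

The same counting logic applies whether the table is built by hand, in a spreadsheet, or with a statistics package; the cells are simply frequencies of each occupation-by-sex combination.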

  23. Outcomes • analyze statistics 2001 Census of Canada: Occupation by Sex • Students next identify and locate the 2001 Census variables relating to occupation and sex. • On the next slide, a 2001 Census cross tabulation is compared with the 1901 Census cross tabulation. • Students will recognize that occupation categories have changed over the 100-year time span. • Students realize that the data can be extrapolated into multiple categories. Social Sciences 7.

  24. Outcomes • analyze statistics 1901 Census of Canada: Occupation by Sex 2001 Census of Canada: Occupation by Sex Social Sciences 8.

  25. Outcomes • communicate the results in term paper and presentation • Students add tables to term paper and also to a class slideshow presentation. • acknowledge the use of information 1901 Census of Canada Bibliographic Entry Canada. Statistics Canada. Census of Canada, 1901: public use microdata file – individuals file [computer file]. Victoria, B.C.: University of Victoria; Canadian Families Project [producer] [distributor]. January 2002. <http://myaccess.library.utoronto.ca/login?url=http://r1.chass.utoronto.ca/sdaweb/html/canpumf.htm> 2001 Census of Canada Bibliographic Entry Canada. Statistics Canada. Census of Canada, 2001: public use microdata file - individuals file [computer file]. Revision 2. Ottawa, Ont.: Statistics Canada [producer]; Statistics Canada. Data Liberation Initiative [distributor], 2006/04/26. (STC 95M0016XCB) <http://myaccess.library.utoronto.ca/login?url=http://r1.chass.utoronto.ca/sdaweb/html/canpumf.htm> Social Sciences 9.

  26. Examples from • Social Sciences • Sciences • Humanities

  27. Assignment: Cited Reference Searching in the Sciences • Outcomes • evaluate available resources to see if their scope will include citation tracking statistics and journal impact factor • locate and interpret the citation information Sciences 1.

  28. Research Question Wyttenbach, R. and Hoy, R. "Demonstration of the precedence effect in an insect." Journal of the Acoustical Society of America 94 (2): 777-784, Aug 1993. • Before including this reference in a paper, check to see how "reputable" both the article and the journal are in the discipline. Should it be included? Sciences 2.

  29. Outcomes • Evaluate available resources to see if their scope includes citation tracking. • Students recognize that journal articles have value in a particular discipline and that they can be measured in a variety of ways, including specialized citation indexes. Sciences 3.

  30. Outcomes • Evaluate available resources. • Students recognize the ability to perform cited reference searching in a variety of ways. Sciences 4.

  31. Outcomes • locate and interpret the citation information. • Students locate the citation and realize that the authors consulted a variety of sources ("Cited References") and, more importantly, that this citation has been cited frequently ("Times Cited") in the years since publication. Sciences 5.

  32. Outcomes • interpret the citation information. • Students can review the cited references from the article and examine the origins of the research. Sciences 6.

  33. Outcomes • interpret the citation information. • By checking the "Times Cited", students gain insight into the impact of the article in the discipline. Sciences 7.

  34. Outcomes • interpret the citation information. • Students also access the JCR (Journal Citation Reports) to check the "Impact Factor". Sciences 8.
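
The "Impact Factor" students look up is a simple ratio, which can be shown with a small worked example. The counts below are made up for illustration, not real JCR figures for any journal.

```python
# The standard two-year impact factor: citations in a given year to items
# the journal published in the previous two years, divided by the number
# of citable items it published in those two years.
# Hypothetical counts, for illustration only:
citations_to_prior_two_years = 1200
citable_items_prior_two_years = 400

impact_factor = citations_to_prior_two_years / citable_items_prior_two_years
print(f"Impact factor: {impact_factor:.2f}")  # prints "Impact factor: 3.00"
```

Seeing the arithmetic helps students interpret the figure critically: a high impact factor reflects average citation rates across a journal, not the quality of any single article in it.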

  35. Outcomes • interpret the citation information. • Students can also rank other journals in the discipline by impact factor. Sciences 9.

  36. Examples from • Social Sciences • Sciences • Humanities

  37. Assignment: Myth over Time • Outcomes • Explore the dynamism of myth by comparing and contrasting a selection of ancient and modern primary sources of a myth (at least one literary, one material) • Identify the most significant changes from ancient to modern source and discuss those changes in light of the context in which each source was created • Interpret those changes in terms of how they affect the meaning of the myth and how they came about in the first place Humanities 1.

  38. Research Question • How have myths changed over time? Humanities 2.

  39. Outcomes • compare and contrast a selection of primary sources (art) • Students begin by finding primary sources (art works, music, scripts, opera) and background information on artists. • Google has images, but no provenance information. • CAMIO has images, plus provenance and usage rights information. Humanities 3.

  40. Outcomes • identify the most significant changes...in light of the context in which each source was created. Students build on the learning acquired by finding background information on a time period/place Humanities 4.

  41. Outcomes • identify the most significant changes...in light of the context in which each source was created. Students place a myth in the cultural context in which it’s being used or re-told Humanities 5.

  42. Outcomes • compare and contrast a selection of primary sources (music) Students listen to a symphony to identify the dynamism of the myth and interpret its significance Humanities 6.

  43. Summary • The U of T Mississauga Library provides access to thousands of digital and interactive resources for a variety of active and concept-based learning activities. • These resources can be utilized to promote both student engagement and the embedding of IL standards and outcomes.

  44. ALA/ACRL Characteristics of Programs of Information Literacy that Illustrate Best Practices Category 10: Assessment/Evaluation Assessment/evaluation of information literacy includes program performance and student outcomes and: for program evaluation: • establishes the process of ongoing planning/improvement of the program; • measures directly progress toward meeting the goals and objectives of the program; • integrates with course and curriculum assessment as well as institutional evaluations and regional/professional accreditation initiatives; and • assumes multiple methods and purposes for assessment/evaluation: formative and summative, short-term and longitudinal; http://www.ala.org/ala/acrl/acrlstandards/characteristics.htm

  45. ALA/ACRL Characteristics of Programs of Information Literacy that Illustrate Best Practices Category 10: Assessment/Evaluation (cont’d) Assessment/evaluation of information literacy includes program performance and student outcomes and: for student outcomes: • acknowledges differences in learning and teaching styles by using a variety of appropriate outcome measures, such as portfolio assessment, oral defense, quizzes, essays, direct observation, anecdotal, peer and self review, and experience; • focuses on student performance, knowledge acquisition, and attitude appraisal; • assesses both process and product; • includes student-, peer-, and self-evaluation; http://www.ala.org/ala/acrl/acrlstandards/characteristics.htm

  46. How are we teaching/Who are we reaching? • Reflective teaching practices • Teaching portfolios • Sharing with colleagues and course instructors • Evaluation and assessment • Student focus groups • Inventory of outreach & teaching • How are you reaching students? How many? • Who are current campus partners? • Who are potential campus partners? • Who will keep these relationships going? • As a group where are you teaching? • Horizontally and vertically

  47. IL Program Development: Planning Part 2 • Assessment • standardized assessments (ETS, SAILS, JMU) • creation, use and reflection of assessments (background knowledge probe, muddiest point, observation, dialogue) • instruction database

  48. National standardized tools • iSkills™ (aka Information and Communication Technology (ICT) Literacy Assessment), developed by the Educational Testing Service. $35.00 US per student http://www.ets.org/ Measures all 5 ACRL Standards. Two test options: Core and Advanced. Computerized, task-based assessment in which students complete several tasks of varying length, i.e., not multiple choice. Intended for individual and cohort testing. 75 minutes to complete. • Standardized Assessment of Information Literacy Skills (SAILS), developed by Kent State University Library and Office of Assessment. It is also endorsed by the Association of Research Libraries. $3.00 US per student (capped at $2,000), but we can also administer it ourselves for free. https://www.projectsails.org/ Measures ACRL Standards 1, 2, 3, 5. Paper or computerized, multiple-choice. Intended for cohort testing only. 45 questions, 35 minutes to complete. • Information Literacy Test (ILT), developed by James Madison University (JMU Libraries and Center for Assessment and Research Studies) http://www.jmu.edu/icba/prodserv/instruments_ilt.htm Measures ACRL Standards 1, 2, 3, 5. Computerized, multiple-choice. Intended for cohort and individual testing. 60 questions, 50 minutes to complete. NPEC Sourcebook on Assessment: http://nces.ed.gov/pubs2005/2005832.pdf

  49. ETS: Advanced Level – Access • http://www.ets.org/Media/Products/ICT_Literacy/demo2/index.html
