
Assessing Your Instruction Program


Presentation Transcript


  1. Assessing Your Instruction Program Texas Library Association, April 7, 2005 Michelle Millet, Information Literacy Coordinator, Trinity University, San Antonio, TX; Yvonne Nalani Meulemans, Science Librarian, California State University, San Marcos, CA

  2. Overview • History of Assessment and Information Literacy • Best Practices • Assessment at CSU-San Marcos • Assessment at Trinity • Programmatic assessment • Other opportunities

  3. Assessment of IL: a brief history • 1. Higher education assessment movement • 2. Strategic planning and TQM adoption • 3. Evolution of information literacy

  4. Activity 1 • What does assessment mean to you?

  5. Best practices documents • ACRL Best Practices Initiative: Characteristics of Programs of Information Literacy that Illustrate Best Practices, 2003 • Describes program evaluation and student outcomes as areas of assessment. • Lists examples of assessment methods. • Standards for Libraries in Higher Education • Covers all areas, but includes sections detailing assessment in libraries as well as what instruction programs ought to include.

  6. 9 Principles of Good Practice for Assessing Student Learning • From the American Association for Higher Education (AAHE) • Overarching principles to guide assessment • 6. Assessment fosters wider improvement when representatives from across the educational community are involved. • 8. Assessment is most likely to lead to improvement when it is part of a larger set of conditions that promote change.

  7. What to assess? • Student learning outcomes Refers to measuring what a student actually learned in the course. Pre-tests and post-tests Annotated bibliographies Classroom assessment techniques (Angelo and Cross)
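A pre-/post-test comparison like the one above can be summarized with a simple calculation. A minimal sketch, assuming hypothetical scores from a short quiz given before and after an instruction session (the scores and scale here are invented for illustration):

```python
# Hypothetical pre-/post-test scores for one class, e.g. a 10-question
# quiz given before and after a one-shot instruction session.
pre_scores = [4, 5, 3, 6, 5]
post_scores = [7, 8, 5, 8, 9]

def average_gain(pre, post):
    """Mean per-student improvement from pre-test to post-test."""
    return sum(post_s - pre_s for pre_s, post_s in zip(pre, post)) / len(pre)

print(average_gain(pre_scores, post_scores))  # mean gain across the class
```

Even this coarse number, tracked session over session, gives a concrete starting point for the "small steps" described later in the deck.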

  8. What to assess? continued • Student perceptions Measures the perceptions that students have about the course content, the instructor’s knowledge and teaching ability, and what the students feel they learned from the course. Student evaluation forms Surveys Focus groups

  9. What to assess? continued • Progress towards goals and objectives Refers to what some know as ‘inputs and outputs.’ Statistics on number/duration of courses, instruction sessions Number of hits to a web site Student follow-up appointments, drop in visits, emails

  10. Small steps can take you far • Consult your institution’s/library’s mission/values/goals documents. • Is IL articulated in campus/library documents? • Do you already keep basic statistics on students reached? Time spent preparing classes? • This can be valuable evidence. • What kind of in-class activities do you already do? • Collecting student work can be assessment.

  11. Activity 2. • Identify one “small step” that you would like to take next.

  12. Assessment in the ILP @ CSUSM • CSUs are teaching-centered, primarily undergraduate campuses. • CSUSM has approximately 6,000 FTE students. • Librarians are faculty and work with instructors in virtually every department.

  13. Guided notes: measuring student learning • Students get a basic outline which they fill in during the class. • Students have a 'product' for their time with you. • You can route it back to the instructor to illustrate what students learned. • Sample handout, Psychology 230: Research Workshop. Research Tools Discussed: PsycInfo (database of psych literature; some full-text; use the thesaurus); Buros' MMY (a source of psych measures; see the psych research guide for the e-version). Using PsycINFO: Thesaurus (identify the right words; "breaking up" = "relationship termination"); Boolean (stress AND college; teen* OR adolescen*; * gets plurals). 'Empirical studies' and 'literature reviews': ES have experiments (empirical = observed; critique 3 for the assignment!); LR analyze what is known about a topic, a compilation of ES. Questions I have: What are Yvonne's office hours?

  14. Research chart: measuring student learning

  15. Student evaluation forms: measuring student perception

  16. Assessment or Evaluation? • What do we mean by assessment anyway? • Assessment focuses on the end result • Evaluation focuses on the overall process

  17. Student-centered Learning [Diagram: Students at the center, linked to Instruction/Pedagogy, Learning Outcomes, Assessment, and the Instruction Program]

  18. Assessment and Collaboration • Teaching faculty want to know… • Assessment in the classroom • Use it as outreach to the teaching faculty • Share results and use as dialog for assignment design • Build an integrated information literacy program within a distinct discipline • Build tailored first-year experience programs

  19. Activity 3. • Who are your allies? Name three prospective collaborators.

  20. Trinity University and IL • Information Literacy is relatively new • Challenges to change • 2400 FTE, primarily liberal arts campus

  21. Trinity’s Answer • Build an IL program and assess along the way • What is an IL program? • Assessment was internally and externally focused • The end result is a program, fully integrated into the curriculum

  22. Where do I begin? • How do you know what you are doing/building is effective? • What is most important to administrators? • How can you help those who may be working against you? • Example: FYS and Lunches at Trinity (other people’s assessment data!)

  23. First Step • Where are you now? • State of Information Literacy and Instruction • What do you want to do next? • Short term and long term goals. Know the difference. • Go for the most impact!

  24. Activity 4. • What is your goal? What do you want to achieve next?

  25. Low Effort/High Impact • Creating programmatic assessment takes time and work • Begin with low effort/high impact items • Start tracking hours spent preparing for class • Start tracking NEW classes coming in for instruction

  26. Best Practices as a Guideline for Assessment • Great tool to use as a guideline • Collaboration/Integration Goal of Document • Check off what you’ve done • Check what you can do now (short term) • Which characteristics will be your guide for the future?

  27. Activity 5. • Is your “small step” LE/HI? • If not, do you have an alternative idea?

  28. Creating a Plan • From your long term and short term goals • Internal focus • (teaching, hours, # of classes) • External focus • (who are the active faculty, creating outcomes and standards, long-term studies) • Focus on pieces: It WILL come together!

  29. Activity 6. • One short term goal: • One long term goal:

  30. Outcomes for your Plan • Example: Assessing Program Effectiveness (External) • Outcome: Encourage broad faculty participation in instruction in order to ensure information literacy reaches across the curriculum. • Assessment: Collect data concerning overall number of classes and departments using instruction, looking for increases in numbers of classes or departmental participation.
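The data collection this outcome calls for can start as a simple tally. A minimal sketch, assuming a hypothetical log with one (year, department) entry per instruction session taught (the log entries here are invented for illustration):

```python
from collections import Counter

# Hypothetical log: one (year, department) entry per instruction session.
sessions = [
    (2004, "Psychology"), (2004, "Biology"),
    (2005, "Psychology"), (2005, "Biology"),
    (2005, "History"), (2005, "Psychology"),
]

def sessions_per_year(log):
    """Total instruction sessions taught each year."""
    return Counter(year for year, _dept in log)

def departments_per_year(log):
    """Number of distinct departments reached each year."""
    reached = {}
    for year, dept in log:
        reached.setdefault(year, set()).add(dept)
    return {year: len(depts) for year, depts in reached.items()}

print(sessions_per_year(sessions))     # sessions taught per year
print(departments_per_year(sessions))  # distinct departments per year
```

Comparing these two counts year over year is exactly the "looking for increases in numbers of classes or departmental participation" the outcome describes.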

  31. Outcomes Continued • Assessing Program Effectiveness (Internal) • Outcome: Hold workshop for librarians about integrating outcomes assessment in classes in order to help them be more effective in their teaching. • Assessment: Have each librarian create three outcomes for every instruction session…Collect data from their assessments.

  32. Activity 7. • Write one outcome using either your long term or short term goal. • What do you want to know? Why? • Don’t forget “in order to”

  33. Program Assessment Can: • Improve Teaching • Improve Student Learning • Provide Validation 

  34. Five Questions • What do you want the instruction program to be able to do? • What does the program demonstrate or exhibit if it is meeting the outcomes? • How will you gather your data or evidence? • How will you determine if you have achieved your outcomes? • How will the discussion or evaluation of the data occur? Who will be involved?

  35. Questions Translate Into… • Outcome • What does an effective program do? • Indicator • What will be happening? • Assessment • How will the data be collected? • Criteria • How will you know? Immersion 2004

  36. Other Opportunities • SAILS • Standardized Assessment of Information Literacy Skills • ETS ICT Project • LibQUAL+

  37. An Ongoing Culture of Assessment “Assessment is an ongoing process aimed at understanding and improving student learning.” Tom Angelo, AAHE Bulletin, 1996 ACRL Immersion 2004

  38. Assessment Plans: Individual or Group Work

  39. Session Evaluation

  40. Thank You !!
