Stakeholders • Need to convince decision makers that library media program enhances school mission • Who are the major stakeholders? Put two major stakeholder groups on your Data Collection Plan, column 1.
Evidence-based practice • Demonstrating outcomes of making and implementing sound decisions in daily work and their impact on organization goals and objectives. Todd, R. Evidence-Based Practice: Findings of Australian Study, 2002-2003.
What is important to each group for decision making? • Brainstorm a list of data important to one of the following groups: • School Board • Administrators - District and Building • Parents and Community Members • Teachers Put this information in column 2.
What library media program data is most valuable? • Demonstrate the difference the LM program makes in: • Content learning • Information literacy skills • Technology skills • Reading • Collaborative planning and teaching • Demonstrate impact of resources How do these match up with stakeholder needs - column 3. Loertscher, D. California Project Achievement.
What data do we collect now? • In column 4, indicate data you already collect to meet stakeholder needs. • Add to this column throughout the rest of this session
Student Learning: Content Learning • Indicators: achievement in coursework • standardized test achievement • critical thinking • student motivation • personal responsibility for learning • independent thinking • interaction with others • Data sources: teacher tests, performance assessments, collaborative rubrics • test item analysis • rubrics, observations, checklists • student or teacher interviews or focus groups
Standardized Tests: WKCE • LMS reinforces skills taught in the classroom • 1998 alignment to standards (match to Terra Nova, Form A items) • Some standards not appropriate for a paper-and-pencil test (e.g., media and research standards) • Grade 10: 1 item matches each of LA Standards E and F • Grade 8: 1 item matches LA Standard F • Grade 4: 1 item matches each of LA Standards E and F Wisconsin Knowledge and Concepts Examinations: Alignment to Wisconsin Model Academic Standards, 1998 (http://www.dpi.state.wi.us/oea/alignmnt.html)
WKCE Sample Question WKCE 8th Grade Reading Sample Question (http://www.dpi.state.wi.us/oea/read8itm.html)
WKCE Sample Question WKCE 10th Grade Social Studies Sample Question (http://www.dpi.state.wi.us/oea/ss8items.html)
Standardized Tests: Reading • Assessment Framework for Reading • Objectives supported by library program: • Analyzing literary and informational text • Evaluating and extending literary and informational text • Use of framework: • Match local curriculum to framework • Engage in discussions on where skills are taught and reinforced • Examine problem areas Assessment Framework for Reading in Grades 3 through 8 and 10, 2005 (www.dpi.state.wi.us/dpi/oea/wkce-crt.html)
Library program support for these objectives: • Provide resources • Story hours • Research projects
Collaborative Rubric: Critical Thinking California Assessment Program 1990, History-Social Science Grade 11, Scoring Guide: Group Performance Task in Herman et al, 1992.
Information and Technology Literacy • Indicators: student mastery of grade-level benchmarks (information literacy skills, technology skills) • independent use of skills • ability to transfer skills to new problems • student spontaneous and persistent use of information literacy skills and inquiry • Data sources: teacher assessment of skills in curricular projects • LMS assessment of information literacy skills in lessons • student self-assessments of skills • 8th grade technology assessment • analysis of standardized test items connected to information literacy skills • student research logs • conferences with students (reflect on work, skills, and benefits) • examination of a sample of student products or portfolios • teacher discussions or interviews on student skills • feedback from teachers and/or LMS at the next level • student interviews, surveys, or focus groups • teacher or LMS observations (checklists)
Effectively interprets and synthesizes information Mastery: Information Skills Rubric Marzano, Pickering, and McTighe. Assessing Student Outcomes. MCREL, 1993, p. 96.
I find meaning in information and then combine and organize information to make it useful for my task. Mastery: Student Information Skills Rubric Marzano, Pickering, and McTighe. Assessing Student Outcomes. MCREL, 1993, p. 122.
Mastery: Information Skills Rubric • Accurately assess the value of information Marzano, Pickering, and McTighe. Assessing Student Outcomes. MCREL, 1993, p. 97.
Mastery: Information Skills Rubric Rubrics for the Assessment of Information Literacy based on the Information Literacy Guidelines for Colorado Students and School Library Media Specialists, 1996 (DRAFT)
Mastery: Online Self-Assessment • Student checklist of skills used during research • AASL Power Learner Survey • Locally developed web form • Tie to database (FileMaker, Access) • Email submission from form using CGI script
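The "locally developed web form" option above can be prototyped without FileMaker or Access; here is a minimal, hypothetical sketch (all table and skill names invented for illustration) that stores checklist responses in a local SQLite database instead of emailing them through a CGI script:

```python
import sqlite3

# Hypothetical sketch: store student research-skills checklist
# responses in SQLite (stands in for the FileMaker/Access back
# end a locally developed web form might post to).
def init_db(path="self_assessment.db"):
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS responses (
               student TEXT,
               skill TEXT,
               used INTEGER  -- 1 = skill used during research, 0 = not
           )"""
    )
    return conn

def record_response(conn, student, checklist):
    # checklist: dict mapping skill name -> True/False
    conn.executemany(
        "INSERT INTO responses VALUES (?, ?, ?)",
        [(student, skill, int(used)) for skill, used in checklist.items()],
    )
    conn.commit()

def skill_usage(conn):
    # Percentage of respondents reporting use of each skill
    return conn.execute(
        """SELECT skill, ROUND(100.0 * AVG(used), 1)
           FROM responses GROUP BY skill"""
    ).fetchall()

if __name__ == "__main__":
    conn = init_db(":memory:")
    record_response(conn, "student1", {"narrowed topic": True, "cited sources": False})
    record_response(conn, "student2", {"narrowed topic": True, "cited sources": True})
    print(skill_usage(conn))
```

Aggregating in the database this way gives the LMS per-skill usage rates on demand, rather than a pile of emailed form submissions to tally by hand.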
Mastery: Online Assessments • Locally created test • Discovery School example (http://school.discovery.com/quizzes31/eileenschroeder/Research.html) • College online assessments • Cal Poly Pomona • ETS’s ICT Literacy Assessment (being tested) • Raritan Valley Community College • Cabrillo College
Mastery: Technology Skills Assessments • 8th grade tech assessment • NETS Online Technology Assessment • NCREL / NETS for Students: Extended Rubric • Bellingham Technology Self-Assessments
Observations • Observe small groups and different groups • Use checklists for consistency • Observe in different subject areas • Be focused and limited in scope • Observe what can’t be measured in product • Options: • Several observations over short period (deep view) • Single observations on regular basis (breadth) • Partner with others to do observations Improve Your Library: A Self-Evaluation Process for Secondary School Libraries and Learning Resource Centres. Department for Education and Skills.
Creating Surveys • What do you really want to know? • Who will have the answer? • Are the questions and instructions clear and not open to multiple interpretations? • Are the questions and instrument as brief as possible? • Will answers be selected from options (yes/no, ranking, rating) or open-ended? • Structure • Start with a straightforward question • Move from general to specific • Group questions by topic - use subheadings • Are embarrassing or leading questions excluded? • Do questions ask for personal opinion? Is this wanted? Examples: • Self-assessment checklist on skills • Self-assessment of technology skills • Survey Monkey survey on independent use A Planning Guide to Information Power: Building Partnerships for Learning. AASL, 1999, and Improve Your Library: A Self-Evaluation Process for Secondary School Libraries and Learning Resource Centres. Department for Education and Skills.
Interviews / Focus Groups • Make purpose clear • Script questions but adapt language • Have follow-up questions ready • Record answers (tape or by hand) but don’t let this interfere with your attention to respondent • Select interview site free from interruptions • Get a range of students / teachers (users and non-users, grade levels, abilities, genders, ethnicity, subject areas) • May be more useful to interview students in groups • May get more honesty if LMS does not do interviews Improve Your Library: A Self-Evaluation Process for Secondary School Libraries and Learning Resource Centres. Department for Education and Skills.
Teacher Interview Questions: Use • Do students appear confident in working in the library? • Are the students self-motivated and able to work independently, or do they need assistance in locating the information they need? • Do the students appear to choose methods of working best suited to the information-seeking task?
Interview versus Survey • Interview • Extended, open-ended answers • Adaptive questions • Reach small number but in more depth • Survey • Closed questions - range of possible answers known • Can use branching • Can reach larger number of people • Easier to conduct and tabulate
Reading • Indicators: choice to read voluntarily • enjoyment of reading • amount read (voluntarily and as part of curriculum) • access to reading materials (suitably challenging and varied selections) • impact on reading comprehension • Data sources: for choice to read, student surveys, interviews, or focus groups, reader self-assessments, snapshot of reader advisory; for amount read, reading inventories (pre/post), student reading logs, circulation statistics and ILL requests, tracking involvement in reading incentive activities; for access, collection mapping; for comprehension, analysis of library involvement in teacher unit plans for reading, teacher surveys, interviews, or focus groups, standardized or local reading test scores, Accelerated Reader / Reading Counts points
Reading Habits • Power Reader Survey (AASL) • KMMS Reading Inventory Online • Independent Reading Rubric (Cornwell) • Print version • Online survey • Reading Log
Collaboration • Schedules • Collaborative planning sheets • Prepared bibliographies • Unit plans • Collaboration with teachers • Time spent or frequency of collaboration with teachers • Number and dispersion of teachers collaborating • Level of collaborative activity and LMS support • Gather resources for unit • Provide lesson ideas to meet student needs • Integrate information and technology literacy skills in curriculum • Teach information or technology skills • Quality of learning experience: Integration of information and technology literacy skills • Types of assignments - Higher level thinking • Teacher use of an information problem solving model • More use of resources • Impact on content learning and information skills • Level of student engagement • Post-unit reflections • Interviews, surveys or focus groups • Unit and/or lesson plans • Assessment of information literacy skills & content knowledge • Curriculum maps
Planning Sheets Stacy Fisher and Jane Johns. Milton Middle School.
Post-Unit Review
Unit title: ___ Timeframe for unit: ___ Teacher: ___ # of students: ___
What worked well?
Suggestions for improvement:
Time spent on teaching information literacy / technology:
Information & technology skills / standards learned:
From both the LMS’s and the teacher’s point of view, was the unit enhanced by collaboration? Yes / No Why?
Was the unit successful enough to warrant doing it again? Yes / No Why?
How well was the unit supported by: (5=excellent, 4=above average, 3=average, 2=below average, 1=poor)
                       The collection   The web resources
Diversity of formats   5 4 3 2 1        5 4 3 2 1
Recency                5 4 3 2 1        5 4 3 2 1
Number of items        5 4 3 2 1        5 4 3 2 1
Reading level          5 4 3 2 1        5 4 3 2 1
Technology             5 4 3 2 1        5 4 3 2 1
What materials / technology will we need if we are planning the unit again?
Attach a list of resources used and/or found useful.
Adapted from Loertscher and Achterman (2003). Increasing Academic Achievement through the Library Media Center, p. 17.
Log sheets Stacy Fisher and Jane Johns. Milton Middle School
Tracking Collaborative Units • Impact! • Collaboration profile • Activities • Hours spent • Learning venues • Difficulty level of units • Content area profile • Resource profile • Research skills profile (can track 3-9 skills) • Collaboration timeline
Resources: Actual and Perceived • Range, appropriateness, level, and amount of resources for curricular needs and student interests • Organization, accessibility and use of resources, space, and technology by staff and students • In LMC, classroom, over network, from home • During and outside school hours • Circulation of resources • Use of online resources • Staff expertise and availability • Collection mapping tied to curriculum • Library and lab sign-up schedules • Post-unit assessment of resources • Post-unit student assessment • Circulation statistics • Logs of online resource use • Interviews or focus groups • Satisfaction surveys
Use Tracking: Day Sample Val Edwards. Monona Grove High School.
Use Tracking: Quarterly Sample Val Edwards. Monona Grove High School.
Data Collection Plan • What is the most important data in your school? • Input and output • Triangulation • Where is this data available? From whom? • How are you going to collect it? • Events • Instruments • What is the timeline for data collection? • Who will be responsible? • What resources are needed to collect data? • Who will test out the data collection instruments? • How will the results be analyzed? By whom? • How will the results be used? • To whom and how will the results be communicated?
Tips • Keep it simple • Minimum amount of information to demonstrate impact • Merge in daily routines • Identify where to best spend minimal time to be effective in collecting data • Be systematic • Use different types of evidence • Use both objective and subjective data • Consider samples of data • Collect data at opportune events • Plan for analysis when developing data collection
Data Collection Plan • Come up with at least one type of data to collect in fall that would be important to major stakeholder group • Fill in columns 4, 5, 6, 7
Data Analysis • Statistics • Frequency distribution • Mean, median, mode • Standard deviation • Change over time (pre/post) • Subgroup analysis • Trends • Ratios • Charts and graphs
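The statistics listed above can all be computed with Python's standard library; a minimal sketch follows (the checkout counts are invented sample data, standing in for circulation or survey numbers collected before and after a reading-promotion effort):

```python
from collections import Counter
from statistics import mean, median, mode, stdev

# Invented sample: books checked out per student, pre/post
pre  = [2, 3, 3, 5, 1, 4, 3, 2]
post = [4, 5, 3, 6, 2, 5, 4, 4]

# Frequency distribution: how many students fall at each count
freq = Counter(post)
print("frequency:", dict(freq))

# Central tendency and spread
print("mean:", mean(post))                 # 4.125
print("median:", median(post))             # 4.0
print("mode:", mode(post))                 # 4
print("stdev:", round(stdev(post), 2))     # 1.25

# Change over time (pre/post comparison)
print("mean change:", mean(post) - mean(pre))  # 1.25
```

The same few lines work for subgroup analysis: filter the list by grade level or gender first, then recompute, and the pre/post difference becomes a trend point when repeated each quarter.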
Consider when Analyzing Data • Opportunity, reality, perception • Is the analysis in-depth and comprehensive? • Are trends identified, as well as strengths and weaknesses? • Has analysis been done by appropriate sub-groups? • Does the data provide the “big picture”? • Are there comparisons to similar schools or benchmarking studies? • Are graphic overviews provided? Fitzpatrick (1998). Program Evaluation: Library Media Services
Presenting Results • Appropriate audience • Principal • District administration • Board • Parents / community • Frequency of presentation • Quarterly report • Annual report • Special event • Format of presentation • Memo • Formal report • Brochure • Oral presentation (with or without media) • Mass media (letter to the editor, mailing, webpage)
Consider when Presenting Results • Does the report highlight factors important to the audience? • Does it tie to the mission and goals of the school and library program? • Does the report emphasize outputs? • Do graphic depictions show relationships? • Does the executive summary provide a clear description? • Is the language appropriate for the audience? Does it avoid jargon? • Does it provide plans for the future? • Does it build on previous years’ reports and activities? Fitzpatrick (1998). Program Evaluation: Library Media Services