
Data Interpretation Workshop


Presentation Transcript


  1. Data Interpretation Workshop 2007

  2. Purposes for the Day
  • Bring context and meaning to the math and reading assessment project results;
  • Initiate reflection and discussion among school staff members related to the math and reading assessment results;
  • Encourage school personnel to judiciously review and use different comparators when judging math and reading assessment results;
  • Model processes that can be used at the school and division level for building understanding of the data among school staff and the broader community; and,
  • Provide an opportunity to discuss and plan around the data.

  3. Agenda
  • Understanding data: sources, categories & uses
  • Saskatchewan Assessment for Learning Program
  • Working with the AFL Reports
  • Standards and Cut Scores
  • Predicting
  • Sharing
  • Analysis of Strengths & Areas for Growth
  • Creating & Testing Hypotheses
  • Action Planning

  4. 32°
  • What might the above piece of data mean?
  • While 32° is data, the meanings you provided were interpretation.
  • All data are meaningless until interpreted.
  Wellman, B. & Lipton, L. (2004). Data driven dialogue. Mira Via, LLC.

  5. A Data-Rich Environment
  Wellman & Lipton (2004) state: “Schools and school districts are rich in data. It is important that the data a group explores are broad enough to offer a rich and deep view of the present state, but not so complex that the process becomes overwhelming and unmanageable.”
  Wellman, B. & Lipton, L. (2004). Data driven dialogue. Mira Via, LLC.

  6. International Data Sources
  • Programme for International Student Assessment (PISA)

  7. National Data Sources
  • Pan-Canadian Assessment Program (PCAP)
  • Canadian Test of Basic Skills (CTBS)
  • Canadian Achievement Tests (CAT3)

  8. Provincial Data Sources
  • Assessment for Learning (AFL)
  • Opportunity to Learn Measures
  • Performance Measures
  • Departmentals

  9. Division Data Sources
  • Division-level rubrics
  • Division benchmark assessments

  10. Local Data Sources
  • Cumulative (cum) folders
  • Teacher-designed evaluations
  • Portfolios
  • Routine assessment data

  11. Nature of Assessment Data
  [Figure: a continuum of assessment data ranging from definitive student evaluations (individual, classroom, school) to indicative system evaluations (division, provincial, national, international).]
  From Saskatchewan Learning. (2006). Understanding the numbers.

  12. Depth and Specificity of Knowledge
  [Figure: assessments arrayed from the individual through classroom, school, division, provincial, and national to international levels; classroom-level assessments provide in-depth knowledge of specific students, while large-scale assessments provide in-depth knowledge of systems but little knowledge of specific students.]
  From Saskatchewan Learning. (2006). Understanding the numbers.

  13. Assessment for Learning is a Snapshot
  • Results from a large-scale assessment are a snapshot of student performance.
  • The results are not definitive. They do not tell the whole story. They need to be considered along with other sources of information available at the school.
  • The results are more reliable when larger numbers of students participate and when aggregated at the provincial and division level, and should be considered cautiously at the school level. Individual student mastery of learning is best determined through effective and ongoing classroom-based assessment. (Saskatchewan Learning, 2007)
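A little arithmetic makes the caution about small numbers concrete: the uncertainty around a reported percentage shrinks as the number of participating students grows. Below is a minimal sketch, using invented participation counts rather than actual AFL figures, comparing the margin of error on the same 70% result at the school, division, and provincial levels.

```python
import math

def standard_error_pct(pct: float, n: int) -> float:
    """Approximate standard error (in percentage points) of an
    observed percentage computed from n students."""
    p = pct / 100.0
    return 100.0 * math.sqrt(p * (1.0 - p) / n)

# Hypothetical participation counts, for illustration only.
for label, n in [("school", 25), ("division", 900), ("province", 12000)]:
    margin = 1.96 * standard_error_pct(70.0, n)  # approximate 95% interval
    print(f"{label:8s} n={n:6d}  70% +/- {margin:.1f} points")
```

With 25 students, the same 70% result carries a margin of roughly +/- 18 percentage points, versus about +/- 1 point for the province, which is exactly why school-level results should be read cautiously.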

  14. Using a Variety of Data Sources
  • Thinking about the data sources available, their nature and the depth of knowledge they provide, how might the information in each impact the decisions you make?
  • What can you do with this data?
  • What is its impact on classrooms?
  • Please refer to the “Using a Variety of Data Sources” template in your handout package as a guide for your discussion.

  15. Local Level Sources of Data
  While international, national and provincial sources of data can provide direction for school initiatives, the data collected at the local level provides the most detailed information regarding the students in classrooms.

  16. Four Major Categories of Data: Demographics
  • Local data: descriptive information such as enrollment, attendance, gender, ethnicity, grade level, etc. Other data can be disaggregated by these demographic variables.
  • Assessment for Learning Opportunity-to-Learn data: home support for learning to read and for learning math.
  Bernhardt, V. L. (2004). Data analysis for continuous school improvement, 2nd edition. Larchmont, NY: Eye on Education.
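To make "disaggregate other data by demographic variables" concrete, here is a minimal pandas sketch. The table, its column names, and its values are hypothetical placeholders, not AFL data.

```python
import pandas as pd

# Hypothetical student-level results; columns are illustrative only.
results = pd.DataFrame({
    "gender": ["F", "M", "F", "M", "F", "M"],
    "grade": [5, 5, 8, 8, 8, 5],
    "pct_correct": [72, 65, 80, 58, 74, 61],
})

# Disaggregate the score by demographic variables: mean percent
# correct and student count for each gender within each grade.
summary = results.groupby(["grade", "gender"])["pct_correct"].agg(["mean", "count"])
print(summary)
```

The same pattern extends to any demographic column (attendance band, ethnicity, etc.), which is how the other three categories of data can be broken down by this one.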

  17. Four Major Categories of Data: Student Learning
  • Local data: describes outcomes in terms of standardized test results, grade averages, etc.
  • Assessment for Learning data: student performance outcomes in Math (5, 8, 20) and Reading (4, 7, 10).
  • Opportunity-to-Learn data: knowing and using reading strategies.
  Bernhardt, V. L. (2004). Data analysis for continuous school improvement, 2nd edition. Larchmont, NY: Eye on Education.

  18. Four Major Categories of Data: Perceptions
  • Local data: provides information regarding what students, parents, staff and community think about school programs and processes. This data is important because people act in congruence with what they believe.
  • Assessment for Learning Opportunity-to-Learn data: preparation for and commitment to learning; persistence.
  Bernhardt, V. L. (2004). Data analysis for continuous school improvement, 2nd edition. Larchmont, NY: Eye on Education.

  19. Four Major Categories of Data: School Processes
  • Local data: what the system and teachers are doing to get the results they are getting; includes programs, assessments, instructional strategies and classroom practices.
  • Assessment for Learning Opportunity-to-Learn data: instruction and learning; availability and use of resources; approaches to problem solving (math).
  Bernhardt, V. L. (2004). Data analysis for continuous school improvement, 2nd edition. Larchmont, NY: Eye on Education.

  20. What Data are Useful & Available?
  • Think about the goals/priorities set within your school and/or school division; they might be achievement, growth, behavioural, etc.
  • Using the supplied template, begin to catalogue the data you already have and the data you need in order to better address the goals that have been set.
  • An example follows on the next slide.

  21. Goal: Students will experience greater success in all subject areas as we focus on reading skills.
  Bernhardt, V. L. (2004). Data analysis for continuous school improvement, 2nd edition. Larchmont, NY: Eye on Education.

  22. Principles of the Saskatchewan AFL Program
  • Cooperation and Shared Responsibility
  • Equity and Fairness
  • Comprehensiveness
  • Continuous Improvement that Promotes Quality and Excellence
  • Teacher Professionalism
  • Authenticity and Validity
  • Honesty and Openness

  23. Principles of the Saskatchewan AFL Program: Read and Connect
  • Everyone at the table reads the first numbered statement.
  • After reading, one person at the table offers an insight or connection they are making to that statement.
  • Repeat the process with all seven statements.
  • At the end, discuss the key ideas and concepts within the principles.

  24. Comparators: Types of Referencing
  • Criterion-referenced: comparing how students perform relative to curriculum objectives, level attribution criteria (rubrics) and the level of difficulty inherent in the assessment tasks. (Tables 8.8 & 8.12)
  • Standards-referenced: comparing how students performed relative to a set of professionally or socially constructed standards. (Figure 8.2c, Table 8.3, Figure 8.4a, Figure 8.6a, and others.)
  From: Saskatchewan Learning. (2006). Understanding the Numbers.

  25. Comparators: Types of Referencing
  • Experience- or self-referenced: comparing how students perform relative to the assessment data gathered by teachers during the school year (e.g., comparing these results to current school data, or to the standards set by the panel).
  • Norm-referenced: comparing how students in a school performed relative to the performance of students in the division, region or project (e.g., tables comparing the school, division and province).

  26. Comparators: Types of Referencing
  • Longitudinal-referenced: comparing how students perform relative to earlier years’ performance of students. (Table 8.1, Figure 8.3 and others.)

  27. What Data are Collected and Reported for Reading?
  • Reading Comprehension Skills: 60-item multiple-choice test (organized by reading strategies). The six categorized reading strategies are:
  • Using Cueing Systems
  • Connecting to Prior Knowledge
  • Making Inferences, Predictions, and Drawing Conclusions
  • Noting Key & Finding Supporting Ideas
  • Summarizing, Recalling, Synthesizing and Organizing Information
  • Recognizing Author's Message & Craft

  28. What Data are Collected and Reported for Reading?
  • Explicit Comprehension: a subset of the 60-item multiple-choice test involving responses to ideas or information stated directly in the text. The answers are “right there” in the text.
  • Implicit Comprehension: a subset of the 60-item multiple-choice test requiring the reader to apply background knowledge to interpret or infer ideas or information in the text. Interpreting vocabulary or visuals and making predictions are forms of inference.
  • Critical Comprehension: a subset of the 60-item multiple-choice test involving responses to ideas and information that require inferences and critical analysis. Looking at the author's purpose and point of view, distinguishing facts from opinions and recognizing persuasive techniques are all components.
  • The numbers of questions in the subsets vary according to grade level.
  • Reader Response: written-response question(s) assessing students' ability to make meaning from text by making connections to personal knowledge or experience (extending and applying new understandings).

  29. Opportunity-to-Learn Measures for Reading
  • Data were collected from students in 3 areas:
  • Preparation and commitment to learn
  • Knowledge and use of reading strategies
  • Home support for reading
  • Data were collected from teachers on 2 classroom-related elements:
  • Availability and use of resources
  • Instruction and learning

  30. What Data are Collected and Reported for Math?
  • Math Content Skills: 40-item multiple-choice test (organized by mathematical strands and linked to curriculum objectives). The five strands vary by grade level.

  31. What Data are Collected and Reported for Math?
  • Applications & Problem Solving
  • Concepts, Procedures & Relationships
  • Challenges (performance on challenges is reported on a 5-level scale)
  • Calculator Skills
  • Computation Skills
  • Estimation Skills

  32. Opportunity-to-Learn Measures in Math
  • Data were collected from students in 4 areas:
  • Preparation and commitment to learn
  • Persistence when experiencing difficulty
  • Home support for learning in general
  • Home support for learning math
  • Data were collected from teachers on 3 classroom-related measures:
  • Availability and use of resources
  • Instruction and learning
  • Approaches to problem solving

  33. Standards
  To help make meaningful longitudinal comparisons in future years, three main processes will be implemented:
  • Assessment items will be developed for each assessment cycle using a consistent table of specifications.
  • The assessment items will undergo 3 rounds of field-testing, one of which is intended to inform the comparability of the two assessments.
  • Standards will be set for each of the assessment items, so that any differences in difficulty between two assessments are accounted for by varying the standards for the two assessments.

  34. Opportunity-to-Learn and Performance Standards
  • In order to establish Opportunity-to-Learn and Performance standards for the 2007 Reading Assessment, three panels were convened (one for each assessed grade), consisting of teachers and post-secondary academics, including Education faculty.
  • The panelists studied each assessment item from the 2007 assessment in significant detail and established cut scores for each of the assessment components.

  35. Cut Scores
  • On page 4 of the detailed reports you will find the cut scores detailing the percentage correct required for students to be classified at one of two levels:
  • Threshold of adequacy
  • Threshold of proficiency
  • Reader Response and Math Challenge scores are presented on a five-level scale (1 = low to 5 = high).
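As a sketch of how a cut score turns a percent-correct result into one of these categories, here is a small classification function. The numeric thresholds are invented for illustration; the real cut scores vary by grade and component and are listed on page 4 of the detailed report.

```python
def classify(pct_correct: float,
             adequacy_cut: float = 50.0,      # hypothetical cut score
             proficiency_cut: float = 75.0):  # hypothetical cut score
    """Map a percent-correct score onto the two-threshold reporting scheme."""
    if pct_correct >= proficiency_cut:
        return "met threshold of proficiency"
    if pct_correct >= adequacy_cut:
        return "met threshold of adequacy"
    return "below threshold of adequacy"

print(classify(82))  # met threshold of proficiency
print(classify(63))  # met threshold of adequacy
print(classify(41))  # below threshold of adequacy
```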

  36. [Image slide; content not captured in the transcript.]

  37. [Image slide; content not captured in the transcript.]

  38. Locating Cut Scores in the Report
  • Turn to pg. 4 in the detailed report for your grade level.
  • You will need to refer to these scores during the following prediction activity.

  39. Predicting

  40. [Chart: AFL Math, percentage of students who met the adequate standard set by Saskatchewan educators.]

  41. Shade in your prediction on the supplied prediction chart: AFL Math, percentage of students who met the adequate standard set by Saskatchewan educators.
  Wellman, B. & Lipton, L. (2004). Data driven dialogue. Mira Via, LLC.

  42. Predicting
  Based on your predictions, create a set of hypotheses for some or all of them. As you create each prediction, identify the underlying assumptions.
  • Prediction: ‘X’ will contain the highest scores. Assumption: we created a common assessment for ‘X’ in 2005.
  • Prediction: students will report higher on geometry because we moved that unit earlier in the year. Assumption: there are fewer classroom interruptions earlier in the year and students have more time to learn the material.
  Write each prediction and its accompanying assumption on the cards provided. Please write legibly.

  43. Sharing
  • Gather the cards together at your table and discuss the predictions and assumptions.
  • Do these statements ring true for everyone at your table? School? Division?
  • Considering all of the predictions, are there any themes or patterns emerging?
  • Why might this be?

  44. Comparisons
  The completed bar graphs are in the Summary Report on page 3.
  • What are you noticing about the data?
  • What surprised you?
  • What other data would you like to see to better inform the results you’ve seen so far?
  Wellman, B. & Lipton, L. (2004). Data driven dialogue. Mira Via, LLC.
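Facilitators who want to recreate the prediction-versus-report comparison electronically could chart both series side by side. A minimal matplotlib sketch follows; the strand labels and every number in it are invented placeholders, not values from the AFL summary report.

```python
import matplotlib.pyplot as plt
import numpy as np

# Hypothetical figures for illustration only.
strands = ["Strand A", "Strand B", "Strand C", "Strand D", "Strand E"]
predicted = [70, 65, 72, 60, 68]   # staff predictions (% meeting standard)
reported = [64, 71, 69, 58, 66]    # figures taken from the summary report

x = np.arange(len(strands))
width = 0.35
plt.bar(x - width / 2, predicted, width, label="Predicted")
plt.bar(x + width / 2, reported, width, label="Reported")
plt.xticks(x, strands, rotation=20)
plt.ylabel("% of students meeting adequate standard")
plt.legend()
plt.tight_layout()
plt.show()
```

Seeing the predicted and reported bars side by side makes the gaps worth discussing stand out immediately.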

  45. Using the Prediction & Comparison Process
  • What are the benefits of approaching data in this manner?
  • Hints

  46. Examining the Report
  • Take a few minutes to look through the entire AFL report. Use the chart below to guide your thinking and conversation.

  47. Designing Interventions
  • Assumptions must be examined because our interventions will be based on them.
  • We must strive to correctly identify the causal factors.
  • Don’t fall in love with any theory until you have other data.
  • Use a strength-based approach to interventions.

  48. Team Action Plan/Fishbone
  Team Action Plan:
  • What are some areas of strength indicated within your data?
  • What are some areas for improvement indicated within your data? Please consider all aspects of both reports, including the Opportunity to Learn Measures.
  Fishbone:
  • At your table, analyze one strength and consider all contributing factors that led to that strength.
  • Consider one area of improvement and transfer those elements from your area of strength (as applicable) that could contribute to improvement in this area.

  49. Focusing on Improvement
  • Thinking of the area of weakness you identified earlier, create a hypothesis that uses some of the positive elements from the area of strength to address an area for improvement.
  • For example: similar to process lists in math, providing detailed strategy lists for student reading will increase the number of strategies used and therefore increase comprehension.
