
Achievement Exam Analysis



Presentation Transcript


  1. Achievement Exam Analysis • ED3604E/F, February 5/9, 2010 • Adapted from materials created by Gerry Varty, Director of Instruction, Wolf Creek Public Schools

  2. Achievement Exams • Administered by Alberta Learning yearly to all students in Grades 3, 6, and 9 • Purpose: • Serves as a ‘dipstick’ measurement • Gauges general performance of a large population • Measures effectiveness of curriculum and initiatives designed to effect learning • Should be used to improve teaching and learning

  3. What they are not: • An effective measure of individual student ability • An effective measure of school performance in the short term • A ‘gateway’ or filter to direct individual student promotion • “High-stakes” tests. Nobody fails, nobody gets fired, and no school loses funding or autonomy based on student performance on these exams.

  4. What they are: • Data, through which we can see trends indicating the effectiveness of initiatives (such as AISI) designed to improve student learning. • General measures of Program content and delivery over time • “Large-Scale” assessments – enough students write them to create a basis for consistent appraisals. • Only a piece of the larger assessment puzzle

  5. Those distinctions are important… • By using PATs for the purposes they were designed for: • Government can use them to determine the general level of attainment over time • Schools can use them to gauge the effectiveness of program enhancements, or to serve as a springboard for discussion and professional development • Parents can use them to get an idea of how their kids perform (relative to the large group) on that specific sample of outcomes.

  6. Those distinctions are important… • Using PATs for purposes they were not designed for creates issues such as: • Schoolworks, the Fraser Institute, etc. ‘ranking’ schools (or teachers) based on aggregate PAT data • PAT data being used to determine grade or program placement for individual kids • This data is only one part of the story.

  7. What to look for: • Performance over time that indicates strengths or weaknesses in Programs • Where there are strengths, look at those to seek out Best Practices which might be possible to replicate in other areas • Where there are weaknesses, they can be addressed through collaboration and Professional Development initiatives.

  8. Strengths of Achievement Testing • Reliable, valid data. Every student in the province writes the same exam. If others are achieving where we are not, we know that we can improve in that area. • Good fit to many curricular objectives. While they do not tell the whole story, they do tell a useful part of it. • The trick is to find the useful parts; sometimes, the very act of searching for those is an opening for great collaboration and sharing of ‘best practices’ among teachers.

  9. PATs aren’t everything… • But they’re not nothing, either. • Perhaps the most important function of PATs is to help us figure out which questions we should be asking… • ‘Cause any way you cut it, if you teach the curriculum well, the PATs should take care of themselves. When they don’t, we have a problem.

  10. Weaknesses of Achievement Testing • You can’t measure everything that’s important with these. Don’t try. • Uber-security: not being able to sit down with the test and look at (and discuss) how the specific outcomes are measured. • Multiple-choice format. • Questionable statistical practices: cut scores, equating, over-analysis…

  11. This data is from one school • Did the school show any changes over this time? • What conclusions can you draw, or what possibilities can you suggest? • Does anything need deeper analysis, or call for more information? • What else do you need to know about this school?

  12. What do the numbers mean? • In their raw form, not very much. • Consider these 2 schools: • Sunshine School: • 80% acceptable, 20% excellent • Gloomy Academy: • 80% acceptable, 20% excellent • Which one is better?

  13. Analyze the Data • Distribution for Sunshine School:

  14. Analyze the Data • Distribution for Gloomy Academy:

  15. Hmm… they don’t look the same anymore… • They both have the same statistics: 80% Acceptable, 20% Excellent • But they’re so different

  16. The problem with ‘cut scores’ • Is that the range is too big. • Cut scores tell us that a student with a 54% is the same as one who scored 86%… acceptable. • Cut scores don’t tell you what kids are good at, or help you improve your instruction.
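
To make the Sunshine/Gloomy comparison concrete, here is a minimal sketch in Python (not from the original deck). The actual score distributions behind slides 13 and 14 are not reproduced in this transcript, so the two score lists below are invented; the cut scores (50% for the Acceptable Standard, 80% for the Standard of Excellence) follow the usual PAT convention.

```python
# Two hypothetical schools with very different score distributions
# that produce identical cut-score summaries.

def summarize(scores, acceptable=50, excellent=80):
    """Report the share of students at or above each PAT standard."""
    n = len(scores)
    at_acceptable = sum(s >= acceptable for s in scores) / n
    at_excellence = sum(s >= excellent for s in scores) / n
    return f"{at_acceptable:.0%} acceptable, {at_excellence:.0%} excellent"

# Invented distributions: one clustered just above the cut,
# one spread across the whole range.
sunshine = [30, 45, 52, 54, 55, 56, 58, 60, 85, 90]
gloomy   = [10, 20, 50, 55, 60, 65, 70, 79, 80, 98]

print("Sunshine:", summarize(sunshine))  # Sunshine: 80% acceptable, 20% excellent
print("Gloomy:  ", summarize(gloomy))    # Gloomy:   80% acceptable, 20% excellent
```

Both invented schools report the same summary, yet one clusters just above the bar while the other spans the whole range; that shape is exactly the information the cut scores throw away.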

  17. There are 3 types of Assessment: • Assessment of Learning • Assessment for Learning • Assessment for Teaching • PATs can serve two of these (the first and the third). Trying to make them fit the second purpose is a mistake; neglecting them for the third is an even greater error. If they can’t inform our teaching, then they are of no value to us.

  18. Assessment of Learning (slide diagram: Assessment, Evaluation, Grading/Reporting) • Is what we used to call Summative Assessment. • The process of measuring the Learner’s progress against the pre-determined standard for knowledge or performance • Reflects a judgment regarding the quality or adequacy of student achievement as determined by curriculum standards • Intent is to describe the student’s standing relative to course outcomes, at this point in time.

  19. Proficiency & Outcomes • ‘Power Outcomes’ are those critical outcomes that demand mastery. • Assessing for that takes more time and is more difficult. • You can’t do a good job of that on a PAT … but good assessment during the year should yield concurrent results on the PAT.

  20. Clearing the bar… • In order to ‘get’ credit for an outcome on the PAT, you only have to answer at the difficulty level of the question. • From that, we can infer ONLY that you can also answer easier questions… for all we know, you may also be able to answer harder ones. • That fact makes ‘equating’ unsupportable; it is based on the premise that the student could not ALSO get the harder question.
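
A small sketch of that inference in Python (invented for illustration, not from the deck): a correct answer on an item establishes a floor on what the student demonstrated, never a ceiling.

```python
# Hypothetical item-response records: (difficulty, answered_correctly).
# A correct answer shows performance at at least that difficulty; it is
# silent about harder items the student never saw.

def demonstrated_floor(responses):
    """Return the hardest difficulty answered correctly: a lower bound
    on what the student can do, not an upper bound."""
    correct = [difficulty for difficulty, ok in responses if ok]
    return max(correct) if correct else None

student = [(0.3, True), (0.5, True), (0.7, True)]  # no harder items attempted
print(demonstrated_floor(student))  # 0.7 -- says nothing about 0.8+ items
```

On the slide’s argument, equating goes wrong precisely when it treats that 0.7 floor as if it were also a ceiling.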

  21. Assessment for Learning (slide diagram: Assessment, Evaluation, Grading/Reporting) • Is what we used to call Formative Assessment. • PATs don’t do that. One shot. End of year. No do-overs, no learning from mistakes. • But…

  22. For kids, this isn’t necessarily a bad thing. • Sometimes, our system of using tons of accumulated marks throughout the whole year works against kids. • Purpose is everything. What are we trying to assess, what outcomes are we watching, what do we look for as evidence? • What poor assessments ‘hang around’ to mess up the final totals for the year?

  23. How would you grade Jared? • You all remember Jared … he’s the Subway guy. • Jared lost a lot of weight by eating Subway sandwiches and walking every day.

  24. Jared … Before and After

  25. How would you grade Jared? • What grade would Jared get in your ‘lifestyle, nutrition, and DPA’ class? (Assume the ‘before’ and ‘after’ pictures coincide with the timelines for your class.)

  26. In most of our schools, Jared • … would not get a great mark. Probably no more than 50%. • Our pre-occupation with calculated ‘averages’ would ensure that his low early marks still counted against him. After all, he only got to be the way he is now in the last month. • We would have to ‘average in’ all of his earlier overweight scores.
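
A quick arithmetic sketch in Python of how averaging buries end-of-year mastery. The deck gives no numbers, so the monthly marks below are invented to mirror the before/after photos: low early in the year, high at the end.

```python
# Hypothetical monthly marks for Jared's 'lifestyle, nutrition, and DPA' class.
jared_marks = [20, 25, 30, 35, 45, 55, 65, 75, 85, 95]

average = sum(jared_marks) / len(jared_marks)  # what a calculated average reports
final = jared_marks[-1]                        # where he actually stands now

print(f"Year-long average: {average:.0f}%")  # Year-long average: 53%
print(f"Final standing:    {final}%")        # Final standing:    95%
```

The year-long average lands near the 50% the slide predicts, even though the final month shows the outcome was fully met; that is the gap between averaging accumulated marks and describing a student’s standing “at this point in time” (slide 18).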

  27. Assessment for Teaching • Is another story… • This is formative assessment for us. • Professional Development • Program and curriculum development • What can we do that impacts the performance of our students? • If you do a good job of teaching, the achievement tests take care of themselves.
