
Presentation Transcript


  1. Mathematics Assessment Project. Assessment in the Service of Learning: the roles and design of high-stakes tests. Hugh Burkhardt, MARS: Mathematics Assessment Resource Service, Shell Center, University of Nottingham and UC Berkeley. Oakland Schools, October 2012

  2. Structure of this talk
  • A word on the Common Core
  • The roles of assessment
  • Tasks and tests
  • Task difficulty and levels of understanding
  • SBAC (and PARCC)
  • Computer-based testing
  • Testing can be designed to serve learning.

  3. Content: Getting Richer. Practices: Much deeper and richer.

  4. The Practices in CCSS-M:
  • Make sense of problems and persevere in solving them.
  • Reason abstractly and quantitatively.
  • Construct and critique viable arguments.
  • Model with mathematics.
  • Use appropriate tools strategically.
  • Attend to precision.
  • Look for and make use of structure.
  • Look for and express regularity in repeated reasoning.

  5. The Roles of Assessment
  The traditional view: tests are “just measurement”, “valid” and “reliable” (if a little strange looking).
  Reality: tests have three roles:
  • Measuring a few aspects of math-related performance
  • Defining the goals by which students and teachers are judged
  • Driving the curriculum
  This implies a huge responsibility on those who test.

  6. High-stakes assessment implicitly:
  • Exemplifies performance objectives
  • For most teachers, and the public, test tasks are assumed to exemplify the standards – so they effectively replace them
  • Determines the pattern of teaching and learning
  • FACT: most teachers ‘teach to the test’ – a perfectly reasonable “bottom line”
  Taking the standards seriously implies designing tests that meet them: “tests worth teaching to” that enable all students to show what they can do.

  7. WHAT YOU TEST IS WHAT YOU GET

  8. Mathematical Practices “Proficient students expect mathematics to make sense. They take an active stance in solving mathematical problems. When faced with a non-routine problem, they have the courage to plunge in and try something, and they have the procedural and conceptual tools to carry through. They are experimenters and inventors, and can adapt known strategies to new problems. They think strategically”. CCSSM How far do our current tests assess this? Not far?

  9. Tasks and Tests

  10. Levels of mathematical expertise
  It is useful to distinguish task levels, showing increasing emphasis on mathematical practices:
  • Novice Tasks – short items, each focused on a specific concept or skill, as set out in the standards (cf. ELA spelling, grammar)
  • Apprentice Tasks – rich tasks with scaffolding, structured so that students are guided through a “ramp” of increasing challenge
  • Expert Tasks – rich tasks in the form in which they might naturally arise, in the real world or in pure mathematics (cf. ELA writing)

  11. Task examples

  12. Some Expert Tasks. Tasks that are not predigested; problems as they might arise in the world outside the math classroom, or in really doing math.

  13. Expert Tasks: Traffic Jam
  1. Last Sunday an accident caused a traffic jam 11 miles long on a two-lane highway. How many cars do you think were in the traffic jam? Explain your thinking and show all your calculations. Write down any assumptions you make. (Note: a mile is approximately equal to 5,000 feet.)
  2. When the accident was cleared, the cars drove away from the front, one car from each of the lanes every two seconds. Estimate how long it took before the last car moved.
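A back-of-the-envelope calculation is sketched below in Python; the figure of roughly 25 feet of road per stopped car is an assumption added for illustration, not part of the task, and any similar estimate would do.

```python
# Rough estimate for the Traffic Jam task (the per-car spacing is assumed, not given):
# - 11 miles of jam, taking 1 mile as roughly 5,000 ft, per the task's note
# - each stopped car occupies about 25 ft of lane (car length plus gap), an assumption
# - two lanes of traffic

jam_length_ft = 11 * 5_000          # about 55,000 ft of jammed road
feet_per_car = 25                   # assumed space per stopped car
lanes = 2

cars = jam_length_ft // feet_per_car * lanes
print(f"Estimated cars in the jam: {cars}")                     # about 4,400 cars

# Part 2: one car leaves each lane every 2 seconds, so a lane of ~2,200 cars
# takes roughly 2,200 * 2 seconds to start moving from front to back.
seconds_until_last_moves = cars / lanes * 2
print(f"Time before the last car moves: ~{seconds_until_last_moves / 3600:.1f} hours")
```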

  14. Airplane turnaround. How quickly could they do it?

  15. Ponzi Pyramid Schemes
  Max has just received this email:
  From: A. Crook
  To: B. Careful
  Do you want to get rich quick? Just follow the instructions below carefully and you may never need to work again:
  1. Below there are 8 names and addresses. Send $5 to the name at the top of this list.
  2. Delete that name and add your own name and address at the bottom of the list.
  3. Send this email to 5 new friends.

  16. Ponzi continued
  • If that process goes as planned, how much money would be sent to Max?
  • What could possibly go wrong? Explain your answer clearly.
  • Why do they make Ponzi schemes like this illegal?
  This task involves: formulating the problem mathematically; understanding exponential growth; knowing it can’t go on for ever, and why.
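A minimal sketch of the intended arithmetic, assuming everyone complies perfectly (which the follow-up questions are designed to challenge): Max's name enters at the bottom of the 8-name list and rises one place per forwarding round, while the number of copies in circulation multiplies by 5 each round.

```python
# Idealized Ponzi arithmetic (assumes perfect compliance; in reality the chain
# collapses long before this point, which is the point of the task).
NAMES_ON_LIST = 8     # Max's name enters at position 8
FANOUT = 5            # each person forwards to 5 friends
PAYMENT = 5           # dollars sent to the name at the top

# After 8 rounds of forwarding, Max's name has reached the top of the list.
# The number of people holding that version of the email is 5**8,
# and each of them sends Max $5.
senders_to_max = FANOUT ** NAMES_ON_LIST            # 390,625 people
money_to_max = senders_to_max * PAYMENT             # $1,953,125

# People in the tree rooted at Max (Max, his 5 friends, their friends, ...):
people_in_chain = sum(FANOUT ** r for r in range(NAMES_ON_LIST + 1))   # 488,281
print(money_to_max, people_in_chain)
```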

  17. PYTHAGOREAN TRIPLES

  18. PYTHAGOREAN TRIPLES
  (3, 4, 5), (5, 12, 13), (7, 24, 25) and (9, 40, 41) satisfy the condition that the natural numbers (a, b, c) are related by c² = a² + b².
  • Investigate the relationships between the lengths of the sides of triangles which belong to this set.
  • Use these relationships to find the numerical values of at least two further Pythagorean Triples which belong to this set.
  • Investigate rules for finding the perimeter and area of triangles which belong to this set when you know the length of the shortest side.
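The slide leaves the pattern for students to find; one relationship consistent with the four listed triples, offered here only as an illustration, is a = 2n + 1, b = 2n(n + 1), c = b + 1, sketched below.

```python
# The four listed triples all fit the pattern a = 2n + 1, b = 2n(n + 1), c = b + 1
# (an observation about the examples; the slide itself does not state a formula).
def triple(n: int) -> tuple[int, int, int]:
    a = 2 * n + 1
    b = 2 * n * (n + 1)
    return a, b, b + 1

for n in range(1, 7):
    a, b, c = triple(n)
    assert a**2 + b**2 == c**2            # confirm it really is a Pythagorean triple
    print((a, b, c), "perimeter:", a + b + c, "area:", a * b // 2)

# n = 1..4 reproduce (3, 4, 5), (5, 12, 13), (7, 24, 25), (9, 40, 41);
# n = 5, 6 give two further members: (11, 60, 61) and (13, 84, 85).
# In terms of the shortest side s = 2n + 1:
#   b = (s**2 - 1) / 2,  c = (s**2 + 1) / 2
#   perimeter = s + b + c = s**2 + s = s * (s + 1)
#   area      = a * b / 2 = s * (s**2 - 1) / 4
```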

  19. Which sport? (a task from “the literature”, 1982). Which sport will give a graph like this? Describe in detail how your answer fits the graph – as in a radio commentary.

  20. Table tiles
  Maria makes square tables, then sticks tiles to the top. Square tables have sides that are multiples of 10 cm. Maria uses quarter tiles at the corners and half tiles along edges.
  • How many tiles of each type are needed for a 40 cm x 40 cm square?
  • Describe a method for quickly calculating how many tiles of each type are needed for larger, square table tops.
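The original slide shows a picture of the tiling that is not reproduced in this transcript. The sketch below assumes the usual diamond arrangement for this task, with whole tiles set point-up so that half tiles fall along the edges and quarter tiles sit in the corners; treat the formulas as one possible reading of the missing diagram, not the published answer.

```python
# Tile counts for a table of side 10*n cm, assuming the diamond arrangement:
# whole tiles (set point-up, 10 cm across the diagonals) in the interior,
# half tiles along each edge, quarter tiles in the four corners.
def tile_counts(side_cm: int) -> dict[str, int]:
    n = side_cm // 10
    return {
        "whole": n * n + (n - 1) * (n - 1),
        "half": 4 * (n - 1),
        "quarter": 4,
    }

counts = tile_counts(40)
print(counts)   # {'whole': 25, 'half': 12, 'quarter': 4} under this assumption

# Sanity check: the pieces should cover the table top exactly.
# Under this assumption each whole tile covers 50 cm^2 (10 cm by 10 cm diagonals).
area = counts["whole"] * 50 + counts["half"] * 25 + counts["quarter"] * 12.5
assert area == 40 * 40
```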

  21. Apprentice tasks
  • Expert tasks with added scaffolding to:
    • ease entry
    • reduce strategic demand
  • Ramp of difficulty within the task, with increasing:
    • complexity
    • abstraction
    • demand for explanation
  Balanced Assessment in Mathematics (BAM) tests are of this kind – complementing state tests (novice).

  22. Apprentice tasks: design
  Guide students through a ramp of challenge. “Patchwork” gives:
  • Multiple examples that ease understanding
  • Specific numerical cases to explore – counting
  • A helpful representation – the table
  Only then does it:
  • Ask for a generalization – rule, formula
  • Present an inverse problem
  A step in growing expertise: “climbing with a guide”.

  23. Task Difficulty
  The difficulty of a task depends on various factors: complexity, unfamiliarity, technical demand, and the autonomy expected of the student.
  • Expert Tasks fully involve the mathematical practices and all four aspects, so must not be too technically demanding.
  • Apprentice Tasks involve the mathematical practices at a modest level, with little student autonomy.
  • Novice Tasks present mainly technical demand, so this can be “up to grade”, including concepts and skills just learnt.

  24. Levels of understanding
  • Imitation
  • Retention
  • Explanation – chains of reasoning (2nd sentence?)
  • Adaptation – requires non-routine problems
  • Extension – offer opportunities
  Jean Piaget

  25. The Practices in CCSS-M:
  • Make sense of problems and persevere in solving them.
  • Reason abstractly and quantitatively.
  • Construct and critique viable arguments.
  • Model with mathematics.
  • Use appropriate tools strategically.
  • Attend to precision.
  • Look for and make use of structure.
  • Look for and express regularity in repeated reasoning.

  26. These haven’t been a focus of testing … but they will be – maybe

  27. Smarter Balanced Assessment Consortium http://www.k12.wa.us/smarter/ (Just google SBAC)

  28. Here are some of the headlines.

  29. SMARTER Balanced “content spec”
  Claim #1 (Concepts & Procedures): “Students can explain and apply mathematical concepts and interpret and carry out mathematical procedures with precision and fluency.”
  Claim #2 (Problem Solving): “Students can solve a range of complex well-posed problems in pure and applied mathematics, making productive use of knowledge and problem solving strategies.”
  Claim #3 (Communicating Reasoning): “Students can clearly and precisely construct viable arguments to support their own reasoning and to critique the reasoning of others.”
  Claim #4 (Modeling and Data Analysis): “Students can analyze complex, real-world scenarios and can construct and use mathematical models to interpret and solve problems.”
  PARCC so far seems less specific; mainly CCSSM content standards.

  30. Total Score for Mathematics
  • Concepts and Procedures score: 40%
  • Problem Solving score: 20%
  • Communicating Reasoning score: 20%
  • Mathematical Modeling score: 20%

  31. So: a large part of the exam will be devoted to things we haven’t tested before. But – there is THE CAT.

  32. Computer-based testing
  Promises of cheap, instant, adaptive testing. Great strengths and, even after 70 years, weaknesses.
  Key questions: for rich tasks, does CBT provide
  • Effective handling of the testing process?
  • Better ways for presenting tasks?
  • A natural medium for students to work on math?
  • Effective ways to capture a student’s reasoning?
  • Reliable ways to score a student’s response?
  • Effective ways for collecting and reporting results?

  33. Computer-based testing: summary
  Best way to manage high-stakes testing. Fine on its own for Novice level tasks (short items).
  Expert and Apprentice tasks essentially involve:
  • long chains of autonomous student reasoning
  • sketching and doodling: diagrams, numbers, equations
  This needs:
  • image capture (paper, scan, or ? off tablet screen)
  • human scoring (on screen) – responses are too diverse for the computer
  Can improve testing in various ways; for analysis see the Educational Designer lead article in Issue 5, out soon.

  34. SBAC test structure
  Three components planned:
  • CAT: computer-adaptive on-line test
  • “set of rich constructed response items”
  • “a classroom-based performance task” (up to 2 periods)
  Task types: extended examples in the content spec.
  PARCC also has an “end of course” CAT + “periodic assessments” during the year – their nature open to “creative input” by educators and vendors.

  35. SBAC and PARCC: some impressions and comments on plans and challenges. Some seem desperate to stick with computer-based testing.

  36. Here’s a sample PARCC “modeling” item.
