
Assessment Literacy for the Middle Level Educator


Presentation Transcript


  1. Assessment Literacy for the Middle Level Educator Jennifer Borgioli Learner-Centered Initiatives, Ltd.

  2. Organizational Focus Assessment to produce learning… and not just measure learning.

  3. Do you honestly want to know what X exactly is? Is your life going to be improved by momentarily knowing what X is? No. Absolutely not. This whole problem is a conspiracy against hardworking American students. Let me tell you, solving for X right now is not going to stop the recession. In fact, it’s not going to do anything. And another thing. When have you ever had to know what X is in your long esteemed professional career? Exactly. This is a futile attempt for “educators” in this district to boast of their students’ success rate. I am going to go the rest of my life not knowing what X is. Because what is X when you really think about it? A letter, the spot, two lines crossing each other. I don’t think anyone will ever really know what X truly is because the essence of X is beyond our brain potential. In conclusion, Harry S. Truman’s middle name was just the letter S, not an actual name. Now that is a letter that’s actually being utilized. See, you learned something, and it was not because of this logarithm. The End.

  4. Talking about the science of our profession does not discredit the art.

  5. “Less than 20% of teacher preparation programs contain higher level or advanced courses in psychometrics (assessment design) or instructional data analysis.” Inside Higher Education, April 2009

  6. Implications Minimize interruptions. Make them worthy.

  7. To be assessment savvy….

  8. Essential Element #2 2f. The degree to which the middle-level educational program includes ongoing Standards-based assessments

  9. Assessment Definition: The strategic collection of evidence of student learning. (Martin-Kniep, 2009) Assessment : test :: dog : pit bull. A thing and a process.

  10. “And while the exams may be a thoroughly vetted, sophisticated means of measurement, they are an inadequate, constricted form of expression.” Michael Guerriero, “Seattle’s Low-Stakes Testing Trap,” The New Yorker, March 12, 2013. http://www.newyorker.com/online/blogs/newsdesk/2013/03/seattles-low-stakes-testing-trap.html#ixzz2NWx8bv00

  11. What are the implications of chasing the pineapple?

  12. “Standards-Based Assessment” RL.05.06a: Recognize and describe how an author’s background and culture affect his or her perspective. (NYS)

  13. Assessment considerations

  14. 1999 APA Testing Standards

  15. “The higher the stakes of an assessment’s results, the higher the expectation for the documentation supporting the assessment design and the decisions made based on the assessment results.” (Section 13)

  16. Traditional Assessment Performance-Based Assessment

  17. Performance-Based Assessments (PBAs) A performance task is an assessment that requires students to demonstrate achievement by producing an extended written or spoken answer, by engaging in group or individual activities, or by creating a specific product. (Nitko, 2001)

  18. PBAs versus Traditional (Liskin-Gasparro, 1997; Mueller, 2008)

  19. PBAs versus Traditional (Liskin-Gasparro, 1997; Mueller, 2008)

  20. Validity = Accuracy

  21. How do we ensure alignment and validity in assessment? Degrees of Alignment

  22. Goal is Best Fit

  23. If you want to assess your students’ ability to perform, design, apply, interpret. . . . . . then assess them with a performance or product task that requires them to perform, design, apply, or interpret.

  24. I cannot claim my assessment is valid if I do not have some type of blueprint

  25. Minimum

  26. Basic

  27. Articulated

  28. How many? 3–5: 3–5 standards in a PBA (reflected in rows in the rubric); 3–5 items per standard on a traditional test
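The 3–5 guideline above lends itself to a quick mechanical check of a test blueprint. A minimal sketch in Python (the `check_blueprint` helper and the standard codes are illustrative, not part of the presentation):

```python
def check_blueprint(blueprint):
    """Flag departures from the 3-5 guideline.

    blueprint maps each assessed standard to its planned item count
    (or rubric rows, for a performance-based assessment).
    """
    issues = []
    # Guideline: a single assessment samples 3-5 standards.
    if not 3 <= len(blueprint) <= 5:
        issues.append(f"{len(blueprint)} standards assessed (guideline is 3-5)")
    # Guideline: each standard is sampled by 3-5 items.
    for standard, n_items in blueprint.items():
        if not 3 <= n_items <= 5:
            issues.append(f"{standard}: {n_items} items (guideline is 3-5)")
    return issues
```

For example, `check_blueprint({"RL.05.06a": 4, "RL.05.01": 3, "RL.05.02": 2})` would flag the under-sampled third standard.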

  29. Reliability = Consistency

  30. I cannot claim my assessment is reliable if I do not have statistics to support my claim

  31. Reliability Indication of how consistently an assessment measures its intended target and the extent to which scores are relatively free of error. Low reliability means that scores cannot be trusted for decision making. Necessary but not sufficient condition to ensure validity.
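One statistic commonly used to support a consistency claim is Cronbach's alpha, an internal-consistency estimate computed from per-item score variances. A minimal sketch in plain Python (the function name and data layout are illustrative; a real analysis would use a statistics package):

```python
def cronbach_alpha(scores):
    """Estimate internal consistency from item-level scores.

    scores: one row per student, one column per item.
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    """
    k = len(scores[0])  # number of items

    def variance(xs):
        # Sample variance (n - 1 denominator).
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [variance([row[i] for row in scores]) for i in range(k)]
    total_var = variance([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)
```

Scores of 1/0 per item work directly: four students answering three items as `[[1, 1, 1], [0, 0, 0], [1, 1, 0], [0, 1, 0]]` yield an alpha of 0.75.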

  32. Three Types of Measurement Error: subject effects, test effects, environmental effects

  33. Subject Effects

  34. Others… fatigue, sleep deprivation, illness, disability

  35. [Diagram: testing fatigue, test familiarity, and bias influence the observed score]

  36. In what ways do we knowingly (or unknowingly) contribute to measurement error?

  37. Test Effects

  38. Examples: not enough space for a response; confusing items; typos; misleading (or lacking) directions; scorer inconsistencies
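Scorer inconsistency in particular can be quantified: Cohen's kappa compares two scorers' observed agreement to the agreement expected by chance alone. A minimal sketch in Python (the function name and ratings are illustrative, not from the presentation):

```python
def cohen_kappa(scorer_a, scorer_b):
    """Chance-corrected agreement between two scorers' ratings."""
    n = len(scorer_a)
    # Proportion of students both scorers rated identically.
    observed = sum(a == b for a, b in zip(scorer_a, scorer_b)) / n
    # Agreement expected by chance, from each scorer's rating frequencies.
    labels = set(scorer_a) | set(scorer_b)
    expected = sum(
        (scorer_a.count(lab) / n) * (scorer_b.count(lab) / n)
        for lab in labels
    )
    return (observed - expected) / (1 - expected)
```

A kappa of 1.0 means perfect agreement; values near 0 mean agreement no better than chance, a sign the rubric or scorer training needs work.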

  39. 10. Format the item vertically instead of horizontally. From “A Review of Multiple-Choice Item-Writing Guidelines for Classroom Assessment” by Haladyna, Downing, and Rodriguez

  40. 21. Place choices in logical or numerical order. Students should not have to hunt to find an answer. Answers should be provided in a logical, predictable pattern.
