
DHS English Department Professional Development



Presentation Transcript


  1. DHS English Department Professional Development Sept 20 8:00-11:00 What are we measuring? How do we measure?

  2. Norms • As a department we will: • Promote the exchange of multiple perspectives and ideas • Work with an eye towards implementation/product • Be present and engaged (cell phones, computers, etc. only as necessary) • Show respect for others through our actions and words

  3. Learning Targets • By the end of the meeting, I will be able to: • Communicate what elements compose an effective assessment • Generate criteria for effective assessment • Evaluate my own assessments • Set a plan for reviewing my own assessments

  4. What we generated in Aug.

  5. Where are we? And are we going the right way?

  6. Let’s see where we are: Assess how? • I can translate my learning targets for students and purposes for assessment into dependable assessments that yield accurate results. I am confident of the following: • I can identify key standards of assessment quality in commonsense, understandable terms • I can develop high-quality selected response/short answer assessments (multiple choice, t/f, matching) • I can develop high-quality extended written response assessments • I can develop high-quality performance assessments • I can develop high-quality personal communication-based assessments

  7. Assess how? Continued: I can select among assessment types based on target type and purpose. I can sample appropriately and avoid sources of bias and distortion.

  8. Assess what? • I can clearly describe the learning targets I want my students to hit. I have done the following: • Outlined in writing the subject matter content knowledge my students are to master • Differentiated content students are to learn outright from content they are to learn to retrieve later through the use of references • Defined in writing the specific patterns of reasoning students are to master • Articulated in writing the performance skills I expect students to learn to demonstrate (where it is the actual learning that counts)

  9. Assess what? Continued… Defined key attributes of products I expect students to create. Thought through and defined academic dispositions (school-related attitudes) I hope my students develop. Met with other teachers across grade levels to merge my expectations into a vertically, horizontally and diagonally articulated curriculum.

  10. Map for the year We are here

  11. Research to support: Marzano Marzano (2003), in his review of research on effective practices in schools, notes the inextricable connection between standards and assessment. He reports that repeated studies reveal the positive impact that formative assessment has on student achievement. Schmoker and Marzano (1999) report that achievement test scores have increased when schools have linked standards to carefully designed assessments.

  12. Research: Bloom Bloom (1984) notes that careful assessments are essential to tracking student progress and to making adjustments in the delivery of instruction.

  13. Research: Popham, Wiggins, and McTighe Popham (2001, 2003), Wiggins (1990, 1993), and Wiggins and McTighe (1998, 1999) report evidence that student achievement is not only measured by carefully designed assessment tools but also affected by the assessments that guide teachers and set the learning agenda for students.

  14. Why do we need to do this?

  15. Looking at assessment purpose Read the scenarios with your small group. Identify each assessment’s purpose. List the various purposes.

  16. What are the criteria for effective assessment? With your small group, generate a list of criteria for what defines an effective assessment. Use the classifications from the scenarios. Align the criteria to Stiggins’ 5 keys of quality assessment.

  17. Practice with evaluation Read the scenarios with your small group. Apply the criteria: What are the two assessments measuring? What are the strengths of each assessment? What are the weaknesses?

  18. Where do I fall on the continuum? Using our criteria and Stiggins’ rubric, evaluate one of your assessments. You will sit at a table with colleagues who are focusing on the same skill: reading, writing, or speaking and listening. You will evaluate your own assessment using Stiggins’ rubric.

  19. Inventory of assessment targets: Red: not taught, not assessed. Yellow: explicitly taught, not explicitly assessed. Green: explicitly taught and explicitly assessed. Where do you line up?

  20. What is next? In terms of skills, what are we consistently and commonly assessing? How should every ______________ be measured or assessed? How can we provide consistent assessment for a student during her four years at DHS?

  21. Housekeeping Attendance IC WERCS sharing sheet Phase theory workshop feedback Conference reflection Brandon

  22. Before you go… Complete the white paper and give it to Beth.
