
Student Learning Objectives Assessment Writing Institute July 16-20, 2012






Presentation Transcript


  1. Student Learning Objectives Assessment Writing Institute, July 16-20, 2012, Monroe 2 – Orleans BOCES

  2. Agenda • Setting the Context • Overview of Reform Agenda • What are SLOs? • Who Needs an SLO? • Assessment Development • Review of NYS Test Development Process • Regional Test Development Process • Choosing an Item Format • How to Write Multiple Choice Questions • How to Write Constructed Response Questions

  3. Agenda (continued) • Overview of Assessment Platform (LinkIt!) • Item Review Guidelines • Test Administration • Test Validation • Review of Format for Subject Area Writing Sessions • Set Expectations

  4. Regents Reform Agenda [Slide graphic: College and Career Ready Students at the center, supported by Highly Effective Teachers, Highly Effective School Leaders, Statewide Standards-Based Curriculum, Demanding Assessments, and Fair & Rigorous Accountability] • Implementing Common Core standards and developing curriculum and assessments aligned to these standards to prepare students for success in college and the workplace • Building instructional data systems that measure student success and inform teachers and principals how they can improve their practice in real time • Recruiting, developing, retaining, and rewarding effective teachers and principals • Turning around the lowest-achieving schools

  5. Components of the New APPR (as of 3/5/12; subject to revision) • Student growth on State assessments or comparable measures: 20 points (25 points once an approved value-added measure applies) • Locally selected measures of student achievement: 20 points (15 points with value-added) • Other measures of teacher effectiveness (the remaining 60 points): Individual/Peer Observation*, Student/Parent Feedback*, Student Work*, Teacher Artifacts* * Please refer to the Summary of Revised APPR Revisions 2012-13: http://engageny.org/wp-content/uploads/2012/03/nys-evaluation-plans-guidance-memo.pdf
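The subcomponent points on this slide combine into a 100-point composite. A minimal sketch of that arithmetic, assuming the 20/20/60 split shown (shifting to 25/15/60 once value-added applies); the function and variable names are illustrative, not NYSED's:

```python
# Illustrative sketch of how APPR subcomponent points might combine into a
# 100-point composite, using the weights on the slide: 20/20/60, shifting
# to 25/15/60 once an approved value-added measure is in place.

def composite_score(growth_pts, local_pts, other_pts, value_added=False):
    """Each argument is the points earned within its subcomponent's cap."""
    growth_cap, local_cap = (25, 15) if value_added else (20, 20)
    other_cap = 60
    assert 0 <= growth_pts <= growth_cap
    assert 0 <= local_pts <= local_cap
    assert 0 <= other_pts <= other_cap
    return growth_pts + local_pts + other_pts

# Example: 14/20 growth, 15/20 local, 50/60 other -> 79 of 100 points.
print(composite_score(14, 15, 50))
```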

  6. Key Messages for Student Learning Objectives SLOs name what students need to know and be able to do by the end of the year. SLOs place student learning at the center of the conversation. SLOs are a critical part of all great educators’ practice. SLOs are an opportunity to document the impact educators make with students.

  7. Key Messages for SLOs continued… SLOs provide principals with critical information that can be used to differentiate and target professional development, and focus supports for teachers. The SLO process encourages collaboration within school buildings. School leaders are accountable for ensuring all teachers have SLOs that will support their district and school goals.

  8. SLO Framework

  9. Who needs an SLO? Teacher 1: Those who have a State-provided growth measure and are not required to have an SLO. Teacher 2: Those who have a State-provided growth measure but are still required to have an SLO, because less than 50% of their students are covered by the State-provided growth measure. Teacher 3: Those who are required to have an SLO and do not have a State-provided growth measure.

  10. Required SLOs: Reference Guide Please see the “Required SLOs: Reference Guide” for NYSED’s rules for teachers who have SLOs for State Growth.

  11. Test Your Knowledge: State Provided Growth Measure or SLO? For each teacher below, decide whether a State-provided growth measure or an SLO applies. • 5th Grade Common Branch Teacher • 8th Grade ELA Teacher • Elementary Art Teacher: two 2nd grade Art sections with 20 students each; two 4th grade Art sections with 25 students each; one 5th grade Art section with 30 students • 7th Grade Math and Science Teacher: two 7th grade Math sections with 30 students each; two 7th grade Science sections with 25 students each; one Advanced 7th grade Science section with 20 students • High School CTE Teacher: 150 students across 5 sections of Agricultural Science (all use the same final assessment) • 8th Grade Science Teacher: one 8th grade Science section with 30 students; four 8th grade Advanced Science sections with 28 students each

  12. Test Your Knowledge: State Provided Growth Measure or SLO? (Answers) • 5th Grade Common Branch Teacher: State-provided growth (SGP/VA) • 8th Grade ELA Teacher: State-provided growth (SGP/VA) • Elementary Art Teacher: SLO (1 SLO for the 4th grade Art sections; 1 SLO for the 2nd grade Art sections) • 7th Grade Math and Science Teacher: SLO (1 SLO for 7th grade Math, which will receive a State-provided growth SGP; 1 SLO for 7th grade Science) • High School CTE Teacher: SLO (1 SLO for the Agricultural Science sections) • 8th Grade Science Teacher: SLO (1 SLO for the 8th grade Advanced Science sections)
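The pattern behind this answer key is the coverage rule in NYSED's Reference Guide (slide 10): group sections by course, then assign one SLO per course, largest enrollment first, until more than half of the teacher's students are covered. A minimal sketch of that rule; the function name and data layout are illustrative, not NYSED's:

```python
# Sketch of the coverage rule behind the answer key above: group sections
# by course, then add one SLO per course (largest enrollment first) until
# more than 50% of the teacher's students are covered.

def courses_needing_slos(sections):
    """sections: list of (course_name, enrollment) tuples."""
    totals = {}
    for course, n in sections:
        totals[course] = totals.get(course, 0) + n
    all_students = sum(totals.values())
    covered, slo_courses = 0, []
    for course, n in sorted(totals.items(), key=lambda kv: -kv[1]):
        if covered * 2 > all_students:   # already over 50% covered
            break
        slo_courses.append(course)
        covered += n
    return slo_courses

# The Elementary Art Teacher from the table above:
art = [("2nd Grade Art", 20), ("2nd Grade Art", 20),
       ("4th Grade Art", 25), ("4th Grade Art", 25),
       ("5th Grade Art", 30)]
print(courses_needing_slos(art))   # ['4th Grade Art', '2nd Grade Art']
```

With 50 + 40 = 90 of 120 students covered, the 5th grade section needs no additional SLO, matching the answer key.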

  13. Test Development: NYS Process

  14. Regional Test Development Process • Review of Test Specifications: Item Formats (MC/CR), Numbers, and Item Coding Strategy (align to NY CCLS & Depth of Knowledge, e.g. Bloom) • Write Items aligned to Test Specs • Review Items: Item Coding; Item Structure; Style; & Bias/Sensitivity Review • Accept/Reject/Revise Items • Input Items into LinkIt

  15. Regional Test Development Process (continued) • Design Test Forms • Create Test Form Item Map (Combination of Test Specifications and Item Maps create formal Test Blueprint) • Select/Design Rubrics for CR Items • Create Uniform Administration Protocols • Collect Data for Validation
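The item coding strategy on slide 14 implies each item should carry, at minimum, its format, standard alignment, and cognitive-demand tags. One way to record that metadata is sketched below; the field names are illustrative and are not LinkIt's schema:

```python
# A possible record for the item coding described on slide 14: format,
# the CCLS standard targeted, and cognitive-demand tags (Bloom level and
# Webb DOK). Field names and the example values are illustrative.

from dataclasses import dataclass

@dataclass
class ItemCoding:
    item_id: str
    item_format: str      # "MC" or "CR"
    ccls_standard: str    # e.g. "RL.5.2"
    bloom_level: str      # e.g. "Analyze"
    webb_dok: int         # Webb's Depth of Knowledge, 1-4

item = ItemCoding("ELA5-017", "MC", "RL.5.2", "Analyze", 2)
print(item)
```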

  16. Choosing An Item Format • The most efficient and reliable way to measure knowledge is with MC formats. • The most direct way to measure a skill is via performance, but many mental skills can be tested via MC with a high degree of proximity (a statistical relation between CR and MC items measuring an isolated skill). If the skill is critical to the ultimate interpretation, CR is preferable to MC.

  17. Choosing An Item Format • When measuring a fluid ability or intelligence, the complexity of such human traits favors complex, high-inference CR item formats (Haladyna, 1999).
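The "proximity" mentioned on slide 16 is an empirical question: if students' scores on MC items and a CR item targeting the same isolated skill correlate highly, the cheaper MC format captures much of the same construct. A minimal sketch of that check; the score lists are invented for illustration:

```python
# Estimating "proximity" by correlating students' scores on MC items and a
# CR item that target the same skill, via the standard Pearson r formula.

from math import sqrt

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

mc_subscores = [3, 5, 2, 4, 5, 1, 4]   # points on MC items for one skill
cr_scores    = [2, 4, 1, 3, 4, 1, 3]   # rubric scores on a CR item, same skill
print(round(pearson_r(mc_subscores, cr_scores), 2))   # high r -> MC suffices
```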

  18. Choosing An Item Format: Conclusions about Criterion Measurement (Haladyna, 1999)

  19. Choosing An Item Format: Conclusions about Criterion Measurement (Haladyna, 1999)

  20. Best Practices for Designing and Grading Assessments • Read Best Practices for Designing and Grading Exams (2005) and share three things you learned with your table group.

  21. Conventional Multiple Choice Questions Three parts: stem, correct answer, and distractors (Haladyna, 1999). • Stem: the stimulus for the response; it should provide a complete idea of the problem to be solved in selecting the right answer. The stem can also be phrased in a partial-sentence format. Whether the stem appears as a question or a partial sentence, it can present a problem that has several right answers, with one option clearly being the best of the right answers.

  22. Conventional Multiple Choice Questions Three parts: stem, correct answer, and distractors • Correct Answer: the one and only right answer; it can be a word, phrase, or sentence. • Distractors: wrong answers. Each distractor must be plausible to test-takers who have not yet learned the content the item is supposed to measure. To those who have learned the content, the distractors are clearly wrong choices. Distractors should resemble the correct choice in grammatical form, style, and length. Subtle or blatant clues that give away the correct choice should be avoided.

  23. Conventional Multiple Choice Questions Design: Review MC question types on State assessments and other large-scale assessments in use, e.g. AP. Review question formats for various grades and subjects. Also review Common Core Sample Questions at http://www.p12.nysed.gov/apda/common-core-sample-questions/ • Review the Haladyna handout, Guidelines for MC Item Writing, and discuss with a partner. • Read How Can We Construct Good Multiple-Choice Items? (Cheung & Bucat, 2002) and review with a partner.

  24. Steps to Writing Multiple Choice Questions • Using the content objectives, narrow the focus of the potential question. • Is it about the main idea? (easier) • Is it about a significant detail? (more difficult) • Is it inferential? (very challenging) • Write the stem (question). It should be: • A complete concept • Clear and concise • Reflective of the main idea or a significant detail • Each item assesses only one standard

  25. Steps to Writing Multiple Choice Questions • It should not: • Be obvious or answerable with common prior knowledge • Depend on a single word • Be written in the negative • Include “all of the above,” “none of the above,” or “a or b”

  26. Steps to Writing Multiple Choice Questions • For Multiple Choice Questions: • Develop 4 responses • Write the correct answer first (the key) • Scan the text for possible distractors and develop three wrong answers • All responses should be parallel in construction • Be equal in length (or two shorter, two longer) • Be phrased positively • Be mutually exclusive
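The dos and don'ts on slides 24-26 can be applied as a mechanical first pass before human review. A rough sketch of such a check; the rules encoded, thresholds, and names are all illustrative:

```python
# A rough lint pass over a draft MC item, automating checks from the two
# slides above: four responses, no "all/none of the above" or "a or b",
# stem not written in the negative, options roughly parallel in length.

BANNED_OPTIONS = {"all of the above", "none of the above", "a or b"}

def lint_mc_item(stem, options):
    problems = []
    if len(options) != 4:
        problems.append("expected 4 responses")
    if any(o.strip().lower() in BANNED_OPTIONS for o in options):
        problems.append("avoid all/none of the above and 'a or b'")
    if " not " in f" {stem.lower()} ":
        problems.append("stem is written in the negative")
    lengths = [len(o) for o in options]
    if max(lengths) > 2 * min(lengths):    # arbitrary parallelism threshold
        problems.append("options are not parallel in length")
    return problems

stem = "Which detail best supports the author's claim?"
options = ["The storm data in paragraph 2", "The title",
           "The map caption", "All of the above"]
print(lint_mc_item(stem, options))   # flags two of the rules above
```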

  27. High & Low-Inference Constructive Response Questions Key Components of CR Items: • Task: a specific item, problem, question, prompt, or assignment • Response: Any kind of performance to be evaluated, including short/extended answer, essay, presentation, & demonstration • Rubric: scoring criteria used to evaluate responses • Scorers: people who evaluate responses (ETS, 2005)

  28. High & Low-Inference Constructed Response Questions

  29. High & Low-Inference Constructed Response Questions (Haladyna, 1999)

  30. Steps to Writing Constructed Response Questions • For Short Response Questions: • Measures the targeted reasoning skill • The task is clearly specified • The question can be answered in the allotted time • Avoid offering choices among several questions • Measures higher-order thinking skills (upper levels of Bloom's Taxonomy)

  31. Assessment Platform:LinkIt!

  32. What is LinkIt? LinkIt! is an assessment and data management platform • A tool to design and store our regional assessments • NWEA item bank available for ELA, Math, Science, and Social Studies

  33. Process for LinkIt • Item Banks and Test Banks are already organized by grade level/content area/course. • One person from each course will have access to input the assessment questions during the institute. (Training will be provided) • Assessments will be reviewed prior to giving districts access.

  34. Item Review Guidelines • Requires content experts (include General Education, Special Education, and ELL expertise) • Item Coding: Content (aligned to the correct learning standard; see the Common Core exemplars) & Cognition (e.g. Bloom's Taxonomy; Webb's Depth of Knowledge)

  35. Bloom’s Taxonomy

  36. Bloom’s Taxonomy

  37. Webb’s Depth of Knowledge (www.wcer.wisc.edu)

  38. Webb’s Depth of Knowledge

  39. Webb’s Depth of Knowledge

  40. Item Review Guidelines • Item Structure (MC: stem, correct answer, & plausible distractors; CR: clearly identified task, content and verb, e.g. analyze, discuss, & rubrics) • Editing (conventions of Standard Written English) • Bias/Sensitivity Review: Joint Standards 7.4

  41. Item Review Guidelines Joint Standards 7.4: “Test developers should strive to identify and eliminate language, symbols, words, phrases, and content that are generally regarded as offensive by members of racial, ethnic, gender, or other groups, except when judged to be necessary for adequate representation of the domain.”

  42. Item Review Guidelines Joint Standards 7.4: Two Issues • Inadvertent use of language that, unknown to the test developer, has a different meaning or connotation in one subgroup than in others. • Settings in which sensitive material is essential for validity, e.g. history tests may include material on slavery or Nazis, and life sciences may test on evolution.

  43. Test Administration Need to create Administration Manuals to establish & document testing protocols • Recommendation: use SED guides as the primary reference; review guides for AP and other large-scale assessment programs as templates. • Link to SED manuals: http://www.p12.nysed.gov/apda/manuals/ • Link to AP: http://professionals.collegeboard.com/testing/ap/test-day/instructions

  44. Test Validation: Data Collection & Analysis • Answer Sheet Design & Scanning Procedures (Work w/BOCES & RIC) • Depth of data collection: student demographics; scores; and item level data, if possible • Recommendation: collect data to parallel Title I disaggregation • Evaluate/Revise • Generate trend data • Review against other data to verify & audit rigor • Create Local Technical Manuals
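If item-level data are collected as recommended, two standard statistics support the evaluate/revise and rigor-audit steps: item difficulty (the p-value, or proportion answering correctly) and item discrimination (the correlation between an item and the total score). A sketch with an invented 0/1 response matrix:

```python
# Two standard item statistics for the validation step above: difficulty
# (proportion correct) and discrimination (point-biserial correlation of
# each item with the total score). The response data are invented.

from math import sqrt

def item_stats(responses):
    """responses: list of per-student lists of 0/1 item scores."""
    n_items = len(responses[0])
    totals = [sum(r) for r in responses]
    mean_t = sum(totals) / len(totals)
    stats = []
    for i in range(n_items):
        scores = [r[i] for r in responses]
        p = sum(scores) / len(scores)                      # difficulty
        cov = sum((s - p) * (t - mean_t) for s, t in zip(scores, totals))
        ss = sqrt(sum((s - p) ** 2 for s in scores))
        st = sqrt(sum((t - mean_t) ** 2 for t in totals))
        disc = cov / (ss * st) if ss and st else 0.0       # discrimination
        stats.append((round(p, 2), round(disc, 2)))
    return stats

data = [[1, 1, 0], [1, 0, 0], [1, 1, 1], [0, 0, 0], [1, 1, 1]]
for i, (p, d) in enumerate(item_stats(data), 1):
    print(f"item {i}: difficulty={p}, discrimination={d}")
```

Very easy or very hard items (p near 1 or 0) and items with low or negative discrimination are the ones to flag for revision.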

  45. Let’s Get Started! How to Write Test Specifications • Identify the standards to be addressed (be sure to include Common Core shifts and standards) • For courses with a NYS exam (past or present), review the NYSED percentages tested for each standard identified. For all others, determine percentages as a group.

  46. Let’s Get Started! How to Write Test Specifications • Determine the types of questions for the exam and the number of questions per item format • Pre-assessments should be only 40 minutes in total length (they can be administered over multiple sessions if needed).
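Turning the agreed percentages into question counts for the blueprint is simple arithmetic. A sketch under the 40-minute pre-assessment constraint; the standards, weights, and item budget below are invented for illustration:

```python
# Converting agreed per-standard weights into item counts for the test
# blueprint. The standards, weights, and 20-item budget are invented; the
# only constraint taken from the slide is the 40-minute total length.

weights = {"RL.5.1": 0.30, "RL.5.2": 0.25, "RI.5.4": 0.25, "W.5.2": 0.20}
total_items = 20   # e.g. 18 MC at ~1.5 min each + 2 short CR fits 40 minutes

counts = {std: round(w * total_items) for std, w in weights.items()}
assert sum(counts.values()) == total_items, "adjust rounding by hand"
for std, n in counts.items():
    print(f"{std}: {n} items")
```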

  47. Process and Procedures for Writing: In Subject Area Groups: • Review and/or develop, and finalize test specifications • Determine Process for writing items • Make sure the items you generate are aligned to NYS standards (CCLS and shifts) • Review items per guidelines • Develop Test Blueprint • Select/Design Rubrics for CR Items • Create Uniform Administration Protocols (See Slides 18-19)

  48. Questions?
