
Creating PARCC Assessment Items



Presentation Transcript


  1. Creating PARCC Assessment Items English Language Arts Boot Camp Grades 6-12 June 2014

2. Agenda
• Overview of Evidence-Based Selected-Response (EBSR) Items and Guidelines
• Overview of Technology Enhanced Constructed Response (TECR) Items
• Overview of Prose Constructed Response (PCR) Items
• Creating an EBSR item

3. Item Guidelines
• The PARCC Item Guidelines provide important information about how PARCC items and tasks are constructed.
• The information is specific to the assessment but can be used to create units, formative and summative assessment questions, and text-dependent questions for instruction.
Taken from http://www.parcconline.org/sites/parcc/files/Item%20Guidelines%20%2004%2025%202013%20Version%208_0.pdf

4. Task Generation Models
• Only used for the PBA
• Identifies the task focus
• Provides the number of texts, EBSR, TECR, and PCR items
• Provides the standards to be measured in the PCR and the EBSR and TECR items
• Explains the order of the task

5. Each task generation model has a Task Focus. The task focus indicates the purpose for reading and writing that should drive item development for that task. The reading standards measured by the final prose constructed response link directly to this task focus, and the scenario developed for the task must align with it.

Each task generation model is labeled with the type of task (Research Task, Literary Analysis Task, Narrative Writing Task) and the model #. In this case, the label says "Research Task" and the model # is 8B5PBA. The model # is coded so that the first number represents the grade (in this instance, grade 8), the letter reinforces the type of task (A = Literary Analysis Task, B = Research Task, C = Narrative Writing Task; in this instance, a Research Task), and the final number represents the version of the task; in this case, this is the fifth grade 8 research task model developed (see the parsing sketch below).

The number and type of texts tells how many texts the student reads when completing the task. It also tells whether each text is literary or informational and whether it is extended (close in word count to the maximum allowed at a given grade). In the MY and PBA, texts not designated as extended are assumed to be short or medium in length (they may fall between the minimum and the middle of the range of word counts for a given grade). There are three task types (Research, Literary Analysis, and Narrative Writing), and the grade may be any grade 3-11.

Taken from http://www.parcconline.org/sites/parcc/files/Item%20Guidelines%20%2004%2025%202013%20Version%208_0.pdf Handout # 1
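The model # coding described above (grade number, task-type letter, version number, and assessment suffix) can be illustrated with a short parsing sketch. This is only a minimal illustration of the scheme as explained on the slide; the function name, regular expression, and output format are hypothetical and are not part of the PARCC guidelines.

```python
import re

# Hypothetical mapping based on the task-type letters described above.
TASK_TYPES = {"A": "Literary Analysis Task", "B": "Research Task", "C": "Narrative Writing Task"}

def parse_model_code(code: str) -> dict:
    """Parse a model # such as '8B5PBA' into grade, task type, version, and assessment."""
    match = re.fullmatch(r"(\d{1,2})([ABC])(\d+)([A-Z]+)", code)
    if match is None:
        raise ValueError(f"Unrecognized model code: {code}")
    grade, task_letter, version, assessment = match.groups()
    return {
        "grade": int(grade),                   # first number = grade (3-11)
        "task_type": TASK_TYPES[task_letter],  # A/B/C = type of task
        "version": int(version),               # final number = version of the model
        "assessment": assessment,              # e.g., PBA
    }

print(parse_model_code("8B5PBA"))
# {'grade': 8, 'task_type': 'Research Task', 'version': 5, 'assessment': 'PBA'}
```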

6. The task generation model indicates the number and type of prose constructed response items and designates, for each prose constructed response in a task model, the reading standards that are measured by those items. This information is essential for item developers for three reasons: (1) it indicates how many PCR items are to be written; (2) it indicates the writing focal points for the task; and (3) the final PCR item sets the task focus (purpose for reading/writing) and thus the scenario for all tasks generated with the task generation model.

The task generation model also includes information about the number and type of EBSR and TECR items. The most important information in this box is the set of specific reading standards to be measured by the EBSR and TECR items for this task generation model. In this example, the EBSR and/or TECR items generated from this task generation model will elicit evidence of student understanding of RH 3, 5, 6, 7, and 8. The information in this box is also found in the forms specifications document for the grade level.

Taken from http://www.parcconline.org/sites/parcc/files/Item%20Guidelines%20%2004%2025%202013%20Version%208_0.pdf Handout # 1

7. Evidence Statements: Reading Literature, Grades 3-11. Grade, Claim, Standards: RL – Reading Literature

8. Evidence Statements: Reading Literature, Grades 3-11. Standards divided into evidence statements

9. Evidence Statements: Reading Informational Texts, Grades 6-11. Standards: In Grades 6-11, Literacy Standards for Reading History/Social Studies and for Reading Science/Technical are added. RH – Reading History/Social Studies; RST – Reading Science/Technical

10. EBSR Items: 1. Fixed Elements
• Have two or three parts that together serve as a single item. Each part consists of a selected-response question with a minimum of four choices.
• In the first part, students select the correct answer from among a minimum of four choices. In additional parts, students select among a minimum of four choices to demonstrate the ability to locate and/or connect details/evidence from the text that explains or justifies the answer chosen for the first part of the item.
• Ensure the quality of distractors reflects current requirements of state assessment programs in PARCC states.
• Provide ample time for students to read carefully and, in some cases, to re-read passages, giving students time for close analysis and study of the text(s). Each item should be designed to take the average student at that grade level no more than 2 minutes to complete.
• Include sequences of questions that are coherent, reflecting the commitment to unfolding the meaning of complex texts systematically.
• Whenever appropriate, use the actual language of the Standards in the items so that they visibly align.
Taken from http://www.parcconline.org/sites/parcc/files/Item%20Guidelines%20%2004%2025%202013%20Version%208_0.pdf

11. EBSR Items: 2. Variable Elements
• For items with one correct response, four answer choices are requisite. For items with two correct responses (even if the student selects only one), six answer choices are requisite. For items with three correct responses, even if the student selects only one (allowed only in grades 6-11), seven answer choices are requisite.
• Partial credit is allowable and desirable. Items should be designed so that partial credit is granted when students demonstrate evidence of meeting reading standard 1 and one or more additional reading standards.
• For items with one correct response in Part A and one correct response in Part B, there is no partial credit.
• For items with one or more correct responses in Part A and more than one correct response in Part B, there is partial credit. To receive partial credit, students must answer Part A correctly AND select at least one correct response in Part B; this earns the student 1 point. To receive full credit on these items, students must answer both Part A and Part B correctly (a scoring sketch follows below).
Taken from http://www.parcconline.org/sites/parcc/files/Item%20Guidelines%20%2004%2025%202013%20Version%208_0.pdf
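The partial-credit rules above can be expressed as a small scoring function. This is a minimal sketch assuming a two-point EBSR item (2 points for full credit, 1 for partial); the function name, data shapes, and point values are illustrative assumptions rather than PARCC's published scoring rules.

```python
def score_ebsr(part_a_correct: bool, part_b_selected: set, part_b_key: set) -> int:
    """Return 2 (full), 1 (partial), or 0 points for a two-part EBSR item (sketch)."""
    if part_a_correct and part_b_selected == part_b_key:
        return 2  # both parts answered fully correctly -> full credit

    # Partial credit applies only when Part B has more than one correct response:
    # Part A correct AND at least one correct Part B selection earns 1 point.
    if len(part_b_key) > 1 and part_a_correct and (part_b_selected & part_b_key):
        return 1

    return 0

# Example: Part A correct, one of two correct Part B responses selected -> 1 point
print(score_ebsr(True, {"option 4"}, {"option 4", "option 6"}))
```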

12. EBSR Items
• For those EBSR items with multiple correct responses where students must choose only one of these responses, students should be notified that there are multiple correct responses but that they are expected to choose only one. When there is more than one correct response, the number of correct choices needs to be set in boldface for emphasis.
• Students should not be asked to guess the number of correct responses as a means to impose additional rigor/difficulty.
Taken from http://www.parcconline.org/sites/parcc/files/Item%20Guidelines%20%2004%2025%202013%20Version%208_0.pdf

13. EBSR Items: Additional Considerations
• Items can be written with an inference to be drawn in Part A and a requirement for students to find another example of how that inference applies in a different part of the text for Part B.
• Example 1: Part A – What character trait does Character reveal when Character does X? Part B – Which paragraph from the passage best shows additional evidence of this character trait?
Taken from http://www.parcconline.org/sites/parcc/files/Item%20Guidelines%20%2004%2025%202013%20Version%208_0.pdf

  14. EBSR Part A Taken from http://practice.parcc.testnav.com/#

  15. Part A Item Advances The skills of reading carefully, analyzing deeply, making inferences, and applying understanding of a text are essential for college and career readiness. This Evidence-Based Selected-Response question requires students to infer the purpose of a particular segment of a video. The item advances assessment by asking students to analyze the video for a second instance of this same purpose elsewhere in the video, as demonstrated by the inclusion of a Part B. Taken from http://practice.parcc.testnav.com/#

16. Part A Answer Choice Rationales
• Option B is the correct answer: the creator of the video includes the segment showing Earhart marrying Putnam to enable viewers to see Earhart as a "real" person, someone who is approachable.
• Option A is incorrect; although Putnam aided Earhart's endeavors, this clip shows her marriage to him, not their financial arrangements.
• Option C is incorrect; the clip shows only one relationship, not Earhart spending a lot of time fostering close relationships.
• Option D is incorrect; the clip does not show Earhart influencing others to take on challenges.
Taken from http://practice.parcc.testnav.com/#

  17. EBSR Part B Taken from http://practice.parcc.testnav.com/#

  18. Part B Item Advances Part B of this Evidence-Based Selected-Response question is innovative from past approaches in that it allows students to provide an additional example of the purpose of the segment mentioned in Part A. On operational tests, students will be able to click on each option and be directed to the exact moment indicated by the time stamp at the end of each choice, thus showing an innovation in the use of technology. Taken from http://practice.parcc.testnav.com/#

19. Part B Answer Choice Rationales
• Option A is the correct answer. This part of the video shows Earhart trying to be like others and "fit in."
• Option B is incorrect; showing Earhart sitting in the cockpit with her flying partners does not replicate an experience that would make her like other people, as being a female pilot was unusual during her time.
• Option C is incorrect; showing Earhart in a parade, signing on to write a book, etc., does not show everyday activities that would make her seem like other people.
• Option D is incorrect; not every day is someone greeted by a crowd after flying solo across the Atlantic, so this does not make Earhart like other people.
Taken from http://practice.parcc.testnav.com/#

20. Measuring Vocabulary
• The PARCC ELA/Literacy assessments will focus exclusively on measuring Tier II vocabulary.
• Items measuring vocabulary interpretation and use must provide sufficient context for students to determine the meaning of the word (or phrase).
• Several styles for presenting vocabulary words/phrases are viable. In considering which means is best for presenting the vocabulary words/phrases, item writers should use the means that most efficiently directs students to the word/phrase in the text, while allowing students to see the full relevant context for determining the meaning of the word/phrase.
• Items measuring vocabulary should not use qualifiers (e.g., best/most likely).
Taken from http://www.parcconline.org/sites/parcc/files/Item%20Guidelines%20%2004%2025%202013%20Version%208_0.pdf

21. Measuring Vocabulary
• At G9/10 and G11, RI4.3, RH4.3, and RST4.3 reference the "technical meaning of words and phrases in a text" (as opposed to figurative or connotative meanings). Technical words in this context are academic words that take on specific meanings when used in ways relevant to a given "technical" subject (i.e., where the line between domain-specific and academic words is blurry).
• For example, the word "evaluate" is typically an "academic" word, but in the phrase "evaluate this algebraic function," the word takes on a very technical meaning (i.e., use the algebraic function to solve a problem).
• Items should not use negative distractors.
• Distractors should always be syntactically plausible. This is essential in vocabulary items.
Taken from http://www.parcconline.org/sites/parcc/files/Item%20Guidelines%20%2004%2025%202013%20Version%208_0.pdf

22. Measuring Vocabulary
When writing vocabulary items, use the following formats (see the sketch below):
• Part A: "What is the meaning of the word XXX as it is used in [paragraph 13, or line, or text]?"
• Part B: "Which of the [details, words, or information] from the excerpt helps the reader understand the meaning of XXX?"
• Use these formats unless referencing the word in this way creates a problem for the item; in that case, the item may require the use of a text-box approach.
Taken from http://www.parcconline.org/sites/parcc/files/Item%20Guidelines%20%2004%2025%202013%20Version%208_0.pdf
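The Part A and Part B stem formats above can be treated as simple string templates when drafting items. A minimal sketch follows; the helper name, parameters, and example word are hypothetical, and only the quoted stem wording comes from the guidelines.

```python
def vocabulary_stems(word: str, location: str, evidence_type: str = "details") -> dict:
    """Build Part A and Part B stems for a vocabulary EBSR item (sketch)."""
    part_a = f"What is the meaning of the word {word} as it is used in {location}?"
    part_b = (f"Which of the {evidence_type} from the excerpt helps the reader "
              f"understand the meaning of {word}?")
    return {"Part A": part_a, "Part B": part_b}

# Example with a hypothetical target word and location
stems = vocabulary_stems("solicitous", "paragraph 13")
print(stems["Part A"])
print(stems["Part B"])
```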

23. Work Session # 1: EBSR Item Alignment
Locate Work Session # 1 and the ELA CCSS for Grade 8. Directions:
• Read the two text selections, one from New Found Land and one from The Journals of Lewis and Clark.
• Review the EBSR Sample Item.
• Use the CCSS for Literacy in History/Social Studies to determine what standards this item measures.

24. TECR Items: Fixed Elements
Operational items developed using the TECR Reading model must possess the following characteristics:
• Delivered and responded to using technology, allowing for a variety of technology-enhanced student responses, including but not limited to the following: drag and drop; highlighting the text; annotating the text; other negotiated methods.
• Provide ample time for students to read carefully and, in some cases, to re-read passages, giving students time for close analysis and study of the text(s). Each item should be designed to take the average student at that grade level no more than 2 minutes to complete.
Taken from http://www.parcconline.org/sites/parcc/files/Item%20Guidelines%20%2004%2025%202013%20Version%208_0.pdf

  25. TECR Items Variable Elements • When a TECR uses an EBSR structure with Part A (measuring one or more of standards 2-9) and Part B (measuring standard 1), use the same rules as applied for EBSR. • For other TECR items, partial credit may be offered when an item allows for partial comprehension of the texts to be demonstrated. Taken from http://www.parcconline.org/sites/parcc/files/Item%20Guidelines%20%2004%2025%202013%20Version%208_0.pdf

26. TECR Items: Best Practices for Distractors
• The primary purpose of a distractor is to provide evidence that a student is not able to meet the standard(s) assessed due to student misconceptions.
• Distractors must be plausible responses to item stems.
• Items should not use negative distractors.
• The written style of all distractors in an item should be similar to that of the correct response(s), but need not be "perfectly parallel" in length, grammatical function, or use of punctuation.
Taken from http://www.parcconline.org/sites/parcc/files/Item%20Guidelines%20%2004%2025%202013%20Version%208_0.pdf

27. TECR Items: Best Practices for Distractors
• The content of the distractors, rather than the parallelism of style, is the primary focus for distractor choices.
• Answer responses (distractors) are not ordered alphabetically by first word, from short to long, etc. They may be ordered in any sequence appropriate to the content measured by the specific item.
• If answer responses are quotations from extended passages, it is suggested to include the paragraph from which the quote was taken. For example: "It is hard to fight with one's heart's desire; whatever it wishes to get, it purchases at the cost of soul." (Paragraph 2)
Taken from http://www.parcconline.org/sites/parcc/files/Item%20Guidelines%20%2004%2025%202013%20Version%208_0.pdf

  28. TECR Part A Taken from http://practice.parcc.testnav.com/#

  29. Part A Item Advances The skill of rereading carefully to analyze the development of characters in literary texts is important for college and career readiness. This Evidence-Based Selected-Response question allows students to analyze an aspect of a character that is revealed by his actions. In Part A of the item, students are asked to select the character description that is illustrated by lines 38 – 45 of the poem. In Part B of the item, students are asked to find an additional example of the same character trait in another part of the poem. Taken from http://practice.parcc.testnav.com/#

30. Part A Answer Choice Rationales
• The correct response is Option C. Daedalus's tears while fitting his son with wings, Daedalus's last kiss, and Daedalus's careful taking of the lead: all these actions show the father as a caring parent.
• Option A is incorrect; although it is likely that, as a caring parent, Daedalus worries whether the wings will work, the focus of lines 38-45 is his love for his son.
• Option B is incorrect; the opening lines of the poem show that Daedalus is eager to leave Crete, so it is not possible that his tears in lines 38-45 result from departing the island.
• Option D is incorrect; Daedalus's actions in lines 38-45 show solicitousness and care, rather than a sense of pride in the accomplishment of flight.
Taken from http://practice.parcc.testnav.com/#

  31. TECR Part B Taken from http://practice.parcc.testnav.com/#

  32. Part B Item Advances The use of technology in Part B helps broaden the scope of this test question. Part B of this “application” Evidence-Based Selected-Response question illustrates one of the key shifts in the CCSS assessment: use of textual evidence. Students are asked, in Part A, to determine an important aspect of a character, and in Part B they are asked to apply this understanding by locating other places in the text that illustrate the same aspect, taking a challenging new direction in assessment.

33. Part B Answer Choice Rationales
• Options 4 and 6 are the correct responses. Each of these citations shows Daedalus engaging in actions that illustrate his love for his son: the first shows his efforts to keep his son safe; the second shows his grief at his son's death.
• Options 1, 2, 3, 5, and 7 are quotations from the poem that move the action forward but do not reveal Daedalus as a caring father.
• In Option 1, Daedalus explains his plan to fly out of captivity.
• Option 2 gives details about the construction of the wings.
• Option 3 shows Daedalus trying out his own wings.
• Option 5 reveals Icarus in distress.
• Option 7 reveals different aspects of Daedalus's character.
Taken from http://practice.parcc.testnav.com/#

34. Creating a Writing Prompt: 1. Fixed Elements
Operational items developed using the Prose Constructed Response item model must possess the following characteristics:
• Provide ample time for students to read carefully and, in some cases, to re-read passages to give students time for close analysis and study of the text(s).
• Without exception, visibly align questions/tasks with specific Standards; that is, the actual language of the Standards must be used in the prompts/questions.
• Elicit evidence(s) supporting the Sub-Claim for Written Expression and the Sub-Claim for Conventions and Knowledge of Language.
• Elicit evidence(s) aligned with at least one Reading Standard (even when not scored for a sub-claim associated with the Major Claim for Reading Complex Text).
• Allow students to earn partial credit.
Taken from http://www.parcconline.org/sites/parcc/files/Item%20Guidelines%20%2004%2025%202013%20Version%208_0.pdf

35. Creating a Writing Prompt: 1. Fixed Elements
Operational items developed using the PCR item model must possess the following characteristics:
• Establish a means for scoring student performance.
• Establish a clear purpose for writing, modeling language found in the Writing Standards.
• Specify the audience to be addressed.
• State clearly the topic, issue, or idea to be addressed.
• Reference the source text (or texts) serving as the stimulus (or stimuli) for a student response.
• Specify the desired form or genre of the student response.
Taken from http://www.parcconline.org/sites/parcc/files/Item%20Guidelines%20%2004%2025%202013%20Version%208_0.pdf

36. Creating a Writing Prompt
In addition, prose constructed-response items must provide all students the opportunity to demonstrate a full range of sophistication and nuance in their responses. In other words, prompts must be neither so easy and bland that they elicit superficial writing, nor so challenging that only a small percentage of students are likely to grapple with the prompt effectively. Rather, prose constructed-response items must be designed to elicit meaningful responses on aspects of a text that may be discussed tangentially or in great detail and elaboration, thereby enabling measurement of the full range of student performance.
Taken from http://www.parcconline.org/sites/parcc/files/Item%20Guidelines%20%2004%2025%202013%20Version%208_0.pdf

  37. Writing Forms • Locate PARCC Writing Forms in the PARCC section of the binder.

  38. PCR Example Taken from http://practice.parcc.testnav.com/#

39. Work Session # 2: Create an EBSR Item
Locate Work Session # 2. Directions:
• Read the two text selections from Night.
• Review the EBSR item guidelines and sample EBSR items.
• With your group, write an EBSR item aligned to an RI or RH standard. Rewrite the item on a sheet of chart paper and hang it on the wall closest to your table.

40. Resources
• Common Core Website: www.corestandards.org
• MDE Curriculum Website: www.mde.k12.ms.us/ci
• MDE Common Core Website: www.mde.k12.ms.us/ccss
• MDE Assessment Website: www.mde.k12.ms.us/osa
• MDE SharePoint Website: https://districtaccess.mde.k12.ms.us
• PARCC Website: www.parcconline.org

41. Contact Information
Office of Curriculum and Instruction, 601.359.2586, commoncore@mde.k12.ms.us
• Nathan Oakley, Director of Curriculum: noakley@mde.k12.ms.us
• Dr. Marla Davis, Mathematics: mdavis@mde.k12.ms.us
• Vincent Segalini, English/Language Arts: vsegalini@mde.k12.ms.us
• Chauncey Spears, AP/Gifted/Social Studies: crspears@mde.k12.ms.us
• Robin Lemonis, Early Childhood Literacy/Dyslexia/RtI: rlemonis@mde.k12.ms.us
