Models and Tools for Drawing Inferences from Student Work: The BEAR Scoring Engine

Cathleen Kennedy & Mark Wilson

University of California, Berkeley

AERA April 2005

Overview
  • Features of “complex” tasks.
  • How PADI addresses complex task features.
  • The “big” assessment picture and where inferences are drawn (in measurement models implemented in the Scoring Engine).
  • An example of the PADI and Scoring Engine views.
  • Next steps: a Wizard to guide designers in developing a solid chain of reasoning.


Example from FOSS


Example from FOSS

Two measures:
  • Physics (speed)
  • Mathematics

5 responses:
  • Equation choice
  • Fill-in Numbers
  • Fill-in Units
  • Calculate
  • Units in Answer

Are the responses dependent?


“Complex” Task Features
  • Multiple measures of interest
    • Content & inquiry
    • Multiple aspects of inquiry
  • Response dependencies
    • Common stimulus
    • Sequence of steps


Complex Measurement Requires Clear Chain of Reasoning

  • Inferences one wishes to draw (cognition vertex)
  • Evidence required to draw the inferences (interpretation vertex)
  • Observations required to generate evidence (observations vertex)

Inferences are then interpretable in the context of the purpose of the assessment.


PADI Addresses “Complex” Task Features

PADI Approach:
  • Multidimensional IRT measurement model.
  • Well-defined evaluation phases model response dependencies (rather than ignoring them).

  • Multiple measures of interest
    • Content & inquiry
    • Multiple aspects of inquiry
  • Response dependencies
    • Common stimulus
    • Sequence of steps (within task)

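The slides name the multidimensional IRT measurement model but do not write it out. As a hedged sketch, a Rasch-family multidimensional model of the kind used in BEAR work (an MRCML-style form; this particular parameterization is illustrative rather than quoted from the presentation) gives the probability that student n responds in category k of item i as:

```latex
P(X_{ik}=1 \mid \boldsymbol{\theta}_n)
  = \frac{\exp\!\left(\mathbf{b}_{ik}^{\top}\boldsymbol{\theta}_n + \mathbf{a}_{ik}^{\top}\boldsymbol{\xi}\right)}
         {\sum_{k'=1}^{K_i}\exp\!\left(\mathbf{b}_{ik'}^{\top}\boldsymbol{\theta}_n + \mathbf{a}_{ik'}^{\top}\boldsymbol{\xi}\right)}
```

Here θ_n is the vector of student proficiencies, one per student model variable (for FOSS, Physics and Mathematics); b_ik is the scoring vector saying which dimension(s) category k of item i gives evidence about; and a_ik⊤ξ carries the item difficulty parameters. Treating a response bundle as a single polytomous item in such a model is one standard way to absorb within-task dependencies, which is what the bundled physics observable in the later slides does.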

Assessment System Architecture

[Diagram] Design side: the Design Team uses the Design System to produce Task Specifications. Implementation side: the Delivery System administers tasks to Students, records responses in the Student Database, and exchanges scored data with the Scoring Engine.

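The diagram names the components but not their interfaces. A minimal sketch, under assumed names (ScoredObservation and ScoringEngine.estimate are hypothetical, and the per-dimension averaging below is only a placeholder for the real multidimensional IRT estimation), of how a delivery system might hand scored observations to a scoring engine and get proficiency estimates back:

```python
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class ScoredObservation:
    """One scored observable variable for one student (hypothetical record shape)."""
    student_id: str
    observable: str   # e.g. "math_score" or "physics_bundle"
    smv: str          # student model variable the observable gives evidence about
    score: float      # category assigned by the evaluation procedure


class ScoringEngine:
    """Stand-in for the diagram's Scoring Engine: scored observations in, estimates out."""

    def estimate(self, observations: list[ScoredObservation]) -> dict[str, dict[str, float]]:
        """Return one proficiency estimate per student model variable, per student.
        A per-SMV mean score stands in here for the multidimensional IRT
        estimation the real engine would perform."""
        totals = defaultdict(lambda: defaultdict(lambda: [0.0, 0.0]))
        for obs in observations:
            acc = totals[obs.student_id][obs.smv]
            acc[0] += obs.score
            acc[1] += 1
        return {sid: {smv: s / n for smv, (s, n) in per_smv.items()}
                for sid, per_smv in totals.items()}


# The Delivery System would call something like:
#   engine = ScoringEngine()
#   engine.estimate(scored_observations)
#   -> {"student-17": {"physics_speed": 2.25, "mathematics": 1.0}}
```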

Chain of Inferential Reasoning

[Diagram] The same architecture, annotated with the chain of reasoning: the Assessment Purpose shapes what the Design Team specifies in the Design System and Task Specifications; Assessment Evidence is gathered from Students by the Delivery System and stored in the Student Database; the Scoring Engine turns that evidence into inferences about what students know and can do.


FOSS Example: PADI View

Two student model variables:

Physics (speed)

Mathematics

6 observable variables:

Equation choice (Physics)

Fill-in Numbers (Physics)

Fill-in Units (Physics)

Calculate (Math)

Units in Answer (Physics)

Bundled physics items

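A minimal sketch of this design written down as plain data, with hypothetical identifiers; the actual PADI Design System objects (templates, activities, evaluation procedures) are much richer:

```python
# Hypothetical encoding of the FOSS design as plain data (identifiers invented here).
STUDENT_MODEL_VARIABLES = ["physics_speed", "mathematics"]

# Observable variable -> student model variable it gives evidence about.
OBSERVABLES = {
    "equation_choice": "physics_speed",
    "fill_in_numbers": "physics_speed",
    "fill_in_units":   "physics_speed",
    "calculate":       "mathematics",
    "units_in_answer": "physics_speed",
    "physics_bundle":  "physics_speed",  # derived from the four physics items below
}

# Bundle definition: the item-level observables the bundled OV is built from.
BUNDLES = {
    "physics_bundle": ["equation_choice", "fill_in_numbers",
                       "fill_in_units", "units_in_answer"],
}
```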

FOSS Example: Bundling Rules
  • Defined in PADI Design System
    • Template (task specification)
      • Activity
        • Evaluation Procedure
          • Evaluation Phase
  • Implemented in Delivery System

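The transcript does not preserve the actual bundling rules (the table on the next slide did not survive extraction), so the following is a made-up rule purely to illustrate the shape of an evaluation phase: collapsing the four physics item-level scores into one polytomous bundle score.

```python
def bundle_physics(equation_choice: int, fill_in_numbers: int,
                   fill_in_units: int, units_in_answer: int) -> int:
    """Hypothetical evaluation phase: collapse four dichotomous physics item
    scores (each 0 or 1) into a single polytomous bundle score in 0..4.
    The real FOSS rules are defined in the PADI Design System and may differ."""
    return equation_choice + fill_in_numbers + fill_in_units + units_in_answer
```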

FOSS Example: Bundling Rules


FOSS Example: Scoring Engine View

Two student model variables:
  • Physics (speed)
  • Mathematics

2 observable variables:
  • Math score
  • Bundled Physics score

“Between item” multidimensionality (MD): each observable variable provides evidence of one student model variable (SMV).

The Scoring Engine returns two proficiency estimates per student to the Assessment Delivery System.

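To make the “between item” structure concrete: each observable’s scoring vector has exactly one nonzero entry, so in matrix form (rows are the two observables, columns the two SMVs) the loading pattern is, schematically (not taken from the slide):

```latex
% rows: Math score, Bundled Physics score; columns: Physics (speed), Mathematics
\mathbf{B} = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}
```

So the Math score informs only the Mathematics SMV and the Bundled Physics score informs only the Physics SMV; “within item” multidimensionality would instead place more than one nonzero entry in a row.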

Next Steps

Develop Measurement Model Design Wizard

  • Evaluate design needs of users (how do they do it now, what would work better?)
  • Guide thinking from the “assessment purpose” standpoint
  • Align inferences, evidence and observations
