Measuring teachers' use of formative assessment


Measuring teachers' use of formative assessment: 

A learning progressions approach

Brent Duckor (SJSU)

Diana Wilmot (PAUSD)

Bill Conrad & Jimmy Scherrer (SCCOE)

Amy Dray (UCB)



Why formative assessment?

  • Black and Wiliam (1998) report that studies of formative assessment (FA) show an effect size on standardized tests between 0.4 and 0.7, far larger than most educational interventions and equivalent to approximately 8 months of additional instruction (a quick percentile translation of this range appears after this list);

  • Further, FA is particularly effective for low achieving students, narrowing the gap between them and high achievers, while raising overall achievement;

  • Enhancing FA would be an extremely cost-effective way to improve teaching practice;

  • Unfortunately, they also find that FA is relatively rare in the classroom, and that most teachers lack effective FA skills.
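To make those effect sizes concrete, here is a minimal sketch (not from the deck; it assumes normally distributed test scores) that translates a gain of d standard deviations into percentile terms:

```python
from math import erf, sqrt

def percentile_after_gain(d: float) -> float:
    """Percentile reached by an average (50th-percentile) student after a
    gain of d standard deviations, using the normal CDF:
    Phi(d) = 0.5 * (1 + erf(d / sqrt(2)))."""
    return 100 * 0.5 * (1 + erf(d / sqrt(2)))

for d in (0.4, 0.7):  # Black and Wiliam's reported range of effect sizes
    print(f"effect size {d}: 50th -> {percentile_after_gain(d):.0f}th percentile")
```

Under that assumption, the reported range moves an average student from the 50th to roughly the 66th to 76th percentile.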


Measuring Effective FA Practice: Toward a Cycle of Psychometric Inquiry

  • Define constructs in multi-dimensional space

  • Design items and observation protocols

  • Iterate the scoring strategy in alignment with construct maps

  • Apply measurement model to validate teacher and item-level claims and to warrant inferences about effective practice


Teachers who practice assessment for learning know and can:

  • understand and articulate in advance of teaching the achievement targets that their students are to hit;

  • inform their students about those learning goals, in terms that students understand;

  • translate classroom assessment results into frequent descriptive feedback for students, providing them with specific insights as to how to improve;

  • continuously adjust instruction based on the results of classroom assessments.



FA 1.0

Research suggests



FA 2.0

Good formative feedback

  • Types of feedback (Butler, 1998; Butler & Neuman, 1995; Hattie & Timperley, 2007; Kluger & DeNisi, 1996);

  • Level of specificity and task relatedness (Tunstall & Gipps, 1996; Ames, 1992; Dweck, 1986);

  • The “next steps” required of students (Butler & Neuman, 1995) can influence the effectiveness of formative assessment on classroom learning.


The Zone of Study


Measuring teachers' use of formative assessment

[Diagram: the cycle of Construct Map, Items Design, Outcome Space, and Measurement Model, linked by Reliability and Validity checks]



Phase 1: Construct Mapping


Knowledge of formative assessment

[Construct map: formative assessment questioning moves (Pose, Pause, Probe, Bounce, Tag, Bin), spanning propositional to strategic knowledge]


FA 3.0

Mapping a Route …

[Figure: a route from novice teaching to expert teaching]


Skills Allocation at “Soliciting Responses” Level


Measuring teachers' use of formative assessment

[Diagram repeated: the cycle of Construct Map, Items Design, Outcome Space, and Measurement Model, linked by Reliability and Validity checks]



Phase 2: Items Design

  • Align items (tasks, prompts, scenarios) with the levels on the construct map (a blueprint sketch follows this list).

  • Build items that map onto a refined set of formative assessment practices.

  • Consider various item types and delivery platforms.

  • Review an example of a simulated scenario with a focus on turns of talk.

  • Consider content and construct validity, as well as inter-rater reliability in construction and use of items.
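The blueprint sketch referenced above: a minimal illustration of recording, for each item, the construct-map level and delivery platform it targets, so coverage of every level can be checked. All item IDs, level names, and platform labels here are hypothetical.

```python
# Hypothetical item blueprint: each entry ties one item to a construct-map
# level, an item type, and a delivery platform (all names illustrative).
item_blueprint = [
    {"item_id": "AVS-01", "level": "Soliciting responses",
     "item_type": "adaptive virtual scenario", "platform": "web video"},
    {"item_id": "PP-07", "level": "Posing questions",
     "item_type": "constructed response", "platform": "paper and pencil"},
    {"item_id": "LS-03", "level": "Soliciting responses",
     "item_type": "lesson-study observation", "platform": "classroom task"},
]

def items_at_level(blueprint: list[dict], level: str) -> list[dict]:
    """Return the items aligned to one construct-map level."""
    return [item for item in blueprint if item["level"] == level]

print([i["item_id"] for i in items_at_level(item_blueprint, "Soliciting responses")])
# ['AVS-01', 'LS-03']
```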



Consider the context: The Lesson Cycle

  • Initial framing of a question within a high-level (HL) task.

  • Study the enactment of formative assessment within a science classroom engaged in HL tasks in ecology.


The Evolution of a Task

[Figure: the enacted task]

Stein, Remillard, & Smith (2007)


Delivery Platforms for Data Collection

  • Traditional “paper and pencil” questions

  • Classroom-based “authentic” tasks

    • Lesson planning/enactment/reflections

    • Lesson study of best formative assessment practices

  • Innovative adaptive virtual scenarios (AVS)

    • Video episodes

    • Web-based virtual platforms


Adaptive Virtual Scenarios

  • Ability to share a range of novice through expert formative assessment practice in video format.

  • Pause videos throughout teacher enactment to measure the teacher's level of sophistication.


Potential Items within Simulated Scenarios

  • Pause video and ask the teacher:

    • How would you characterize this initial move?

    • What do you notice about the teacher's questioning strategy?

    • How might you compare one questioning strategy with another?

    • What would you do next?

  • Replay the video and show the teacher's follow-up moves.

  • Pause and ask the teacher what they think:

    • How would you negotiate this student response?

    • What kind of question would you pose next?



Sample of teacher responses

  • Initial move was “literal”, “more open-ended”.

  • Teacher’s response to students was:

    • “too directed towards the right answer”,

    • “thoughtfully provoking misconceptions”,

    • “open enough to provoke deeper conceptual knowledge across the classroom”

  • I would have asked a question like, “…[a question that uses student thinking as a basis]”



Administration of tools

  • Administer tools with the widest range of teachers possible:

    • Pre-service

    • Induction years

    • Veteran (5-9 years)

    • Veteran (10+ years)

  • Use adaptive technology to capture each teacher's zone of proximal development in formative assessment practice

  • Examine the relationship (if any) between scores on multiple tools


Measuring teachers' use of formative assessment

[Diagram repeated: the cycle of Construct Map, Items Design, Outcome Space, and Measurement Model, linked by Reliability and Validity checks]


Phase 3: Defining the Outcome Space

  • Link each item response back to the levels on the construct map.

  • Use scoring guides to capture the granularity of formative assessment practices.

  • Consider various types, e.g., rubrics, observation protocols, and coding schemes.

  • Provide exemplars of practice to assist in scoring protocols.

  • Consider content and construct validity, as well as inter-rater reliability in construction and use of scoring guides.
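Since inter-rater reliability comes up at this phase, here is a minimal sketch of Cohen's kappa for two raters applying the same scoring guide; the rater data below are hypothetical.

```python
from collections import Counter

def cohens_kappa(rater_a: list, rater_b: list) -> float:
    """Cohen's kappa: chance-corrected agreement between two raters who
    assign categorical scores to the same set of responses."""
    n = len(rater_a)
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Agreement expected by chance if each rater scored at their base rates.
    p_expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)

# Two raters scoring ten teacher responses against construct-map levels.
rater_1 = ["L2", "L3", "L1", "L2", "L4", "L2", "L3", "L0", "L2", "L3"]
rater_2 = ["L2", "L3", "L2", "L2", "L4", "L2", "L3", "L0", "L1", "L3"]
print(f"kappa = {cohens_kappa(rater_1, rater_2):.2f}")  # kappa = 0.72
```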


Problem of Practice: Formative Assessment

  • Teachers have difficulty scaffolding student thinking and reasoning through discourse

  • As a result, the cognitive demand of a task often declines during implementation (e.g., TIMSS, QUASAR)


A Tool for Measuring Teachers’ Use of Formative Assessment

Scherrer & Stein (In Press)

Initiate participation in classroom discussions

Respond to students’ contributions during a discussion



An Example of How to Apply the Codes in Context

  • Teacher: What can you tell me about this shape? [Launch]

  • Juan: It has 4 right angles.

  • Teacher: What else can you tell me? [Collect]

  • Kayla: It is a rectangle.

  • Teacher: Okay, a rectangle. Why do you think it is a rectangle? [Repeat, Uptake]

  • Kayla: It has 4 sides.

  • Teacher: Are all shapes that have 4 sides rectangles? [Uptake-Literal]

  • Yasmin: It could also be a quadrilateral.

  • Teacher: Wait. I am asking if all shapes with 4 sides are rectangles. [Terminal, Reinitiate]

(Codes in brackets mark each teacher move.)



Scoring the Codes

  • Teacher: What can you tell me about this shape? [Launch, +1]

  • Juan: It is a quadrilateral.

  • Teacher: What else can you tell me? [Collect, +0]

  • Kayla: It is a rectangle.

  • Teacher: Okay, what else? [Collect, +0]

  • Yasmin: It has four right angles.

  • Teacher: Okay, what about this shape over here? What can you tell me about this one? [Launch, +1]

In this example, the teacher did not “do” anything with the student responses.



Scoring the Codes

  • Teacher: What can you tell me about this shape? [Launch, +1]

  • Juan: It is a quadrilateral.

  • Teacher: What else can you tell me? [Collect, +1]

  • Kayla: It is a rectangle.

  • Teacher: Okay, Juan said this shape is a quadrilateral, and Kayla said it is a rectangle. What is similar about quadrilaterals and rectangles? [Connect, +2]

In this example, the teacher asked an open-ended question, gathered an additional response to that question, and then connected the two responses.
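A minimal sketch of scoring a coded episode. Note that the slides above show different point values for Collect across the two examples, so the point table below is a deliberately simplified assumption, not the published Scherrer & Stein weights.

```python
# Hypothetical point values per code; Launch (+1) and Connect (+2) match
# the worked examples above, the remaining weights are assumptions.
POINTS = {
    "Launch": 1, "Collect": 0, "Connect": 2, "Uptake": 2,
    "Repeat": 0, "Literal": 0, "Terminal": 0, "Reinitiate": 0,
}

def score_episode(codes: list[str]) -> int:
    """Sum the point values over a coded sequence of teacher moves."""
    return sum(POINTS[code] for code in codes)

print(score_episode(["Launch", "Collect", "Collect", "Launch"]))  # 2
print(score_episode(["Launch", "Collect", "Connect"]))  # 3 under this table
```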



Connecting codes to response levels on the construct map

Response levels and scoring designations:

  • L4: Launch, Collect, Connect, Uptake

  • L3: Launch, Collect, Uptake

  • L2: Launch, Collect, Literal

  • L1: Launch, Literal, Literal

  • L0: Literal, Literal, Literal
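One possible decision rule consistent with the five exemplar sequences above, sketched as code; the rule itself is an assumption, since the deck shows only the exemplars.

```python
def response_level(codes: list[str]) -> str:
    """Map a coded episode to a construct-map response level (L0-L4),
    cueing on the most sophisticated moves present in the episode."""
    present = set(codes)
    if {"Connect", "Uptake"} <= present:
        return "L4"  # e.g. Launch, Collect, Connect, Uptake
    if "Uptake" in present:
        return "L3"  # e.g. Launch, Collect, Uptake
    if "Collect" in present:
        return "L2"  # e.g. Launch, Collect, Literal
    if "Launch" in present:
        return "L1"  # e.g. Launch, Literal, Literal
    return "L0"      # e.g. Literal, Literal, Literal

print(response_level(["Launch", "Collect", "Uptake"]))  # L3
```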


Measuring teachers' use of formative assessment

[Diagram repeated: the cycle of Construct Map, Items Design, Outcome Space, and Measurement Model, linked by Reliability and Validity checks]


Phase 4: Applying Measurement Models

  • Cross-reference qualitative construct maps with empirically calibrated Wright Maps using IRT.

  • Employ person and item statistics to check model fit.

  • Consider various types of measurement models, including facets models.

  • Provide individual- and group-level data on progress.

  • Consider “internal structure” validity evidence for construct maps, in addition to checks on reliability of tools.
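To show the kind of model involved, here is a minimal Rasch-model sketch; the item names and difficulty values are hypothetical, and a real calibration would estimate them from teacher response data with standard IRT software.

```python
import math

def rasch_p(theta: float, b: float) -> float:
    """Rasch model: probability that a teacher at proficiency theta (logits)
    succeeds on an item of difficulty b on the same logit scale."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Hypothetical item difficulties, ordered like a learning progression:
# posing open questions is easier than connecting student responses.
items = {"pose": -1.0, "probe": 0.0, "bounce": 0.6, "connect": 1.2}

# Persons and items sit on one shared logit scale: the idea behind a Wright Map.
for theta in (-1.0, 0.0, 1.5):
    probs = {name: round(rasch_p(theta, b), 2) for name, b in items.items()}
    print(f"theta = {theta:+.1f}: {probs}")
```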



Berkeley Evaluation And Assessment Research Center

Item response theory can model a “learning progression” within a particular domain. For example:

  • KSC: Knowledge of science content

  • KST: Knowledge of student thinking

  • KFA: Knowledge of formative assessment


Berkeley Evaluation And Assessment Research Center

[Figure: modeled learning progression for KSC, KST, and KFA]



Case study: Developing an integrated assessment system (DIAS) for teacher education

Pamela Moss, University of Michigan

Mark Wilson, University of California, Berkeley

The goal is to develop an assessment system that:

  • focuses on teaching practice grounded in professional and disciplinary knowledge as it develops over time;

  • addresses multiple purposes of a broad array of stakeholders working in different contexts; and

  • creates the foundation for programmatic coherence and professional development across time and institutional contexts.



Case study: DIAS

The research team identified the ways in which student teachers can learn to use formative and summative assessment to guide their students’ learning.

  • Developed a construct map that outlined a progression of learning in “Assessment.”

  • Described the different aspects of “Assessment,” such as:

    • Identifying the mathematical target to be assessed;

    • Understanding the purposes of the assessment;

    • Designing appropriate and feasible tasks (such as end of class checks);

    • Developing accurate inferences about individual student and whole class learning.



Case study: DIAS

  • The research team collected data from student teachers enrolled in the elementary mathematics teacher education program at the University of Michigan.

  • Designed scoring guides based on our construct map.

  • Coded videotapes from over 100 student teachers as they conducted lessons in the classroom.

  • Coded associated data, such as lesson plans and reflections, since these documents contain information about what the student teachers hope to learn from the assessment(s) and what they infer about the students in their classroom.

  • Is using item response methods to determine which aspects of assessment practice are easier or more difficult for the student teachers, thereby informing the teacher education program.



Synopsis

  • Incredible partnership

  • Filling an important educational research space

  • Identified the assessment space

  • Focus on the content

  • Emphasis on student thinking

  • Contributions to instrumentation and methodology

  • Marrying qualitative and quantitative data using an IRT framework

  • Next steps: pilot study



Contact Information

Bill Conrad & Jimmy Scherrer
Santa Clara County Office of Education
[email protected]
[email protected]
408-453-4332 (Office)
510-761-2007 (Cell)

Brent Duckor, Ph.D.
Assistant Professor, College of Education
San Jose State University
[email protected]

Diana Wilmot, Ph.D.
Coordinator, Research & Evaluation
Palo Alto Unified School District
[email protected]

Amy Dray, Ph.D.
UC Berkeley Graduate School of Education
[email protected]

