
MODULE 3

[Diagram: the three stages of Backward Design (1st, 2nd, 3rd)]



Learning Objectives

  • What is the purpose of doing an assessment?

  • How to determine what kind of evidence to look for?

  • What kind of methods can be used? When?

  • How to create assessment tasks and evaluation criteria?

  • How to make sure the assessment is valid and reliable?


Why assess?

The purpose of assessment is to measure understanding, not to generate grades!

Provides the professor with:

  • Reliable information for making inferences about student learning

  • Feedback to improve their teaching methods

Provides students with:

  • Feedback on how well they understand the content

  • Feedback to improve their learning


How to create assessments?

1. Assessment Objectives

2. Evidence of Learning

3. Assessment

4. Evaluation Criteria

5. Validity and Reliability


Assessment Objectives

1

[Diagram: three nested circles of content priorities]

  • Worth to be familiar with: superficial knowledge

  • Important to know and do

  • Big Ideas / Core Concepts


Evidences of Learning

2

EVIDENCE refers to something that can be DEMONSTRATED! It is related to the ABILITY to do something.

  • Worth to be familiar with (superficial knowledge): know concepts and definitions (micro, descriptive)

  • Important to know and do (Big Ideas): ability to apply a specified framework to contexts approached in class (micro, domain-specific)

  • Core Concepts: ability to transfer knowledge to different contexts (macro, across domains, multi-disciplinary)


Evidences of Learning

2

Bloom 2001

[Pyramid: levels of understanding, from highest to lowest]

  • Judge the results of applying concepts and make a decision about the quality of the application

  • Apply concepts to situations different from the ones approached in class; create a new application or interpretation of the concepts

  • Break concepts into parts and understand their relationships

  • Apply concepts to situations similar to the ones approached in class

  • Summarize ideas, explain concepts

  • Recall definitions


Assessment

3

  • Assessment tasks

  • When to assess?

  • Which method?


When to assess?

3

Snapshot vs. Photo Album

  • Snapshot: summative only

  • Photo album: formative + summative


Formative and Summative Assessment

3

Both are necessary! At least 50% of each!

Formative:

  • The objective is to give students feedback

  • Builds learning

  • Students can adjust

Summative:

  • More focused on the grade

  • Comes at the end of the grading period; there is no opportunity to adjust and show improvement

A combination of both leads to a good result!

[Diagram: a sequence of assessments over time, e.g. F, S, F]


Continuous Assessment

3

Different Moments and Different Methods!!


Assessment Tasks

3

  • Worth to be familiar with (superficial knowledge): traditional quizzes and tests

    • Paper-and-pencil

    • Multiple-choice

    • Constructed response

  • Important to know and do (Big Ideas, Core Concepts): performance tasks and projects

    • Complex

    • Open-ended

    • Authentic

Adapted from “Understanding by Design”, Wiggins and McTighe


Assessment Tasks

3

Bloom 2001

[Pyramid: assessment tasks matched to the levels of understanding, from highest to lowest]

  • Judge the result of an analysis and make a decision (pros vs. cons, costs vs. benefits, reflection): complex performance task

  • Apply to new contexts and situations, create an artifact or project: authentic tasks

  • Analyze: analytical tasks (experiments, scenario simulations, cases)

  • Straightforward application: simple performance tasks, exercises

  • Open-ended questions

  • Ask about definitions: quizzes and traditional tests


Authentic Task

3

A task that reflects possible real-world challenges.

It is a performance-based assessment!

  • Is realistic and contextualized

  • Replicates key challenging real-life situations

  • Requires judgment and innovation

  • Students are asked to “do” the subject

  • Assesses students’ ability to integrate concepts and ideas

  • Gives the opportunity to practice and get feedback

It is problem-based, NOT an exercise!

From “Understanding by Design”, Wiggins and McTighe


Authentic Task vs. Exercise

3

Authentic Task:

  • The question is “noisy” and complicated

  • Various approaches can be used

  • Requires integration of concepts and skills

  • There is an appropriate solution

  • Arguments are what matter

  • Out of class, summative

Exercise:

  • There is a right approach

  • There is a right solution and answer

  • Accuracy is what matters

  • In class, formative

From “Understanding by Design”, Wiggins and McTighe


How to formulate an Authentic Task?

3

  • Goal: What is the goal of the task? What is the problem that has to be solved?

  • Role: What is the student’s role? What will students be asked to do?

  • Audience: Who is the audience? Who is the client? Whom do students need to convince?

  • Situation: What is the situation or the context? What are the challenges involved?

  • Performance

  • Standards

From “Understanding by Design”, Wiggins and McTighe


Evaluation Criteria

4

Must…

  • Provide feedback for students

  • Be clear

  • Be communicated in advance

  • Consist of independent variables

  • Focus on the central cause of performance

  • Focus on the understanding and use of the Big Idea


Types of Evaluation Criteria

4

  • Check list

  • Rubric


Check List

4

There are two types of check lists:

1. A list of questions and their correct answers

2. A list of individual traits with the maximum points associated with each of them


Check List: Questions and answers

4

This type is used for multiple-choice, true/false, etc. In other words, where there is a correct answer.

Example answer key:

  1. A

  2. C

  3. D

  4. B

  5. B

  6. D
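A check list of this type maps directly to code. The sketch below is a hypothetical illustration, not part of the module: the function name, question numbers, key, and the student's responses are all invented. It simply counts how many responses match the answer key.

```python
# Hypothetical sketch: grading a multiple-choice quiz against an answer key.

def grade_quiz(answer_key, responses):
    """Return (score, total): how many responses match the key."""
    score = sum(1 for q, correct in answer_key.items()
                if responses.get(q) == correct)
    return score, len(answer_key)

# Invented example data (question number -> correct letter).
answer_key = {1: "A", 2: "C", 3: "D", 4: "B", 5: "B", 6: "D"}
student    = {1: "A", 2: "C", 3: "B", 4: "B", 5: "B", 6: "D"}

score, total = grade_quiz(answer_key, student)
print(f"{score}/{total}")  # 5/6
```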


Check List: Traits and their value

4

Performance is decomposed into traits, each with a weight:

  • Trait 1: weight (%) or points

  • Trait 2: weight (%) or points

  • Trait ...: weight (%) or points

Grade = weighted average, or Grade = sum of points
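The two grading rules above can be sketched in a few lines of Python. This is a hypothetical illustration (the trait names, weights, and scores are invented), assuming per-trait scores on a 0-10 scale for the weighted-average rule.

```python
# Sketch of the two grading rules for a trait-based check list.

def weighted_average(scores, weights):
    """Grade = weighted average of per-trait scores (weights in % or points)."""
    total_weight = sum(weights.values())
    return sum(scores[t] * weights[t] for t in weights) / total_weight

def sum_of_points(points):
    """Grade = plain sum of the points earned on each trait."""
    return sum(points.values())

# Invented example: three traits weighted 40/35/25 percent.
weights = {"ideas": 40, "organization": 35, "grammar": 25}
scores  = {"ideas": 8.0, "organization": 6.0, "grammar": 9.0}

print(weighted_average(scores, weights))  # 7.55
print(sum_of_points({"ideas": 30, "organization": 25, "grammar": 20}))  # 75
```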


Analytic Rubric is better

4

  • Provides more detailed feedback for students

  • Provides students with information about how they will be evaluated

  • Is clearer

  • Evaluates independently each characteristic that composes performance

On the other hand…

  • A holistic rubric is used when only an overall impression is required


Analytic Rubrics

4

How to create them?


How to create Analytical Rubrics?

4

Example: a simple rubric to evaluate an essay

  • Traits: Ideas, Organization, Grammar

  • Levels of achievement: Excellent, Satisfactory, Poor


It can be created from a Check List!

4

The difference is that each trait is broken down into levels of achievement, each with a detailed description!

  • Trait 1, weight (%) or points: Excellent / Acceptable / Unacceptable

  • Trait 2, weight (%) or points: Excellent / Acceptable / Unacceptable

  • Trait ..., weight (%) or points: Excellent / Acceptable / Unacceptable
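One way to picture the check-list-to-rubric step is to score an analytic rubric in code: each trait keeps its check-list weight, and the level of achievement reached converts that weight into partial credit. Everything below (the function name, the level-to-credit fractions, the weights) is an invented illustration, not something the module prescribes.

```python
# Hypothetical sketch: scoring an analytic rubric derived from a check list.

# Invented credit fractions for each level of achievement.
LEVEL_CREDIT = {"Excellent": 1.0, "Acceptable": 0.6, "Unacceptable": 0.0}

def rubric_grade(weights, achieved_levels):
    """Grade = sum over traits of (trait weight * credit for the level reached)."""
    return sum(weights[trait] * LEVEL_CREDIT[level]
               for trait, level in achieved_levels.items())

# Invented example: three traits worth 50/30/20 points.
weights = {"Trait 1": 50, "Trait 2": 30, "Trait 3": 20}
levels  = {"Trait 1": "Excellent", "Trait 2": "Acceptable", "Trait 3": "Unacceptable"}

print(rubric_grade(weights, levels))  # 68.0
```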


How to define traits?

4

Traits can be defined based on experience or on historical data:

1. Get samples of students’ previous work

2. Classify the samples into different levels (strong, middle, poor…) and write down the reasons

3. Cluster the reasons into traits

4. Write down the definition of each trait

5. Select among the samples the ones that illustrate each trait

6. Continuously refine the traits’ definitions

Traits can also be defined based on specific objectives and learning questions.

From “Understanding by Design”, Wiggins and McTighe


How to build Analytic Rubric?

4

The following website offers a free tool that helps create rubrics:

http://rubistar.4teachers.org/index.php


Validity and Reliability

5

Validity

Reliability


Validity and Reliability

5

[Figure: a shooting target]

  • Target: desired understandings / objectives

  • Shots: assessment outcomes

http://ccnmtl.columbia.edu/projects/qmss/images/target.gif


Checking for Validity

5

Self-assess the assessment tasks by asking yourself the following questions:

  • Is it possible for a student to do well on the assessment task, but not really demonstrate the understandings you are after?

  • Is it possible for a student to do poorly, but still have significant understanding of the ideas? Would this student be able to show his understanding in other ways?

If yes, the assessment is not valid. It does not provide good evidence for making any inference.

(Note: for both questions, consider the task characteristics and the rubrics used for evaluation)

Adapted from “Understanding by Design”, Wiggins and McTighe


Checking for Validity

5

The previous questions can be broken down into more detailed ones.

How likely is it that a student could do well on the assessment by:

  • Making clever guesses based on limited understanding?

  • Plugging in what was learned, with accurate recall but limited understanding?

  • Making a good effort, with a lot of hard work, but with limited understanding?

  • Producing lovely products and performances, but with limited understanding?

  • Applying a natural ability to be articulate and intelligent, but with limited understanding?

From “Understanding by Design”, Wiggins and McTighe


Checking for Validity

5

How likely is it that a student could do poorly on the assessment by:

  • Failing to meet performance goals despite having a deep understanding of the Big Ideas?

  • Failing to meet the grading criteria despite having a deep understanding of the Big Ideas?

Make sure all the answers are “very unlikely”!!!

From “Understanding by Design”, Wiggins and McTighe


Checking for Reliability

5

Assess rubric reliability by asking:

  • Would different professors grade the same exam similarly?

  • Would the same professor give the same grade if he graded the test twice, at different moments?

Assess task reliability by asking:

  • If a student did well (or poorly) on one exam, would he do well (or poorly) on a similar exam?

Task reliability can be achieved by applying continuous assessments.

From “Understanding by Design”, Wiggins and McTighe


Summary

[Diagram: Learning Objectives translated into observable, demonstrable Evidence of Learning]


Summary

[Timeline: evidence of learning collected over time through formative assessment tasks, ending with a summative assessment task]

  • Complexity depends on the desired level of understanding

  • Clear evaluation criteria (rubrics)

  • Task and criteria must provide accurate and consistent judgments


Learning Objectives

  • What is the purpose of doing an assessment?

  • How to determine what kind of evidence to look for?

  • What kind of methods can be used? When?

  • How to create assessment tasks and evaluation criteria?

  • How to make sure the assessment is valid and reliable?


References

  • The main source of information used in this module is the following book:

    Wiggins, Grant, and McTighe, Jay. Understanding by Design. 2nd Edition. ASCD, Virginia, 2005.

  • Rubrics: http://rubistar.4teachers.org/index.php

