
Presentation Transcript


MCAS-Alt: Alternate Assessment in Massachusetts
Technical Challenges and Approaches to Validity

Daniel J. Wiener, Administrator of Inclusive Assessment

University of Maryland – Alternate Assessment Conference

October 11-12, 2007



Participation: Thinking Differently About Who Needs an Alternate Assessment

  • MCAS-Alt is intended for

    • Students with significant cognitive disabilities AND

    • Students who focus on attaining grade-level achievement standards, but who cannot fully demonstrate knowledge and skills on the test, even with accommodations

  • The state has aligned instruction from the lowest level of complexity to grade-level expectations

  • Implications for scoring and reporting results

    • Alternate achievement standards

    • Grade level achievement standards



Reporting Results

  • Meaningful performance levels reported for MCAS-Alt, while acknowledging performance is below grade-level expectations

  • A student can attain real proficiency through the alternate assessment based on grade-level achievement standards

Performance Levels

  • MCAS Test: Warning (Failing at Grade 10), Needs Improvement, Proficient, Advanced

  • MCAS-Alt: Awareness, Emerging, Progressing, Needs Improvement, Proficient, Advanced



MCAS-Alt: A “structured portfolio”

  • Work samples/video/photo evidence (performance) and data charts (progress) are compiled in an annual portfolio

  • Evidence shows complexity of tasks, and student’s accuracy and independence in performing tasks aligned with required subjects/strands/standards

Data chart: % Accuracy and % Independence (0–100%) recorded for 12/1/06 through 12/5/06

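The data chart above tracks percent accuracy and percent independence across instructional dates. As a rough illustration of how such a chart might be summarized, here is a minimal Python sketch; the field names, sample values, and the simple averaging rule are assumptions for illustration, not the Department’s documented procedure.

```python
# Minimal sketch (assumed structure): summarizing a student's data chart.
# Field names and sample values are illustrative only.
from statistics import mean

# One entry per dated point on the chart: percent of responses that were
# accurate, and percent given independently (without cues, prompts, or assistance).
data_chart = [
    {"date": "12/1/06", "accuracy": 40, "independence": 20},
    {"date": "12/2/06", "accuracy": 60, "independence": 40},
    {"date": "12/3/06", "accuracy": 60, "independence": 60},
    {"date": "12/4/06", "accuracy": 80, "independence": 60},
    {"date": "12/5/06", "accuracy": 80, "independence": 80},
]

overall_accuracy = mean(point["accuracy"] for point in data_chart)
overall_independence = mean(point["independence"] for point in data_chart)
print(f"Accuracy: {overall_accuracy:.0f}%  Independence: {overall_independence:.0f}%")
```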



Sometimes, It Seems Like This….



…It Could Be More Like This…

Diagram: Learning Standards linked to Entry Points




Scoring Criteria

Used to calculate the Performance Level:

  • Completeness of portfolio

  • Level of Complexity (difficulty of standards)

  • Demo of Skills and Concepts (accuracy)

  • Independence (cues/prompts/assistance)

    Plus,

  • Self-Evaluation (monitor, self-correct, reflect)

  • Generalization (varied instructional approaches)
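The criteria above can be thought of as the scores recorded for each assessed strand of the portfolio: the first four feed the performance-level calculation, while Self-Evaluation and Generalization are scored in addition. The sketch below is a minimal illustration of such a record under assumed field names; it is not the Department’s data model.

```python
# Minimal sketch (assumed field names): the scoring dimensions recorded for one
# portfolio strand. The first four feed the performance-level calculation;
# Self-Evaluation and Generalization are scored in addition to them.
from dataclasses import dataclass

@dataclass
class StrandScores:
    complete: bool            # Completeness of portfolio evidence
    level_of_complexity: int  # difficulty of the standards addressed
    demo_of_skills: int       # accuracy of the student's responses
    independence: int         # degree of cues/prompts/assistance needed
    self_evaluation: int      # monitoring, self-correcting, reflecting
    generalization: int       # performance across varied instructional approaches
```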




MCAS-Alt Scoring Rubric: Demonstration of Skills and Concepts

How accurate were the student’s responses?




MCAS-Alt Scoring Rubric: Independence

To what degree were prompts used? How independent were the student’s responses?
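The rubric tables themselves are not reproduced in this transcript, but the score combination tables later in the deck show four percentage bands for Independence. A minimal sketch of that band-to-score mapping follows; applying the same bands to accuracy (Demonstration of Skills and Concepts) is an assumption, and the function name is illustrative.

```python
def rubric_score(percent: float) -> int:
    """Map a percentage to a 1-4 rubric score using the bands shown in the
    score combination tables (0-25, 26-50, 51-75, 76-100). Illustrative only;
    the band boundaries and their use for accuracy are assumptions."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    if percent <= 25:
        return 1
    if percent <= 50:
        return 2
    if percent <= 75:
        return 3
    return 4
```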




Setting Performance Levels

  • Use score combinations to describe characteristics of student’s performance: Reasoned Judgment

Example:

LC=3, DSC=4, Ind=3 shows the student’s performance is primarily accurate and independent, although below expectations for grade level.

Example:

LC=3, DSC=2, Ind=2 shows the student’s performance is limited/inconsistent and the student requires frequent prompting/assistance.

(LC = Level of Complexity; DSC = Demonstration of Skills and Concepts, i.e., accuracy; Ind = Independence)



Score Combination Tables

  • Level of Complexity = 2

  • Level of Complexity = 3

Score combination grid (rows: Independence score and band; columns: Demo of Skills score):

                      Demo of Skills
  Independence         1    2    3    4
  1 (0–25%)            Aw   Aw   Aw   Aw
  2 (26–50%)           Aw   Aw   Em   Em
  3 (51–75%)           Aw   Em   Pg   Pg
  4 (76–100%)          Aw   Em   Pg   Pg

  Aw = Awareness, Em = Emerging, Pg = Progressing
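Read as a lookup, the grid maps an Independence score and a Demonstration of Skills score to a performance level. The sketch below simply transcribes the grid above into Python as an illustration; it is not the Department’s scoring software, and it does not distinguish between the LC=2 and LC=3 tables.

```python
# Illustrative transcription of the grid above; not the official MCAS-Alt scoring rules.
PERFORMANCE_LEVEL = {
    # independence score -> {demo-of-skills score -> performance level}
    1: {1: "Awareness", 2: "Awareness", 3: "Awareness",   4: "Awareness"},
    2: {1: "Awareness", 2: "Awareness", 3: "Emerging",    4: "Emerging"},
    3: {1: "Awareness", 2: "Emerging",  3: "Progressing", 4: "Progressing"},
    4: {1: "Awareness", 2: "Emerging",  3: "Progressing", 4: "Progressing"},
}

def performance_level(independence: int, demo_of_skills: int) -> str:
    """Look up the performance level for one score combination (scores 1-4)."""
    return PERFORMANCE_LEVEL[independence][demo_of_skills]

# Example: mostly accurate (4) and fairly independent (3) work -> "Progressing"
print(performance_level(independence=3, demo_of_skills=4))
```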



Score Combination Tables (continued)

  • Level of Complexity = 4

  • Level of Complexity = 5

(Tables again indexed by Demo of Skills and Independence scores, as above.)




Technical Validity and Reliability: Some Tricky Areas for MCAS-Alt

  • “Test item inter-relationship”

    • But, tasks are selected and/or designed by teachers, and

    • There is little standardization across portfolios

  • “Assessment reflects full range of content standards”

    • But non-regulatory guidance says these students won’t necessarily access all the standards, and

    • Portfolios cannot cover all the standards, only those that were taught

  • Validate that targeted skills shown in the evidence are based on grade-level content standards

    • Is an external alignment study necessary?

  • “Reliability of scores” when responses are so diverse

  • One purpose of MCAS-Alt: Instructional improvement

    • How to document that this occurred?




“Did the MCAS-Alt Meet Its Intended Purposes?”

  • Tell our story:

    • Did the assessment do what we said it would do?

    • If not, how did we fix it?

      This criterion allowed us to document…

  • Whether the student was provided access to curriculum

  • Whether new, challenging skills were taught

  • How well the student learned new skills, concepts, and content

  • Whether teaching and learning improved as a result of MCAS-Alt




Document What Happened: Validating the Development Process

  • We tried to get the right people at the table

  • We carefully documented all decisions:

    • Determining the purpose(s) of the alternate assessment

    • Deciding what we want to measure (scoring rubric)

    • Describing the student’s performance (descriptors)

    • Calculating a score (scoring rules)

    • Translating scores into performance levels (standard setting)

    • Deciding where one performance level ends and another begins (cut scores)

    • Aligning content and validating the alignment

    • Continuous improvements to the system




Who Contributed to the Validation Process?

  • Curriculum Framework writers served on panels to develop the Resource Guide to the Frameworks for Students with Disabilities

    • Content specialists defined the “essence” of standards and “entry points” at various levels of complexity

  • Special educators pushed them to go lower

  • Diverse stakeholders shared their perspectives

  • Technical advisors helped set performance standards, using reasoned judgment of each “score combination”

  • Contractors told us what others had tried, and what might work

  • Scorers linked the portfolio evidence to the required standard using the Resource Guide, with 94% inter-rater consistency (IRC)




Resources

MA Department of Education (781-338-3625)

  • Dan Wiener – [email protected]

  • MCAS-Alt Website: www.doe.mass.edu/mcas/alt




MCAS-Alt: The Evolution of a Validity Argument

Charles A. DePascale

National Center for the Improvement of Educational Assessment

University of Maryland – Alternate Assessment Conference

October 11-12, 2007



The Evolution of a Validity Argument

  • Defining the purposes of the assessment

    • Identifying the multiple uses of the assessment and the populations of students

    • Specifying the inferences that would be supported by the assessment

    • Determining that one “set of rules” and procedures would not be sufficient




The Evolution of a Validity Argument

  • Designing the system

    • Building checks and balances into the system

    • Documentation:

      • Understanding the extent to which documentation is the system

      • Understanding the importance of documentation of the system




The Evolution of a Validity Argument

  • Flexibility and Standardization (Gong & Marion, 2006)

    • Making decisions about where to be flexible and where it is necessary to standardize.

  • Making adjustments to enhance validity

    • Adopting a continual improvement approach

    • Determining when and how to make changes to improve the system.


