Presentation Transcript

Consequential Validity

Mountain Plains Regional Resource Center

Teleconference Series

May 30, 2007

Elizabeth Towles-Reeves

Jacqui Kearns

National Alternate Assessment Center (NAAC)


Validity Should be Central

  • We argue that the purpose of the technical documentation is to provide data to support or refute the validity of the inferences from the alternate assessments at both the student and program level.

  • Following Kane, we view the validity evaluation as building an argument (e.g., a legal or philosophical argument) to support or refute the inferences drawn from AA-AAS scores.


The Challenge of Documenting Technical Quality of Alternate Assessments

  • heterogeneity of the group of students being assessed and how they demonstrate knowledge and skills;

  • often “flexible” assessment experiences;

  • relatively small numbers of students/tests;

  • evolving views/acceptance of academic curriculum (i.e., learning experiences);

  • the high degree of involvement of the teacher/assessor in administering the assessment (Gong & Marion, 2006);

  • non-traditional assessment approaches (e.g., portfolios, performance tasks/events), which require an expanded conceptualization of technical quality to be evaluated appropriately (Linn, Baker, & Dunbar, 1991).


Expanding Technical Quality

  • Linn et al. (1991) pointed out (15 years ago!) that we already had the theoretical tools for expanding validity investigations, but that in practice validity is usually viewed too narrowly.

    • Content frameworks are described, and specifications for the selection of items are provided for standardized achievement tests. Correlations with other tests and sometimes with teacher assessments of achievement may also be presented. Such information is relevant to judgments of validity but does not do justice to the concept (p. 16).

  • We argue that AA-AAS technical evaluations are suffering the same fate.


Shepard (1993)

  • Shepard (1993) advocated a straightforward means to prioritize validity questions. Using an evaluation framework, she proposed that validity studies be organized in response to the questions:

    • What does the testing practice claim to do?

    • What are the arguments for and against the intended aims of the test?

    • What does the test do in the system other than what it claims, for good or bad? (Shepard, 1993, p. 429)

  • The questions are directed to concerns about the construct, relevance, interpretation, and social consequences, respectively.

  • We believe that this approach for prioritizing our questions is useful.


The Assessment Triangle and Validity Evaluation

[Figure: the assessment triangle, with vertices Cognition, Observation, and Interpretation, embedded within the validity evaluation]

  • VALIDITY EVALUATION

    • Empirical Evidence

    • Theory and Logic (argument)

    • Consequential Features

  • INTERPRETATION

    • Reporting

    • Alignment

    • Item Analysis/DIF/Bias

    • Measurement Error

    • Scaling and Equating

    • Standard Setting

  • OBSERVATION

    • Assessment System

    • Test Development

    • Administration

    • Scoring

  • COGNITION

    • Student Population

    • Academic Content

    • Theory of Learning

Questions

  • After reviewing the materials on the website from the Inclusive Assessment Seminars, what questions do you have before we begin looking at the consequential validity features of AA-AAS systems?


What is Consequential Validity?

  • Messick (1989) originally introduced consequences to the validity argument. Later, Shepard (1993, 1997) broadened the definition by arguing one must investigate both positive/negative and intended/unintended consequences of score-based inferences to properly evaluate the validity of the assessment system.


So What?

  • There is overwhelming support for answering the “So What” question (Haertel, 1999; Kane, 2002; Kleinert et al., 2001; Lane & Stone, 2002; Shepard, 1997), but at the same time differing stakeholder views must be included to present a convincing validity argument (Lane & Stone, 2002; Linn, 1998; Ryan, 2002).


Intended Consequences

  • Lane and Stone (2002) suggest that state assessments are intended to impact:

    • Student, teacher, and administrator motivation and effort;

    • Curriculum and instructional content and strategies;

    • Content and format of classroom assessments;

    • Improved learning for all students;

    • Professional development support;

    • Use and nature of test preparation activities; and

    • Student, teacher, administrator, and public awareness and beliefs about the assessment, criteria for judging performance, and the use of assessment results.


Unintended Consequences

  • Lane and Stone (2002) note, however, that unintended consequences are also possible, such as:

    • Narrowing of curriculum and instruction to focus only on the specific learning outcomes assessed;

    • Use of test preparation materials that are closely linked to the assessment without making changes to the curriculum and instruction;

    • Use of unethical test preparation materials; and

    • Inappropriate use of test scores by administrators.


Consequential Validity Evaluation Questions

  • Before you consider investigating any consequential validity questions for your alternate assessment judged against alternate achievement standards (AA-AAS), you must determine:

    • What is the purpose of the AA-AAS?

    • How will the scores of the AA-AAS be used?

    • What stakeholders are important to helping you understand the consequences of the AA-AAS: students, parents, teachers, administrators, community members, experts?


Consequential Validity Evaluation Questions

  • Once you determine purpose and use, you may then ask:

    • What are the intended and unintended consequences based on the purpose and use of the AA-AAS?

    • Are the intended and unintended consequences positive or negative?


Looking to our Past to Prepare for the Future

  • Research on the consequential validity of alternate assessments from the perspective of:

    • Students/Parents

      • Research Questions:

        • What benefits to students have accrued from participation in the AA-AAS?

        • What is the extent to which students have accessed the general education curriculum?

        • What is the impact of the AA-AAS on students’ IEP development?

        • What is the relationship between student performance in AA-AAS and post-school life outcomes?

        • What student, teacher, and instructional variables influence parents’ perceptions regarding the AA-AAS?


Looking to our Past to Prepare for the Future

  • Research on the consequential validity of alternate assessments from the perspective of:

    • Teachers

      • Research Questions:

        • What benefits to teachers have accrued from the participation of students in the AA-AAS?

        • What is the extent to which alternate assessments are a part of daily classroom routine?

        • What is the relationship between alternate assessment scores and the amount of time spent working on the assessment?

        • To what extent do teacher and instructional variables predict alternate assessment scores?

        • Which student, teacher, and instructional variables influence teachers’ perceptions regarding the AA-AAS?

        • What is the impact of the AA-AAS on teachers’ daily instruction?


Looking to our Past to Prepare for the Future

  • Research on the consequential validity of alternate assessments from the perspective of:

    • School

      • Research Questions:

        • To what extent are students included in the accountability process?

        • Is there any relationship between student performance in the AA-AAS and student performance in the general assessment?


Prioritization

  • No state can take on all of these research studies at once.

  • How might you go about prioritizing studies?

    • Gather stakeholders

    • Work through guided discussion regarding what studies may be important to conduct based on stakeholder input

    • Prioritize the top 2 studies for the short-term (next 2-3 years) and then prioritize the top 2 studies for the long-term (next 3-5 years)


Questions

  • Any questions or clarifications?


Questions

  • Have you gathered a stakeholder group to think about and prioritize consequential validity questions for the short-term and long-term? If so, who was involved? If not, what stakeholders do you think will be important to have at the table?

  • What consequential validity studies have you performed in your state (i.e., looking at consequences related to students, teachers, and schools)?

  • What studies would you like to perform but are unsure as to how to gather the data or conduct the study?

