
Revision of the Standards for Educational and Psychological Testing: Overview

Society for Industrial and Organizational Psychology

25th Annual Conference, Atlanta, Georgia

April 9, 2010

Jo-Ida Hansen

University of Minnesota

Presentation: Four Substantive Areas
  • Access – Nancy Tippins
  • Accountability – Laurie Wise
  • Technology – Fritz Drasgow
  • Workplace – Paul Sackett

Joint Committee Members
  • Barbara Plake, Co-Chair
  • Lauress Wise, HumRRO, Co-Chair
  • Linda Cook, ETS
  • Fritz Drasgow, University of Illinois
  • Brian Gong, NCIEA
  • Laura Hamilton, Rand Corporation
  • Jo-Ida Hansen, University of Minnesota
  • Joan Herman, UCLA

Joint Committee Members
  • Michael Kane, Bar Examiners
  • Michael Kolen, University of Iowa
  • Antonio Puente, UNC-Wilmington
  • Paul Sackett, University of Minnesota
  • Nancy Tippins, Valtera Corporation
  • Walter (Denny) Way, Pearson
  • Frank Worrell, University of California, Berkeley

Scope of Revision
  • Based on comments each organization received from the invitation to comment
  • Summarized by the Management Committee in consultation with the Co-Chairs
    • Wayne Camara, Chair, APA
    • Suzanne Lane, AERA
    • David Frisbie, NCME

Four Substantive Areas for Revisions
  • Technology
  • Accountability
  • Workplace
  • Access

Plus attention to format issues

Theme Teams
  • Working teams
  • Cross team collaborations
  • Chapter Leaders
  • Focusing on bringing content related to the themes into the chapters in coherent and meaningful ways

Timeline
  • First meeting: January 2009
  • Projected 4 meetings per year
  • Three year process for completing text of revision
  • Open comment/Organization reviews
    • Projected for December 2010 – April 2011
  • Projected publication: Summer 2012


Revising our Test Standards: Access for All Examinee Populations

Society for Industrial and Organizational Psychology

25th Annual Conference, Atlanta, Georgia

April 9, 2010

Nancy Tippins

Valtera

Overview
  • Standards related to Access appear throughout many of the chapters but are concentrated in
    • Chapter 9: Testing Individuals of Diverse Linguistic Backgrounds
    • Chapter 10: Testing Individuals with Disabilities
  • Comments on Access were received by the Management Committee and summarized in the charge to the Joint Committee

Elements of the Charge
  • Accommodations/modifications
    • Impact/differentiation of accommodation and modification
    • Appropriateness for English language learners and examinees with disabilities
    • Appropriateness for a variety of groups, e.g., pre-K, older populations
    • Flagging
    • Comparability/validity
  • Adequacy and comparability of translations
  • Universal Design

Key Access Issues Included in our Charge - 1
  • Impact/differentiation of accommodations/modifications
    • What are the appropriate ways to determine or establish the impact of accommodations/modifications on inferences, interpretations, and uses of scores?
    • How do you differentiate clearly between what is an accommodation and what is a modification?

Key Access Issues Included in our Charge - 2
  • Appropriate ways to accommodate English-language learners and examinees with disabilities
    • Selecting the appropriate accommodation for the individual
      • Who should select the accommodation?
      • What evidence should the selection be based on?
    • Administering the appropriate accommodation
      • What evidence is available to determine impact on test scores, given purpose of the test?
      • How effective is the accommodation?
    • Providing alternative assessments/modified achievement standards

Key Access Issues Included in our Charge - 3
  • Appropriate ways to accommodate a wider variety of groups
    • Pre-K
    • Older populations
      • Number of older adults with cognitive impairments is rising
      • Testing is often used to determine mental status changes
      • There are many complexities associated with testing this population
        • Combined effects of medical problems, medication side effects, multiple sensory deficits, testing environment

Key Access Issues Included in our Charge - 4
  • Flagging
    • Current treatment needs to be updated to reflect changes in practice since 1999 standards
    • Most testing organizations no longer flag
    • Decisions about flagging should be based on empirical evidence

Key Access Issues Included in our Charge - 5
  • Comparability and validity of inferences made based on scores from accommodated or modified tests
    • Foundational issues such as comparability and validity need to be addressed in foundational chapters
    • If sample sizes do not support analyses such as differential item functioning (DIF), other evidence of validity should be pursued
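
For readers unfamiliar with DIF, the sketch below illustrates one common item-screening statistic, the Mantel-Haenszel odds ratio, computed by stratifying examinees on total score. It is an illustration only; the data layout, group labels, and function name are assumptions, and nothing here is prescribed by the Standards or by this presentation.

```python
# Illustrative sketch: Mantel-Haenszel DIF screening for one dichotomous item,
# stratifying on total test score. Group labels and thresholds are assumptions.
import math
from collections import defaultdict

def mantel_haenszel_dif(item_correct, total_score, group):
    """Estimate the Mantel-Haenszel common odds ratio for one item.

    item_correct: iterable of 0/1 responses to the studied item
    total_score:  iterable of total test scores (the stratifying variable)
    group:        iterable of 'ref' / 'focal' labels
    Returns (odds ratio, ETS delta) or None if the data are too sparse.
    """
    # Within each total-score stratum, count correct/incorrect by group.
    strata = defaultdict(lambda: {"ref": [0, 0], "focal": [0, 0]})
    for y, s, g in zip(item_correct, total_score, group):
        strata[s][g][0 if y == 1 else 1] += 1

    num = den = 0.0
    for cells in strata.values():
        a, b = cells["ref"]    # reference group: correct, incorrect
        c, d = cells["focal"]  # focal group: correct, incorrect
        n = a + b + c + d
        if n > 0:
            num += a * d / n
            den += b * c / n
    if num == 0.0 or den == 0.0:
        return None  # too sparse to estimate in this simple illustration
    alpha_mh = num / den
    delta_mh = -2.35 * math.log(alpha_mh)  # ETS delta metric
    return alpha_mh, delta_mh
```

With only a handful of examinees per score stratum, the cell counts and the resulting odds-ratio estimate become unstable, which is why the charge points toward other sources of validity evidence when samples are small.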

Key Access Issues Included in our Charge - 6
  • Adequacy and comparability of translations (language to language and language to symbol, e.g., Braille)
    • Evidence is needed to demonstrate adequacy of translation and comparability of scores from translated tests
    • Fluency, rather than primary language, should be used to describe the target population for a test
    • Quality of translation/adaptation needs to be emphasized
    • Interaction of language proficiency and construct needs to be considered

Key Access Issues Included in our Charge - 7
  • Universal Design
    • The 1999 Standards focus too much on accommodations and modifications and not enough on building accessibility features into the design and development process


Revising our Test Standards: Issues for Accountability

Society for Industrial and Organizational Psychology

25th Annual Conference, Atlanta, Georgia

April 9, 2010

Laurie Wise

HumRRO

Overview
  • There has been a dramatic expansion of the use of tests for various forms of accountability and other uses related to educational policy-setting.
  • The Joint Committee has been charged with considering how these uses in accountability should impact revisions to the Standards
  • As with the other themes, comments on the standards that related to accountability were compiled by the Management Committee and summarized in their charge to the Joint Committee

Overview
  • Standards related to accountability currently are especially relevant to Chapter 13 (Educational Testing and Assessment) and Chapter 15 (Testing in Program Evaluation and Public Policy)
  • Examples of emerging issues associated with use of tests for accountability
    • Test results have important consequences for third parties such as school administrators and teachers, although not always for the examinees themselves.
    • Federal peer review procedures have required assurances of reliability and validity that often go beyond requirements of the current Standards.

Key Accountability Topics Included in our Charge
  • Validity and reliability requirements
  • Issues with scores, scaling, and equating
  • Policy and practice
  • Formative and interim assessments

1. Validity, Reliability and Reporting Issues for Accountability
  • Use of a single test as the sole source of high-stakes decisions (e.g., graduation, promotion), including whether scores from retesting or repeat testing are sufficient to provide more than one score for such decisions.
  • How test alignment studies should be documented and used to demonstrate the validity of score interpretations regarding mastery of required content standards.

1. Validity, Reliability, and Reporting Issues - continued
  • Provide additional guidance on score accuracy, especially when used to classify individuals or groups into performance regions or other bands on a score scale.
  • Validity and reliability requirements for reporting individual or aggregate performance on subscales (skills or diagnostics).
  • Incorporating error estimates and interpretive guidance in score reports, including subscores and diagnostic reports for individuals and groups.
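
As a concrete example of the kind of error estimate and classification caution referred to in the bullets above, the sketch below computes a classical standard error of measurement (SEM) from a reliability coefficient and attaches an approximate 95% band to a reported score. The reliability, standard deviation, and cut score are invented values for illustration; the Standards do not prescribe this particular procedure.

```python
# Illustrative sketch (invented values): classical SEM and an approximate
# 95% band for a reported scale score, compared against a cut score.
import math

def sem(sd, reliability):
    """Classical standard error of measurement: SD * sqrt(1 - reliability)."""
    return sd * math.sqrt(1.0 - reliability)

def score_band(observed, sd, reliability, z=1.96):
    """Approximate 95% band around an observed score."""
    e = sem(sd, reliability)
    return observed - z * e, observed + z * e

# Invented reporting example: scale SD = 15, reliability = 0.90,
# observed score = 108, performance-level cut score = 110.
low, high = score_band(108, sd=15, reliability=0.90)
print(f"Score 108, approximate 95% band: {low:.1f} to {high:.1f}")
print("Band straddles the cut score of 110 -> classification is uncertain"
      if low < 110 < high else "Classification is relatively clear")
```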

2. Issues with Scores, Scaling, and Equating
  • Growth modeling, gain scores, and other methods of estimating the value added by teachers and schools.
  • Issues or requirements when linking different assessments (e.g., concordances, linkages and equating)
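
For orientation only, the sketch below shows the simplest of the linking procedures named above: linear equating of a form X score onto the form Y scale using single-group means and standard deviations. Operational concordance and equating work involves data-collection designs, smoothing, and error estimation well beyond this; the data and function name are invented for illustration.

```python
# Illustrative sketch (invented data): linear equating of a form X score
# onto the form Y scale from single-group summary statistics.
from statistics import mean, stdev

def linear_equate(x_score, x_scores, y_scores):
    """Linear equating: e(x) = mean(Y) + (sd(Y)/sd(X)) * (x - mean(X))."""
    mu_x, mu_y = mean(x_scores), mean(y_scores)
    sd_x, sd_y = stdev(x_scores), stdev(y_scores)
    return mu_y + (sd_y / sd_x) * (x_score - mu_x)

# Invented single-group data: the same examinees took both forms.
form_x = [42, 51, 38, 60, 47, 55, 49, 44]
form_y = [45, 53, 40, 63, 50, 58, 52, 47]
print(round(linear_equate(50, form_x, form_y), 1))  # a form X score of 50 on the Y scale
```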

3. Policy and Practice
  • How to balance privacy concerns for individual examinees, teachers, and administrators while meeting information needs for policy-makers.
  • Issues related to the appropriate role of practice and test preparation, especially in contrast to admissions testing or credentialing.

4. Addressing formative and interim assessments
  • Schools are increasingly developing or purchasing interim or formative assessments to identify learning problems well before the end-of-year summative assessments
  • Some issues:
    • Appropriate uses of such tests
    • Validity evidence required for interpreting scores
      • As mastery
      • As predictions
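
As one small example of validity evidence for the "scores as predictions" interpretation, the sketch below correlates fall interim scores with end-of-year summative scores for the same (invented) students; the variable names and data are assumptions for illustration, not a requirement drawn from the Standards or this presentation.

```python
# Illustrative sketch (invented data): interim-to-summative correlation as one
# piece of evidence that interim scores can be interpreted as predictions.
import math

def pearson_r(x, y):
    """Pearson correlation between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(var_x * var_y)

interim_fall = [412, 455, 430, 470, 401, 444, 465, 420, 438, 459]
summative_spring = [430, 470, 452, 488, 415, 460, 483, 441, 455, 472]
print(f"Interim-to-summative correlation: {pearson_r(interim_fall, summative_spring):.2f}")
```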


Revising our Test Standards: Technological Advances

Society for Industrial and Organizational Psychology

25th Annual Conference, Atlanta, Georgia

April 9, 2010

Fritz Drasgow

University of Illinois

Overview
  • Technological advances are changing the way tests are delivered, scored, and interpreted, and in some cases the nature of the tests themselves
  • The Joint Committee was charged with considering the implications of technological advances for the Standards
  • As with the other themes, comments on the standards that related to technology were compiled by the Management Committee and summarized in their charge to the Joint Committee

Key Technology Issues Included in our Charge
  • Reliability & validity of innovative item formats
  • Validity issues associated with the use of:
    • Automated scoring algorithms
    • Automated score reports and interpretations
  • Security issues for tests delivered over the internet
  • Issues with web-accessible data, including data warehousing

Resources for Consideration
  • Guidelines for Computer-Based Testing, Copyright 2002 Association of Test Publishers (ATP)
  • International Guidelines on Computer-Based and Internet Delivered Testing, Copyright 2005 International Test Commission (ITC)

Reliability & Validity of Innovative Item Formats
  • What special issues exist for innovative items with respect to access for various groups? How might the standards reflect these issues?
  • What steps should the standards suggest with regard to the “usability” of possibly unfamiliar innovative items?

Automated Scoring Algorithms
  • What level of documentation/disclosure is appropriate and tolerable for proprietary (i.e., secret) automated scoring algorithms?
  • What sorts of evidence seem most important for demonstrating the validity and “reliability” of automated scoring systems?
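
One kind of evidence often cited for automated scoring is agreement with trained human raters. The sketch below computes exact agreement and a quadratic weighted kappa for paired essay scores; the data and the choice of these particular indices are assumptions for illustration, not something the presentation or the Standards mandates.

```python
# Illustrative sketch (invented data): agreement between automated and human
# essay scores via exact agreement and quadratic weighted kappa.

def quadratic_weighted_kappa(rater_a, rater_b, min_rating, max_rating):
    """Quadratic weighted kappa for two integer rating vectors on the same scale."""
    k = max_rating - min_rating + 1
    # Observed joint frequency table
    observed = [[0] * k for _ in range(k)]
    for a, b in zip(rater_a, rater_b):
        observed[a - min_rating][b - min_rating] += 1
    n = len(rater_a)
    # Expected frequencies from the two marginal distributions
    hist_a = [sum(row) for row in observed]
    hist_b = [sum(observed[i][j] for i in range(k)) for j in range(k)]
    num = den = 0.0
    for i in range(k):
        for j in range(k):
            weight = (i - j) ** 2 / (k - 1) ** 2
            num += weight * observed[i][j]
            den += weight * hist_a[i] * hist_b[j] / n
    return 1.0 - num / den

# Invented paired scores on a 1-6 essay rubric
human   = [3, 4, 2, 5, 4, 3, 6, 2, 4, 5]
machine = [3, 4, 3, 5, 4, 3, 5, 2, 4, 4]
exact = sum(h == m for h, m in zip(human, machine)) / len(human)
print(f"Exact agreement: {exact:.2f}")
print(f"Quadratic weighted kappa: {quadratic_weighted_kappa(human, machine, 1, 6):.2f}")
```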

Automated Score Reports and Interpretation
  • Use of computers for score interpretations
  • “Actionable” reports (e.g., routing students and teachers to instructional materials and lesson plans based on test results)

Security Issues for Tests Delivered over the Internet
  • Issues include:
    • Protecting examinee privacy
    • Threats to validity due to breach of security
    • Are the reported scores correct?
  • Considerations likely to affect standards related to test administration and responsibilities of test users

Web-Accessible Data, including Data Warehousing
  • Applicability of general technology standards?
    • Security
    • IT standards similar to ISO
  • Revision to commentary vs. drafting additional standards


Revising our Test Standards: Issues for Workplace Testing

Society for Industrial and Organizational Psychology

25th Annual Conference, Atlanta, Georgia

April 9, 2010

Paul Sackett

University of Minnesota

Overview
  • Standards for testing in the workplace are currently covered in Chapter 14 (one of the testing application chapters)
  • Workplace testing includes employment testing as well as licensure, certification, and promotion testing.
  • Comments on standards related to workplace testing were received by the Management Committee and summarized in their charge to the Joint Committee.

Key Workplace Testing Issues Included in our Charge
  • Validity and reliability requirements for certification and licensure tests.
  • Issues when tests are administered only to small populations of job incumbents.
  • Requirements for tests for new, innovative job positions that do not have incumbents or job history to provide validity evidence.
  • Assuring access to licensure and certification tests for examinees with disabilities that may limit participation in regular testing sessions.
  • Differential requirements for certification and licensure and employment tests.

1. Validity and Reliability Requirements
  • Some specific issues:
    • Documenting and communicating the validity and reliability of pass-fail decisions in addition to the underlying scores
    • How cut-offs are determined (see the sketch after this list)
    • How validity and reliability information is communicated to relevant stakeholders
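
For readers unfamiliar with how cut-offs are set, the sketch below shows the arithmetic behind one widely used approach, a minimal Angoff-style computation: each panelist estimates, for every item, the probability that a minimally qualified candidate answers correctly, and the cut score is the mean of the panelists' summed estimates. The ratings are invented, and real standard setting adds panelist training, feedback rounds, and documentation far beyond this sketch.

```python
# Illustrative sketch (invented ratings): a minimal Angoff-style cut-score computation.
panelist_ratings = {
    "panelist_1": [0.70, 0.55, 0.80, 0.60, 0.45],
    "panelist_2": [0.65, 0.60, 0.75, 0.55, 0.50],
    "panelist_3": [0.75, 0.50, 0.85, 0.65, 0.40],
}

# Each panelist's implied cut score is the sum of their item estimates.
panelist_cuts = {p: sum(r) for p, r in panelist_ratings.items()}
recommended_cut = sum(panelist_cuts.values()) / len(panelist_cuts)

print(panelist_cuts)
print(f"Recommended raw cut score: {recommended_cut:.2f} out of 5")
```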

2. Issues with Small Examinee Populations
  • Including:
    • Alternatives to statistical tools for item screening
      • Assuring fairness
      • Assuring technical accuracy
    • Alternatives to empirical validity evidence
    • Maintaining comparability of scores from different test forms

3. Requirements for New Jobs
  • Issues include:
    • Identifying test content
    • Establishing passing scores
    • Assessing reliability
    • Demonstrating validity

4. Assuring Access to Employment Testing
  • See also separate presentation on fairness
  • Issues include:
    • Determining appropriate versus inappropriate accommodations
    • Relating testing accommodations to accommodations available in the workplace

5. Certification and Licensure versus Employment Testing
  • Currently, two sections in the same chapter
  • Examples of relevant issues:
    • Differences in how test content is identified
    • Differences in validation strategies
    • Differences in test score use
    • Who oversees testing:
      • Private company versus professional board/organization

Questions or Comments?
