Making Data Work for Title I Directors

Using Achievement Data to Effectively Inform Both Targeted Assistance and Schoolwide Programs

9th Annual Title Programs Conference

June 14 – 17, 2011

Agenda
  • Title I Evaluation Requirements
  • Purpose of Evaluation
  • Understanding Assessment
  • Detecting Differences
  • Measuring Outcomes
  • Relationships Between Variables
  • Practical Applications for Schoolwide and Targeted Assistance Programs
  • Questions
Federal Government Program Assessment
  • Program Assessment Rating Tool (PART)*
    • 25 questions to measure effectiveness of all federally funded programs on the following:
      • Clarity of purpose and design
      • Strategic planning
      • Management
      • Results
    • Classifies programs as Effective, Moderately Effective, Adequate, or Ineffective

*Information obtained from ExpectMore.gov

Federal Government Program Assessment

*Data obtained from ExpectMore.gov

Federal Government Program Assessment
  • Title I Results:
    • Moderately Effective*
      • Ambitious goals
      • Well-managed
      • Likely need to improve efficiency or address other problems in program’s design or management
    • http://www.whitehouse.gov/omb/expectmore/summary/10003320.2006.html

*Information obtained from ExpectMore.gov

Title I Data and Assessment Requirements
  • Comprehensive Needs Assessment
    • Must be based on academic achievement information about all students in the school, including:
      • Racial and ethnic groups
      • Children with disabilities
      • Children with limited English proficiency
      • Children with low incomes
      • Migrant students
    • Information must reflect achievement relative to the state standards
Title I Data and Assessment Requirements
  • Comprehensive Needs Assessment
    • Help the school understand the subjects and skills for which teaching and learning need to be improved
    • Identify specific academic needs of students and groups of students who are not yet meeting state standards
Title I Data and Assessment Requirements
  • Evaluation
    • Schools operating schoolwide programs must annually evaluate the implementation of and results achieved by Title I programs using data from state assessments and other indicators of academic achievement.
    • Determine whether the program has increased achievement of students, particularly students who had been farthest from achieving standards
    • Results should guide any revisions to the program plan
Title I Data and Assessment Requirements
  • Evaluation of Program Components:
    • Schoolwide reform strategies
    • Instruction by highly qualified teachers
    • Parental Involvement
    • Additional support
    • Transition plans
Purpose of Evaluation
  • Accountability
  • Objectivity
  • Comparability
  • Decision-Making tool
  • Allocation of resources
  • Determine areas of need
  • Measure progress
Understanding Assessment
  • Operationally defining constructs of interest
    • Academic Achievement
    • Intelligence
    • Attitudes toward school
    • Parental Involvement
    • Attendance
    • Disabilities/Learning Disorders
    • Language ability
    • Access to services
Understanding Assessment
  • Commonly used approaches to measurement:
    • Standardized tests
    • IQ tests
    • Surveys
    • Checklists
    • Rating scales
    • Structured interviews
    • Count variables
Understanding Assessment
  • Test Theory
    • Complete/Perfect measurement is not possible
    • Items are drawn from an infinite pool that represents complete information for any construct
    • Collection of items provides an observed score
  • Observed score = TRUE score + error
      • Goal of assessment: systematically minimize error to consistently detect meaningful measures that are as close to “True” score as possible
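The observed-score model above can be illustrated with a short simulation (a sketch using made-up numbers, not data from the presentation): any single test form misses the true score by a random error, but averaging repeated observations systematically minimizes that error.

```python
import random
import statistics

random.seed(1)

# Classical test theory: observed score = true score + error.
# Simulate one student with a fixed true score measured on 50
# hypothetical parallel test forms, each adding random error.
TRUE_SCORE = 143
observed = [TRUE_SCORE + random.gauss(0, 8) for _ in range(50)]

# The mean of many observed scores lands close to the true score,
# even though individual forms may miss by several points.
mean_error = abs(statistics.mean(observed) - TRUE_SCORE)
print(f"mean-of-50 error: {mean_error:.1f} points")
```

The standard error of the mean shrinks with the square root of the number of observations, which is why multiple measures are emphasized throughout this deck.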
Understanding Assessment
  • Test Theory
    • Items should discriminate; the Item Characteristic Curve (ICC) plots ability vs. probability of a correct response
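A minimal sketch of an ICC, assuming the common two-parameter logistic form (the specific parameterization is an assumption, not something stated on the slide):

```python
import math

def icc(ability, difficulty, discrimination=1.0):
    """Two-parameter logistic item characteristic curve:
    probability of a correct response given student ability."""
    return 1.0 / (1.0 + math.exp(-discrimination * (ability - difficulty)))

# A student whose ability equals the item's difficulty has a 50%
# chance of a correct response; probability rises with ability.
p_at_difficulty = icc(0.0, 0.0)
p_high_ability = icc(2.0, 0.0)

# Higher discrimination steepens the curve near the difficulty point,
# so the item separates nearby ability levels more sharply.
steep = icc(0.5, 0.0, discrimination=3.0)
shallow = icc(0.5, 0.0, discrimination=0.5)
```

Items with flat curves (low discrimination) contribute little information about where a student's ability actually lies.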
Understanding Assessment
  • Reliability
    • Degree of consistency of measurement
  • Validity
    • Degree of accuracy & representativeness of measurement
  • An instrument can have high reliability but low validity
  • A valid instrument must be reliable
Understanding Assessment
  • Sampling distributions
    • Each observed score is a sample statistic drawn from a distribution of multiple observations (actual or potential)
Understanding Assessment
  • Confidence intervals
    • More informative than a single point estimate for interpreting and communicating results
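A confidence interval for a mean can be sketched with the standard library (the scores below are hypothetical, and the z = 1.96 normal approximation is an assumption; a t critical value would be slightly wider for a sample this small):

```python
import math
import statistics

# Hypothetical sample of scaled scores from one classroom
scores = [812, 795, 840, 808, 823, 799, 831, 816, 804, 827]

n = len(scores)
mean = statistics.mean(scores)
se = statistics.stdev(scores) / math.sqrt(n)  # standard error of the mean

# 95% confidence interval using the normal approximation (z = 1.96)
lower, upper = mean - 1.96 * se, mean + 1.96 * se
print(f"mean = {mean:.1f}, 95% CI = ({lower:.1f}, {upper:.1f})")
```

Reporting the interval rather than the bare mean communicates how much the estimate could plausibly move under resampling.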
Understanding Assessment
  • Interpreting results
    • Multiple measures always best
    • Keep the goal of testing in mind
    • External contributing factors (testing conditions)
    • Population considerations (culture, language, etc.)
    • Consider subscales when reported
      • Provide data on specific domains of interest
      • Useful for focused evaluation
Understanding Assessment
  • Descriptive Statistics
    • Produce quantitative summaries of data
    • Describe a population or sample
    • Central tendency
    • Variability
    • Linearity/Non-linearity
  • Inferential Statistics
    • Hypothesis testing
    • Allow for prediction and for examining real-world relationships
    • T-test
    • ANOVA
    • Linear modeling
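The t-test named above can be sketched with the standard library; this uses Welch's t statistic on two hypothetical score groups (a sketch of the comparison, not a full test with degrees of freedom and a p-value):

```python
import math
import statistics

def welch_t(a, b):
    """Welch's t statistic: difference in group means scaled by the
    combined standard error (does not assume equal variances)."""
    se = math.sqrt(statistics.variance(a) / len(a)
                   + statistics.variance(b) / len(b))
    return (statistics.mean(a) - statistics.mean(b)) / se

# Hypothetical scores for two groups of students
group_a = [72, 75, 78, 80, 74, 77]
group_b = [85, 88, 84, 90, 86, 87]

t = welch_t(group_b, group_a)
# |t| well above ~2 suggests the mean difference is unlikely to be
# chance alone; identical groups give t = 0.
```

In practice a statistics package would also report the p-value, but the core idea is exactly this ratio of signal (mean difference) to noise (standard error).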
Detecting Differences
  • Why is it important to understand group differences?
    • Achievement gaps
    • Differences in availability and/or usability of resources
    • Title programs are aimed at reducing these gaps for academically at-risk students
  • Why is it important to use data to examine differences?
    • Objectivity
    • Consistency
    • Accuracy
    • Monitoring growth, improvement, changes over time
Detecting Differences
  • Group differences in evaluation terms
    • Basic differences in groups or populations
      • T-test for statistical significance (compare group means relative to within-group variability)
    • Differences in patterns and relationships
      • ANOVA, Linear tests (compare direction, association, and slope or rate of change in groups)
Detecting Differences
  • Using various data sources to determine meaningful differences
    • Multiple assessments, informants, data points over time, etc.
    • Provides a clear, overarching view
    • Compare measures to:
      • Validate your group comparisons
      • Negate or shed light on weaknesses from a single source
Detecting Differences
  • Disaggregate
    • Slicing data to see what the picture looks like for different subgroups hidden within an average or basic percentage
    • Break down statistics into smaller components based on groups you are interested in comparing
      • Schools within a district
      • Teachers within a school
      • Race/ethnicity
      • ELL status
      • Disability status
      • Free/Reduced lunch status
Detecting Differences
  • Disaggregate
    • Start with simple statistics: averages, percentages

Average Score: 143

  African American    121
  Native American     132
  Asian               154
  Latino              127
  White               160
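Disaggregation like the above can be sketched as a simple group-by over student records (the records below are hypothetical, chosen to reproduce subgroup averages similar to the slide's):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical student records: (subgroup, test score)
records = [
    ("African American", 119), ("African American", 123),
    ("Native American", 130), ("Native American", 134),
    ("Asian", 150), ("Asian", 158),
    ("Latino", 125), ("Latino", 129),
    ("White", 158), ("White", 162),
]

# Break the single overall average down into subgroup averages
by_group = defaultdict(list)
for group, score in records:
    by_group[group].append(score)

overall = mean(score for _, score in records)
group_means = {group: mean(scores) for group, scores in by_group.items()}
# The overall average hides subgroup gaps that disaggregation reveals.
```

Here the overall mean sits between subgroup means that differ by nearly 40 points, which is precisely the picture a single average conceals.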

Detecting Differences
  • Cross-Tabulate
    • Dicing data to further examine group differences; allows for multiple group categories to be considered at once
    • Useful for demonstrating how an education system advantages/disadvantages different groups of students, and how the situation might be improved
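Cross-tabulation can be sketched by counting outcomes over two grouping variables at once (the records and pass/fail values below are hypothetical):

```python
from collections import Counter

# Hypothetical records: (race/ethnicity, lunch status, passed math test?)
records = [
    ("White", "FRL", True), ("White", "FRL", False),
    ("White", "Non-FRL", True), ("White", "Non-FRL", True),
    ("Latino", "FRL", True), ("Latino", "FRL", False),
    ("Latino", "FRL", False), ("Latino", "Non-FRL", True),
]

# Each cell of the cross-tab is a count over two group categories at once
table = Counter((race, lunch, passed) for race, lunch, passed in records)

def pass_rate(race, lunch):
    passed = table[(race, lunch, True)]
    failed = table[(race, lunch, False)]
    return passed / (passed + failed)
```

Comparing cells (e.g., pass rates for FRL vs. Non-FRL students within each race/ethnicity) exposes differences that a single overall pass rate would hide.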
Detecting Differences

Percent Passing 8th Grade Math Test = 50%

Measuring Outcomes
  • Why is it important to measure outcomes?
    • Program evaluation
    • Improve student programs
    • Fiscal responsibility
  • Why is it important to use data to monitor outcomes?
    • Objectivity
    • Consistency
    • Accuracy
    • Monitoring growth, improvement, changes over time
Measuring Outcomes
  • Measuring outcomes in evaluation terms
    • Basic pre-post comparison
      • T-test
      • Antecedent monitoring and process evaluation
    • Comparison of relationships between outcomes
      • Linear models
      • Do outcomes vary for different groups or students?
    • Reporting outcomes
      • Raw score change
      • Percentage change
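The two reporting options above can be sketched with hypothetical matched pre- and post-test scores for the same students:

```python
from statistics import mean

# Hypothetical matched pre- and post-test scores (same students, in order)
pre = [640, 655, 648, 660]
post = [668, 672, 661, 690]

# Raw score change per student, then the average gain
raw_changes = [b - a for a, b in zip(pre, post)]
avg_raw_change = mean(raw_changes)

# Percentage change puts gains on a common scale across different tests
pct_changes = [100 * (b - a) / a for a, b in zip(pre, post)]
avg_pct_change = mean(pct_changes)
```

Raw change is easiest to explain; percentage change is more comparable when pre-test scores or test scales differ across groups.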
Measuring Outcomes
  • Longitudinal Data
    • Looking at data over a period of time (weeks, months, years)
    • Observing multiple time points can reveal important patterns that cannot be detected with only one or two measurements
Measuring Outcomes

Student Achievement on State Tests: A Longitudinal Analysis

Relationships Between Variables
  • Why is it important to understand relationships between variables?
    • Identify patterns
    • Knowledge about what to expect for students
    • Identify areas to target for improvement
  • How does data help us examine relationships between variables?
    • Measurements represent constructs of interest
    • Objective evidence to support claims and provide ideas
    • CANNOT identify causes of outcomes
Relationships Between Variables
  • Relationships in Evaluation Terms
    • Scatterplots
    • Track scores or outcomes across levels of another variable to uncover connections
    • Allow us to examine real-world connections, understand relationships, and form hypotheses
    • Allow us to visualize impact of cutoff scores and group criteria
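The strength of the pattern a scatterplot shows can be summarized with a correlation coefficient; a minimal sketch with hypothetical paired scores (the Pearson formula is standard, but these numbers are illustrative only):

```python
import math

def pearson_r(x, y):
    """Pearson correlation: strength and direction of a linear relationship."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical paired reading and math scores for the same students
reading = [790, 805, 812, 830, 845]
math_scores = [800, 810, 818, 829, 850]

r = pearson_r(reading, math_scores)
# r near +1 means points fall close to an upward-sloping line; a strong
# correlation still says nothing about cause and effect.
```

As the deck notes elsewhere, such relationships support hypotheses and targeting decisions, but they cannot by themselves identify causes of outcomes.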
Relationships Between Variables

Table/List Data

Scatterplot

Relationships Between Variables
  • Important considerations
    • Type of variables (scatterplots require continuous variables, not groups or categories)
    • Scales of variables
    • Cutoffs
    • Context of analysis, potential outcomes or decisions
    • Limitations
Practical Applications
  • Example 1: Combining Data Sources to Identify Group Differences for Hispanic Students
  • Strategy:
    • Is there an achievement gap for Hispanic students in my system? Is it wider or narrower than the gaps at the state or national level?
      • Analyze data from various sources to provide support/rationale for your Title I plan
        • Multiple levels (local, state, national)
        • Multiple measures (different assessments)
        • Over time where available
Practical Applications
  • Examine Longitudinal National Data
    • Large sample, over time
    • Trend in NAEP mathematics average scores for 9-year-old students, by race/ethnicity*

*Data obtained from NAEP Long Term Trend Data

Practical Applications
  • Trend in White – Hispanic NAEP mathematics average scores and score gaps for 9-year-old students*

*Data obtained from NAEP Long Term Trend Data

Practical Applications
  • State 4th Grade CRCT Scores – Percentage of Students at Each Performance Level: Comparison by Race/Ethnicity*
    • DNM
    • Meets
    • Exceeds

*Data obtained from GaDOE 2009-10 State Report Card

Practical Applications
  • Average Math ACT Score for H.S. Seniors by Subgroups at the State and National Levels
    • State
    • Nation

*Data obtained from GaDOE 2009-10 State Report Card

Practical Applications
  • LEA 4th Grade CRCT Scores – Percentage of Students at Each Performance Level: Comparison by Race/Ethnicity*
    • DNM
    • Meets
    • Exceeds

*Data obtained from GaDOE 2009-10 LEA Report Card

Practical Applications

School Level 2009-2010 AYP Academic Performance Data*

*Data obtained from GaDOE 2009-10 LEA Report Card

Practical Applications
  • Examine population level statistics when useful
  • Examine differences across multiple outcome measures
  • Examine differences across age groups
  • Consider group sizes when interpreting and comparing data
Practical Applications
  • Example 2: Monitoring Outcomes of Supplemental Educational Services (SES) for School Districts
  • Strategy: Do students participating in SES show greater improvement than Non-SES students? Does SES contribute to improved academic achievement?
    • Compare outcomes of SES students to Non-SES students
Practical Applications

Average Scaled Scores on Achievement Outcomes Statewide

*Achievement Outcome Measures include CRCT, EOCT, and/or GHSGT

Practical Applications

Percentages of Students Meeting or Exceeding Standards on Achievement Outcomes Statewide

*Achievement Outcome Measures include CRCT, EOCT, and/or GHSGT

Practical Applications

Percentages of Students Moving to Higher Performance Category on Achievement Outcomes Statewide

*Achievement Outcome Measures include CRCT, EOCT, and/or GHSGT

Practical Applications
  • Consider multiple dimensions and various relevant criteria when comparing data
  • Examine useful data embedded within larger data points
  • Consider group size
  • Consider growth over time where appropriate
Practical Applications
  • Example 3: Examining Achievement Data to Make Decisions About Title I Programs
  • Strategy: Is there a relationship between student level variables and achievement outcomes on state tests?
    • CRCT Reading Scores vs. CRCT Math Scores
Practical Applications
  • Important to consider cutoffs for evaluation
  • Grouping affects data patterns
  • Useful diagnostic tool to visualize outcomes and relationships that exist between student variables
Practical Applications
  • Data Resources
    • Annual AYP Reports
    • State Test Reports
    • Online Assessment System (OAS)
    • GaDOE System Report Cards
    • NAEP National Report Card Data
    • System-level Data
    • School-level Data
Contact Information

Jessica Johnson

Operations Analyst

Office of School Improvement

Georgia Department of Education

[email protected]

(404)657-9864

Thank you for attending this session!

Your feedback is valuable.

Please return your evaluation form.
