
Making Data Work for Title I Directors

Using Achievement Data to Effectively Inform Both Targeted Assistance and Schoolwide Programs

9th Annual Title Programs Conference

June 14 – 17, 2011


Agenda

  • Title I Evaluation Requirements

  • Purpose of Evaluation

  • Understanding Assessment

  • Detecting Differences

  • Measuring Outcomes

  • Relationships Between Variables

  • Practical Applications for Schoolwide and Targeted Assistance Programs

  • Questions


How Does Title I Measure Up?

Federal Government Program Assessment


Federal Government Program Assessment

  • Program Assessment Rating Tool (PART)*

    • 25 questions that measure the effectiveness of all federally funded programs in the following areas:

      • Clarity of purpose and design

      • Strategic planning

      • Management

      • Results

    • Classifies programs as Effective, Moderately Effective, Adequate, or Ineffective

*Information obtained from ExpectMore.gov


Federal Government Program Assessment

*Data obtained from ExpectMore.gov


Federal Government Program Assessment

  • Title I Results:

    • Moderately Effective*

      • Ambitious goals

      • Well-managed

      • Likely need to improve efficiency or address other problems in program’s design or management

    • http://www.whitehouse.gov/omb/expectmore/summary/10003320.2006.html

*Information obtained from ExpectMore.gov


Data and Assessment Requirements for Title I

Title I Evaluation Requirements


Title I Data and Assessment Requirements

  • Comprehensive Needs Assessment

    • Must be based on academic achievement information about all students in the school, including:

      • Racial and ethnic groups

      • Children with disabilities

      • Children with limited English proficiency

      • Children with low incomes

      • Migrant students

    • Information must reflect achievement relative to the state standards


Title I Data and Assessment Requirements

  • Comprehensive Needs Assessment

    • Help the school understand the subjects and skills for which teaching and learning need to be improved

    • Identify specific academic needs of students and groups of students who are not yet meeting state standards


Title I Data and Assessment Requirements

  • Evaluation

    • Schools operating schoolwide programs must annually evaluate the implementation of and results achieved by Title I programs using data from state assessments and other indicators of academic achievement.

    • Determine whether the program has increased the achievement of students, particularly those who had been farthest from meeting standards

    • Results should guide any revisions to the program plan


Title I Data and Assessment Requirements

  • Evaluation of Program Components:

    • Schoolwide reform strategies

    • Instruction by highly qualified teachers

    • Parental Involvement

    • Additional support

    • Transition plans


Purpose of Evaluation

  • Accountability

  • Objectivity

  • Comparability

  • Decision-Making tool

  • Allocation of resources

  • Determine areas of need

  • Measure progress



Understanding Assessment

  • Operationally defining constructs of interest

    • Academic Achievement

    • Intelligence

    • Attitudes toward school

    • Parental Involvement

    • Attendance

    • Disabilities/Learning Disorders

    • Language ability

    • Access to services


Understanding Assessment

  • Commonly used approaches to measurement:

    • Standardized tests

    • IQ tests

    • Surveys

    • Checklists

    • Rating scales

    • Structured interviews

    • Count variables


Understanding Assessment

  • Test Theory

    • Complete/Perfect measurement is not possible

    • Items are drawn from an infinite pool that represents complete information for any construct

    • A collection of items provides an observed score

    • Observed score = TRUE score + error

      • Goal of assessment: systematically minimize error so that observed scores consistently come as close to the “True” score as possible (see the sketch below)
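To make the error idea concrete, here is a minimal simulation sketch (assuming Python with NumPy; the true score, error spread, and item counts are invented for illustration) showing how drawing more items shrinks the gap between the observed and true score:

    import numpy as np

    rng = np.random.default_rng(0)
    true_score = 75.0                      # the (unknowable) "true" score
    for n_items in (5, 20, 80):
        # each item reflects the true score plus random measurement error
        item_scores = true_score + rng.normal(0, 10, size=n_items)
        observed = item_scores.mean()      # observed score = true score + averaged error
        print(f"{n_items:3d} items: observed = {observed:.1f}, error = {observed - true_score:+.1f}")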


Understanding Assessment

  • Test Theory

    • Items should discriminate; the Item Characteristic Curve (ICC) plots ability vs. probability of a correct response


Understanding Assessment

  • Reliability

    • Degree of consistency of measurement

  • Validity

    • Degree of accuracy & representativeness of measurement

  • An instrument can have high reliability but low validity

  • A valid instrument must be reliable


Understanding Assessment

  • Sampling distributions

    • Each observed score is a sample statistic drawn from a distribution of multiple observations (actual or potential)


Understanding Assessment

  • Confidence intervals

    • Give a range of plausible values, which is more accurate for interpreting and communicating results than a single point estimate (see the sketch below)
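A minimal sketch (assuming Python with SciPy; the ten scale scores are hypothetical) of reporting a mean with a 95% confidence interval instead of a bare point estimate:

    import numpy as np
    from scipy import stats

    scores = np.array([812, 795, 830, 808, 790, 845, 802, 818, 799, 825])  # hypothetical scale scores

    mean = scores.mean()
    sem = stats.sem(scores)                # standard error of the mean
    low, high = stats.t.interval(0.95, df=len(scores) - 1, loc=mean, scale=sem)
    print(f"Mean = {mean:.1f}, 95% CI = ({low:.1f}, {high:.1f})")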


Understanding Assessment

  • Interpreting results

    • Multiple measures always best

    • Keep the goal of testing in mind

    • External contributing factors (testing conditions)

    • Population considerations (culture, language, etc.)

    • Consider subscales when reported

      • Provide data on specific domains of interest

      • Useful for focused evaluation


Understanding Assessment

  • Descriptive Statistics

    • Produce quantitative summaries of data

    • Describe a population or sample

    • Central tendency

    • Variability

    • Linearity/Non-linearity

  • Inferential Statistics

    • Hypothesis testing

    • Allow for prediction and for examining real-world relationships (cause & effect)

    • T-test

    • ANOVA

    • Linear modeling
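A minimal sketch (assuming Python with pandas and NumPy; the attendance and score values are invented) contrasting a descriptive summary with a simple linear model of the same data:

    import numpy as np
    import pandas as pd

    df = pd.DataFrame({
        "attendance_rate": [0.99, 0.95, 0.91, 0.88, 0.97, 0.93, 0.85, 0.90],
        "math_score":      [842, 820, 801, 779, 833, 812, 765, 798],
    })

    # Descriptive: central tendency and variability
    print(df.describe())

    # Inferential flavor: fit a line predicting score from attendance
    slope, intercept = np.polyfit(df["attendance_rate"], df["math_score"], 1)
    print(f"predicted score = {intercept:.0f} + {slope:.0f} * attendance_rate")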



Detecting Differences Between Groups

  • Why is it important to understand group differences?

    • Achievement gaps

    • Differences in availability and/or usability of resources

    • Title programs are aimed at reducing these gaps for academically at-risk students

  • Why is it important to use data to examine differences?

    • Objectivity

    • Consistency

    • Accuracy

    • Monitoring growth, improvement, changes over time


Detecting Differences Between Groups

  • Group differences in evaluation terms

    • Basic differences in groups or populations

      • T-test for statistical significance (compares group means relative to their variability)

    • Differences in patterns and relationships

      • ANOVA and linear models (compare direction, association, and slope or rate of change across groups); see the sketch below
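A minimal sketch (assuming Python with SciPy; the three score samples are invented) of both comparisons, a t-test for two groups and a one-way ANOVA for three or more:

    from scipy import stats

    school_a = [812, 798, 825, 804, 817]   # hypothetical score samples
    school_b = [790, 785, 801, 779, 795]
    school_c = [830, 842, 828, 835, 826]

    # Two groups: independent-samples t-test
    t, p = stats.ttest_ind(school_a, school_b)
    print(f"A vs B: t = {t:.2f}, p = {p:.3f}")

    # Three or more groups: one-way ANOVA
    f, p = stats.f_oneway(school_a, school_b, school_c)
    print(f"A vs B vs C: F = {f:.2f}, p = {p:.3f}")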


Detecting Differences Between Groups


Detecting Differences Between Groups

  • Using various data sources to determine meaningful differences

    • Multiple assessments, informants, data points over time, etc.

    • Provides a clear, overarching view

    • Compare measures to:

      • Validate your group comparisons

      • Offset or shed light on the weaknesses of any single source


Detecting Differences Between Groups

  • Disaggregate

    • Slicing data to see what the picture looks like for different subgroups hidden within an average or basic percentage

    • Break down statistics into smaller components based on groups you are interested in comparing

      • Schools within a district

      • Teachers within a school

      • Race/ethnicity

      • ELL status

      • Disability status

      • Free/Reduced lunch status


Detecting Differences Between Groups

  • Disaggregate

    • Start with simple statistics: averages, percentages

      Average Score: 143

        • African American: 121
        • Native American: 132
        • Asian: 154
        • Latino: 127
        • White: 160
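A minimal sketch (assuming Python with pandas and a hypothetical student-level file named student_scores.csv with race_ethnicity and score columns) of producing exactly this kind of disaggregated view:

    import pandas as pd

    df = pd.read_csv("student_scores.csv")   # hypothetical file: one row per student

    print("Overall average:", round(df["score"].mean(), 1))

    # Disaggregate: the same average, sliced by subgroup (counts shown for context)
    print(df.groupby("race_ethnicity")["score"].agg(["mean", "count"]).round(1))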


Detecting Differences Between Groups

  • Cross-Tabulate

    • Dicing data to further examine group differences; allows for multiple group categories to be considered at once

    • Useful for demonstrating how an education system advantages/disadvantages different groups of students, and how the situation might be improved


Detecting Differences Between Groups

Percent Passing 8th Grade Math Test = 50%
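A minimal sketch (assuming Python with pandas and a hypothetical file grade8_math.csv with ell_status, frl_status, and a 1/0 passed column) of cross-tabulating the pass rate across two subgroup dimensions at once:

    import pandas as pd

    df = pd.read_csv("grade8_math.csv")     # hypothetical student-level file

    # Pass rate for every combination of ELL status and Free/Reduced lunch status
    table = pd.crosstab(df["ell_status"], df["frl_status"],
                        values=df["passed"], aggfunc="mean")
    print((table * 100).round(1))           # percent passing in each cell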


Monitoring the Effectiveness of Interventions

Measuring Outcomes


Measuring Outcomes

  • Why is it important to measure outcomes?

    • Program evaluation

    • Improve student programs

    • Fiscal responsibility

  • Why is it important to use data to monitor outcomes?

    • Objectivity

    • Consistency

    • Accuracy

    • Monitoring growth, improvement, changes over time


Measuring Outcomes

  • Measuring outcomes in evaluation terms

    • Basic pre-post comparison

      • T-test

      • Antecedent monitoring and process evaluation

    • Comparison of relationships between outcomes

      • Linear models

      • Do outcomes vary for different groups or students?

    • Reporting outcomes

      • Raw score change

      • Percentage change
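A minimal sketch (assuming Python with SciPy; the pre/post scores are invented and belong to the same students) of a paired pre-post comparison reported as raw change, percentage change, and a paired t-test:

    import numpy as np
    from scipy import stats

    pre  = np.array([780, 795, 802, 768, 810, 790])   # hypothetical fall scores
    post = np.array([798, 804, 815, 790, 822, 801])   # same students, spring scores

    raw_change = post.mean() - pre.mean()
    pct_change = 100 * raw_change / pre.mean()
    t, p = stats.ttest_rel(pre, post)                  # paired t-test

    print(f"Raw change = {raw_change:+.1f} points ({pct_change:+.1f}%), t = {t:.2f}, p = {p:.3f}")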


Measuring Outcomes


Measuring Outcomes

  • Longitudinal Data

    • Looking at data over a period of time (weeks, months, years)

    • Observing multiple time points can reveal important patterns that cannot be detected with only one or two measurements
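A minimal sketch (assuming Python with pandas and a hypothetical file state_test_history.csv with year, group, and mean_score columns) of arranging several years of results so trends are easy to scan:

    import pandas as pd

    df = pd.read_csv("state_test_history.csv")   # hypothetical columns: year, group, mean_score

    # One row per group, one column per year
    trend = df.pivot_table(index="group", columns="year", values="mean_score")
    print(trend.round(1))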


Measuring Outcomes

Student Achievement on State Tests: A Longitudinal Analysis


Identifying Important Connections

Relationships Between Variables


Relationships Between Variables

  • Why is it important to understand relationships between variables?

    • Identify patterns

    • Knowledge about what to expect for students

    • Identify areas to target for improvement

  • How does data help us examine relationships between variables?

    • Measurements represent constructs of interest

    • Objective evidence to support claims and provide ideas

    • CANNOT identify causes of outcomes


Relationships Between Variables

  • Relationships in Evaluation Terms

    • Scatterplots

    • Track scores or outcomes across levels of another variable to uncover connections

    • Allow us to examine real-world connections, understand relationships, and form hypotheses

    • Allow us to visualize impact of cutoff scores and group criteria
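A minimal sketch (assuming Python with Matplotlib and SciPy; the paired reading and math scores are invented) of a scatterplot together with the correlation behind it:

    import matplotlib.pyplot as plt
    from scipy import stats

    reading = [790, 812, 775, 830, 801, 845, 768, 820]   # hypothetical paired scores
    math    = [802, 818, 780, 841, 795, 850, 772, 829]

    r, _ = stats.pearsonr(reading, math)
    plt.scatter(reading, math)
    plt.xlabel("Reading score")
    plt.ylabel("Math score")
    plt.title(f"Reading vs. Math (r = {r:.2f})")
    plt.show()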


Relationships Between Variables

Table/List Data

Scatterplot



Relationships Between Variables

  • Important considerations

    • Type of variables (continuous variables, not groups or categories)

    • Scales of variables

    • Cutoffs

    • Context of analysis, potential outcomes or decisions

    • Limitations


Using Data to Inform Title I Programs

Practical Applications


Practical Applications

  • Example 1: Combining Data Sources to Identify Group Differences for Hispanic Students

  • Strategy:

    • Is there an achievement gap for Hispanic students in my system? Is it wider or narrower than the gaps at the state or national level?

      • Analyze data from various sources to provide support/rationale for your Title I plan

        • Multiple levels (local, state, national)

        • Multiple measures (different assessments)

        • Over time where available


Practical Applications

  • Examine Longitudinal National Data

    • Large sample, over time

    • Trend in NAEP mathematics average scores for 9-year-old students, by race/ethnicity*

*Data obtained from NAEP Long Term Trend Data


Practical Applications

  • Trend in White – Hispanic NAEP mathematics average scores and score gaps for 9-year-old students*

*Data obtained from NAEP Long Term Trend Data


Practical Applications

  • State 4th Grade CRCT Scores – Percentage of Students at Each Performance Level: Comparison by Race/Ethnicity*

    • DNM (Does Not Meet)

    • Meets

    • Exceeds

*Data obtained from GaDOE 2009-10 State Report Card


Practical Applications

  • Average Math ACT Score for H.S. Seniors by Subgroups at the State and National Levels

    • State

    • Nation

*Data obtained from GaDOE 2009-10 State Report Card


Practical Applications

  • LEA 4th Grade CRCT Scores – Percentage of Students at Each Performance Level: Comparison by Race/Ethnicity*

    • DNM (Does Not Meet)

    • Meets

    • Exceeds

*Data obtained from GaDOE 2009-10 LEA Report Card


Practical Applications

School Level 2009-2010 AYP Academic Performance Data*

*Data obtained from GaDOE 2009-10 LEA Report Card


Practical Applications

  • Examine population level statistics when useful

  • Examine differences across multiple outcome measures

  • Examine differences across age groups

  • Consider group sizes when interpreting and comparing data


Practical Applications

  • Example 2: Monitoring Outcomes of Supplemental Educational Services (SES) for School Districts

  • Strategy: Do students participating in SES show greater improvement than non-SES students? Does SES contribute to improved academic achievement?

    • Compare outcomes of SES students to Non-SES students
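A minimal sketch (assuming Python with pandas and a hypothetical file ses_outcomes.csv with an ses_participant flag, a 1/0 met_standard column, and a scale_score column) of comparing the two groups on the same outcome measures:

    import pandas as pd

    df = pd.read_csv("ses_outcomes.csv")   # hypothetical student-level file

    summary = df.groupby("ses_participant").agg(
        n=("scale_score", "size"),
        mean_scale_score=("scale_score", "mean"),
        pct_meeting=("met_standard", "mean"),
    )
    summary["pct_meeting"] *= 100          # convert proportion to percent
    print(summary.round(1))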


Practical Applications

Average Scaled Scores on Achievement Outcomes Statewide

*Achievement Outcome Measures include CRCT, EOCT, and/or GHSGT


Practical Applications

Percentages of Students Meeting or Exceeding Standards on Achievement Outcomes Statewide

*Achievement Outcome Measures include CRCT, EOCT, and/or GHSGT


Practical Applications

Percentages of Students Moving to Higher Performance Category on Achievement Outcomes Statewide

*Achievement Outcome Measures include CRCT, EOCT, and/or GHSGT


Practical Applications


Practical Applications

  • Consider multiple dimensions and various relevant criteria when comparing data

  • Examine useful data embedded within larger data points

  • Consider group size

  • Consider growth over time where appropriate


Practical Applications

  • Example 3: Examining Achievement Data to Make Decisions About Title I Programs

  • Strategy: Is there a relationship between student-level variables and achievement outcomes on state tests?

    • CRCT Reading Scores vs. CRCT Math Scores
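A minimal sketch (assuming Python with pandas and Matplotlib, a hypothetical file crct_scores.csv with reading and math columns, and an illustrative cutoff of 800) of plotting the two scores with cutoff lines so group criteria are visible:

    import pandas as pd
    import matplotlib.pyplot as plt

    df = pd.read_csv("crct_scores.csv")    # hypothetical columns: reading, math
    CUTOFF = 800                           # illustrative cutoff, not an official value

    plt.scatter(df["reading"], df["math"], alpha=0.5)
    plt.axvline(CUTOFF, linestyle="--")    # students left of the line fall below the reading cutoff
    plt.axhline(CUTOFF, linestyle="--")    # students below the line fall below the math cutoff
    plt.xlabel("CRCT Reading score")
    plt.ylabel("CRCT Math score")
    plt.show()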


Practical Applications


Practical Applications


Practical Applications


Practical Applications

  • Important to consider cutoffs for evaluation

  • Grouping affects data patterns

  • Scatterplots are a useful diagnostic tool for visualizing outcomes and relationships between student variables


Practical Applications

  • Data Resources

    • Annual AYP Reports

    • State Test Reports

    • Online Assessment System (OAS)

    • GaDOE System Report Cards

    • NAEP National Report Card Data

    • System-level Data

    • School-level Data


Questions?


Contact Information

Jessica Johnson

Operations Analyst

Office of School Improvement

Georgia Department of Education

[email protected]

(404)657-9864


Thank you for attending this session!

Your feedback is valuable.

Please return your evaluation form.

