


Cecil J. Picard Center for Child Development
University of Louisiana at Lafayette
Sessions 22A & 22B

Holly Howat, Oliver Winston, Greg Crandall



PBS in Louisiana: 2006-2007 Evaluation Findings

Understanding the power of data-based decisions



Cecil J. Picard Center for Child Development

The Cecil J. Picard Center for Child Development was established in 2005 at the University of Louisiana at Lafayette.

Our mission is to improve Louisiana by focusing on its children. The Center is dedicated to providing high-quality, rigorous evaluation of programs that address learning from birth to adulthood.

The Center is proud to partner with many state agencies, including the Department of Education (DOE). Our work with the DOE includes evaluating the implementation of Positive Behavior Support (PBS).



Evaluation Focus

  • School-wide Evaluation Tool

  • Correlation Analysis

  • Behavioral Characteristics

  • Academic Characteristics

  • Risk and Protective Factors Characteristics

  • Qualitative Results for District-Wide Implementation


Positive Behavioral Support: Schools Trained, 2006-2007 School Year


School-wide Evaluation Tool

[Chart: SET total and subcategory mean scores for all sampled PBS schools. Subcategories: Expectations Defined, Expectations Taught, Reward System, Violation System, Monitoring, Management, District Support. Y-axis: percentage (0-100).]

[Chart: Comparison of 2006-07 SET total scores across cohorts by years of PBS experience (1-4 years). Cohort 1: N=6 schools; Cohort 2: N=15 schools; Cohort 3: N=11 schools; Cohort 4: N=7 schools. Y-axis: percentage (0-100).]

Most sampled schools showed strengths in the monitoring and district support subcategories and had difficulty with expectations taught.

The more experience a sampled school had with universal-level PBS, the better it implemented it.


Correlation Analysis

[Chart: Scatter plot of SET scores against Benchmarks of Quality scores (both 0-100), with a linear trend line fitted to the SET-Benchmarks points.]

The scatter plot indicates a statistically significant correlation between School-wide Evaluation Tool scores and Benchmarks of Quality scores.
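As a rough illustration of the kind of test behind this finding, the sketch below computes a Pearson correlation between paired SET and Benchmarks of Quality scores. The score pairs are hypothetical placeholders, and the specific statistic used in the Center's analysis is not stated in this presentation.

    # Minimal sketch: checking whether SET scores track Benchmarks of Quality (BoQ)
    # scores. The score pairs are hypothetical, not the Center's data.
    from scipy.stats import pearsonr

    set_scores = [62, 70, 75, 81, 85, 90, 93]   # hypothetical SET totals (0-100)
    boq_scores = [55, 66, 72, 78, 84, 88, 95]   # hypothetical BoQ totals (0-100)

    r, p_value = pearsonr(set_scores, boq_scores)
    print(f"Pearson r = {r:.2f}, p = {p_value:.4f}")
    if p_value < 0.05:
        print("Correlation is statistically significant at the 0.05 level.")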


Behavioral Characteristics: Suspension Rates

[Chart: Change in in-school suspension (ISS) rates from 2003-04 to 2005-06 by cohort (Cohorts 1-4); plotted values of 1.04, 1.05, 6.73, and 6.86. Y-axis: percentage.]

[Chart: Change in out-of-school suspension (OSS) rates from 2003-04 to 2005-06 by cohort (Cohorts 1-4); plotted values of -2.45, 0.02, 0.45, and 2.67. Y-axis: percentage.]

Sampled schools with more than two years of PBS implementation had much smaller increases in in-school suspension rates.

A similar pattern existed for out-of-school suspension rates.


Academic Characteristics: Test Scores and Retention Rates

[Chart: Retention rates over time (2003-04, 2004-05, 2005-06) by cohort and state average. Cohort 1: N=9 schools; Cohort 2: N=24 schools; Cohort 3: N=17 schools; Cohort 4: N=8 schools; state average: N=1,475 schools. Y-axis: percentage (0-20).]

A general pattern of decline in retention rates can be observed in this sample.

From the data collected for 2006-2007, there was no discernible correlation between PBS implementation and academic outcomes on test scores.


Risk and Protective Factors

[Chart: PBS sample school results on the CCYS protective factor "Rewards for Pro-social Behaviors," 2004 vs. 2006 (N=34 schools each year), for Grades 6, 8, and 10. Y-axis: percentage (0-100).]

[Chart: PBS sample school results on the CCYS risk factor "Low Commitment to School," 2004 vs. 2006, for Grades 6, 8, and 10. Y-axis: percentage (0-100).]

Protective factors increased in Grades 6 and 8, particularly rewards for pro-social behavior.

Risk factors decreased in Grades 6 and 8, particularly low commitment to school.



Qualitative Results for District-Wide Implementation





Data Driven Decision Making

At the Picard Center for Child Development, we collect and analyze data to inform policy makers so they can make informed decisions.

Schools and districts can also collect and analyze data so they can make informed decisions.



Data Driven Decision Making

PURPOSE:

To review critical features and essential practices of data collection and data analysis for interventions


School-wide Positive Behavior Support Systems

  • Classroom Setting Systems

  • Non-classroom Setting Systems

  • Individual Student Systems

  • School-wide Systems



Data Collection Examples

An elementary school principal found that over 45% of the school's behavioral incident reports were coming from the playground.

A high school assistant principal reported that over two-thirds of behavior incident reports came from the cafeteria.
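The arithmetic behind findings like these is simple tallying. The sketch below uses a made-up referral log to show one way a school team might compute the share of incident reports coming from each location; none of the data is from this evaluation.

    # Minimal sketch of the arithmetic behind "over 45% of incident reports
    # came from the playground". The referral log below is hypothetical.
    from collections import Counter

    referral_locations = [
        "playground", "playground", "cafeteria", "classroom",
        "playground", "hallway", "playground", "bus", "playground",
    ]

    counts = Counter(referral_locations)
    total = len(referral_locations)
    for location, n in counts.most_common():
        print(f"{location}: {n} referrals ({100 * n / total:.0f}% of total)")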



Data Collection Examples

A middle school secretary reported that she was getting at least one neighborhood complaint daily about student behavior during arrival and dismissal times.

Over 50% of referrals occurred on buses during daily transitions.



Data Collection Examples

At least twice a month, police are called to settle arguments involving parents and their children in parking lots.

A high school nurse lamented that “too many students were asking to use her restroom” during class transitions.



Data Collection Questions

  • What system does the parish utilize for data collection?

  • How is the data system being used in each school setting?

  • How frequently are data collection system reports generated (bi-weekly, monthly, by grading period, and/or by semester)?



Minimal School-Level Data Collection Needs

  • Minor referrals

  • Major referrals

  • Referrals by staff members

  • Referrals by infractions

  • Referrals by location

  • Referrals by time

  • Referrals by student (a record-layout sketch follows this list)
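One possible way to structure a single referral record so that all of the counts above can be produced is sketched below. The field names are assumptions chosen for illustration, not a prescribed format from any particular data system.

    # One possible record layout (field names are assumptions) that supports
    # the minimal school-level counts listed above.
    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class Referral:
        student_id: str        # supports "referrals by student"
        staff_member: str      # supports "referrals by staff member"
        infraction: str        # supports "referrals by infraction"
        location: str          # supports "referrals by location"
        timestamp: datetime    # supports "referrals by time"
        major: bool            # True = major referral, False = minor referral

    example = Referral("S001", "Ms. Smith", "disrespect", "cafeteria",
                       datetime(2007, 3, 14, 12, 5), major=True)
    print(example)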



Minimal District-Level Data Collection Needs

  • Major referrals (office discipline referrals, ODRs)

  • Referrals by incident

  • Referrals by infraction

  • Times of incidents

  • Locations of incidents (which school and where in the school)



Data Analysis Questions

  • How are the data displayed (graphs, tables, etc.), and is the display effective?

  • What are the outcomes of data review?

  • Are data-based decisions reached?

  • How are data-based decisions monitored for effectiveness?



Minimal School-Level Data Analysis Needs

  • The PBS team should be part of the analysis process

  • Data should be reviewed to determine patterns of problem behaviors

  • Decisions should be based on the data presented

  • Decisions should include an intervention that can be successfully implemented and monitored (see the sketch after this list)
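As a rough illustration of reviewing referral data for behavior patterns, the sketch below cross-tabulates referrals by location and hour of day from a hypothetical data-system export; the file name and column names are assumptions, not a documented format.

    # Minimal sketch of reviewing referral data for patterns of problem behavior.
    # "referrals_export.csv" and its columns ("location", "timestamp") are
    # assumptions about the district's data system.
    import pandas as pd

    referrals = pd.read_csv("referrals_export.csv", parse_dates=["timestamp"])
    referrals["hour"] = referrals["timestamp"].dt.hour

    # Count referrals by location and hour of day to surface hotspots
    # (e.g., cafeteria at lunch, buses at dismissal).
    pattern = referrals.pivot_table(index="location", columns="hour",
                                    values="timestamp", aggfunc="count",
                                    fill_value=0)
    print(pattern)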



Using Data to Make Decisions

  • What interventions are needed to respond to problem behaviors?

  • How do we implement the intervention throughout the school?

  • What is the timetable for the intervention to show a decrease in undesirable behavior?



Contact Information

Dr. Holly Howat

337-482-1552

[email protected]

Mr. Oliver Winston

337-365-2343

[email protected]

http://ccd-web.louisiana.edu/

