
Utilizing Data Informed Decision-Making to Improve the PSSA Performance of CTE Students


Presentation Transcript


  1. Utilizing Data Informed Decision-Making to Improve the PSSA Performance of CTE Students PACTA PIL October 18, 2010

  2. Agenda • 11:15 – 12:45 Overview of the School Improvement Process; Data, Data, Everywhere • 12:45 – 1:30 LUNCH • 1:30 – 3:00 Data Analysis • 3:00 – 3:15 Break • 3:15 – 5:00 Root Cause Analysis Procedures • 5:00 – 5:45 Sharing & Reporting Out • 5:45 – 6:00 Wrap-up and Evaluation

  3. What impacts (has an effect on) heart health? • Family History • Diet • Exercise • Smoking

  4. What are indicators of heart health? • Family History • Diet - Weight • Exercise – Resting heart rate • Cholesterol • Triglycerides • Blood pressure

  5. What are the impacts and indicators of heart disease? If we are to improve the health of our hearts, we need to be aware of both what impacts heart health and what indicates heart health. Another example: STEELERS FOOTBALL!

  6. What impacts Student Achievement?

  7. What are indicators of Student Achievement?

  8. Overview of the School Improvement Process

  9. School Improvement Cycle

  10. Data Informed Decision Making Cycle for __________ IMPROVEMENT • Data analysis • Strategic Planning • Did it work? • Resources • Data at every step of the cycle • Remember: Numbers are our friends

  11. Using Data to Improve Learning for All: A Collaborative Inquiry Approach by Nancy Love et al. Added by Shula

  12. PDE’s Getting Results

  13. Focused and Unfocused Improvement Cycles a la Bernhardt

  14. Impacts and Indicators of Student Achievement

  15. Data, Data Everywhere

  16. Types of Data a la Bernhardt – Indicators & Impacts • Demographics • School Processes • Perceptions • Student Learning

  17. What are your Impacts and Indicators? • Identify your Impact and Indicators by their type • Blue Dots – Student Learning • Yellow Dots – Demographics • Red Dots – School Processes and Programs • Green Dots – Perceptions

  18. Multiple Measures of Student Learning – Indicators • Summative Assessments – PSSA, NOCTI, NAEP • Formative Assessments – Informal teacher observations • Interim Assessments – 4Sight, Grades • Diagnostic Assessments – CDT

  19. Multiple Measures of Student Learning • 4Sight • PSSA • NOCTI
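One way to line these measures up is a simple student-level join. The sketch below is illustrative only: the file names and column layouts are assumptions, not actual PSSA, NOCTI, or 4Sight export formats.

```python
import pandas as pd

# Hypothetical exports, one row per student in each file (names and columns
# are placeholders, not real eMetric/NOCTI/4Sight formats).
pssa = pd.read_csv("pssa_scores.csv")            # student_id, pssa_math, pssa_reading
nocti = pd.read_csv("nocti_scores.csv")          # student_id, nocti_written, nocti_performance
foursight = pd.read_csv("foursight_scores.csv")  # student_id, benchmark_1, benchmark_2

# Join on student ID so each row shows all measures for one student side by side.
measures = (pssa.merge(nocti, on="student_id", how="outer")
                .merge(foursight, on="student_id", how="outer"))
print(measures.head())
```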

  20. Multiple Measures of Student Learning – over time • Longitudinal Data • Analysis of annual performance • Analysis across the years • Analysis of cohort groups across the years (8th grade vs. 11th grade) • PVAAS
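For the cohort comparison (the same students in 8th and then 11th grade), a merge plus a cross-tabulation shows how students moved between performance levels. The file and column names below are hypothetical.

```python
import pandas as pd

# Hypothetical files: the same cohort's grade 8 and grade 11 PSSA results.
grade8 = pd.read_csv("cohort_grade8.csv")    # student_id, math_level
grade11 = pd.read_csv("cohort_grade11.csv")  # student_id, math_level

cohort = grade8.merge(grade11, on="student_id", suffixes=("_gr8", "_gr11"))

# Where did the students who were Basic in grade 8 land by grade 11?
print(pd.crosstab(cohort["math_level_gr8"], cohort["math_level_gr11"],
                  normalize="index").round(2))
```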

  21. What are YOUR measures of Student Learning Data? (summative, formative, interim, diagnostic) • PSSA • NOCTI • Grades

  22. Multiple Measures of Demographics – Impacts • Typical Data: Ethnicity, IEP, Economically Disadvantaged, Gender, Mobility, Enrollment, Attendance, Teacher Demographics?

  23. Demographics to Disaggregate • Disaggregation is not a problem-solving strategy… it's a problem-finding strategy.

  24. Student Learning AND Demographics • Are all students performing at the same level? IEP students? LEP students? Economically disadvantaged students? • Is the achievement gap (between high and low poverty students) decreasing or increasing? • Do students who attend school every day get better grades? • Are achievement levels higher for those students who stay in a school building for two or more years?
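Questions like these can be answered with a quick subgroup tally once student learning and demographic data are joined. A minimal sketch, assuming a hypothetical student-level file with demographic flags and an attendance rate:

```python
import pandas as pd

# Hypothetical student-level file joining PSSA results with demographic flags.
df = pd.read_csv("students.csv")  # reading_level, iep, lep, econ_disadv, attendance_rate

df["proficient"] = df["reading_level"].isin(["Proficient", "Advanced"])

# Percent Proficient/Advanced for each demographic flag (True vs. False).
for flag in ["iep", "lep", "econ_disadv"]:
    print(df.groupby(flag)["proficient"].mean().mul(100).round(1), "\n")

# Do students who attend school nearly every day reach proficiency more often?
print(df.groupby(df["attendance_rate"] >= 0.95)["proficient"].mean().mul(100).round(1))
```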

  25. What are your demographic measures? • Teachers • Students • Community

  26. Multiple Measures of School Processes (Programs) – Impacts and Indicators • Typical Data: Description of school programs and processes • How are students identified for programs and services?

  27. Student Learning AND Demographics AND School Processes Are there differences in achievement scores (or in rates of progress) for 11th grade females and males by the type of career program in which they are enrolled?

  28. What are your programs or procedures/processes? • Tutoring • Title I • Grading policy • Enrollment into a CTC • Part time CTC transportation issue

  29. Multiple Measures of Perceptions – Impacts • Typical Data: Perceptions of Learning Environment, School Climate, Values and Beliefs, Observations

  30. Student Learning AND Demographics AND Perceptions Do students of different ethnicities perceive the learning environment differently, and do they score differently on standardized achievement tests consistent with these perceptions?
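Answering a question like this means joining perception data to achievement data. The sketch below assumes a hypothetical climate survey file and a 1–5 climate score; the actual surveys and scales will differ.

```python
import pandas as pd

# Hypothetical files: a climate survey and PSSA results, both keyed by student ID.
survey = pd.read_csv("climate_survey.csv")  # student_id, ethnicity, climate_score (1-5)
scores = pd.read_csv("pssa_reading.csv")    # student_id, scaled_score

merged = survey.merge(scores, on="student_id")

# Average perception of the learning environment and average scaled score, by ethnicity.
print(merged.groupby("ethnicity")[["climate_score", "scaled_score"]].mean().round(1))
```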

  31. What are your measures of perception? • Teachers' • Parents' • Sending Districts' • Students'

  32. Identify your Impact and Indicators by their type • Blue Dots – Student Learning • Yellow Dots – Demographics • Red Dots – School Processes and Programs • Green Dots – Perceptions • What’s missing? • What additional data should be collected? Examined? Considered?

  33. Data Analysis Now that the data is gathered (or on the to-be-gathered list), it's time to analyze it. Remember… “Numbers are our friends.”

  34. Data Analysis • “Gathering” your PSSA data using the Feeder Report from eMetric • What percent of 11th graders (in 2010) scored Below Basic, Basic, Proficient, and Advanced in Reading? Math? (These are this year’s 12th graders) • What about the class of 2010 (PSSA grade 11 in 2009)? The class of 2009 (PSSA grade 11 in 2008)? • Examine the three-year trend of 11th grade performance in reading and math. • Observations – just the facts! • Are more students reaching proficiency? • Are fewer students below basic? • Repeat the above looking at: • Current 9th graders (8th graders in 2010) • Current 10th graders’ 8th grade PSSA scores (from 2009) • Current 11th graders’ 8th grade PSSA scores (from 2008)
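If the feeder-report data are saved as a student-level file, the three-year trend can be tallied in a few lines. This is a sketch under assumptions: the file name and columns are placeholders, not the actual eMetric export.

```python
import pandas as pd

# Hypothetical CSV built from the eMetric feeder report:
# one row per student, with test year, subject, and performance level.
df = pd.read_csv("pssa_grade11_three_years.csv")  # year, subject, performance_level

levels = ["Below Basic", "Basic", "Proficient", "Advanced"]

# Percent of students at each performance level, by subject and year.
trend = (pd.crosstab([df["subject"], df["year"]], df["performance_level"],
                     normalize="index")
           .reindex(columns=levels)
           .mul(100).round(1))
print(trend)  # read across the years: more Proficient/Advanced? fewer Below Basic?
```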

  35. Data Analysis • Disaggregation is a problem-finding strategy! • 11th Grade • By Program • By Gender • By Sending District • By Reporting Categories • 8th Grade • By Program • By Gender • By Sending District • By Reporting Categories
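The same student-level file, with program, gender, and sending-district columns added, supports each of these disaggregations. The column names here are assumptions.

```python
import pandas as pd

# Hypothetical student-level file with program and demographic columns.
df = pd.read_csv("pssa_grade11.csv")  # performance_level, program, gender, sending_district

df["proficient"] = df["performance_level"].isin(["Proficient", "Advanced"])

# Percent Proficient/Advanced by each disaggregation: problem finding, not problem solving.
for dimension in ["program", "gender", "sending_district"]:
    print(df.groupby(dimension)["proficient"].mean().mul(100).round(1), "\n")
```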

  36. Why all this data? “Root Cause Analysis”

  37. Rule #1 – no blaming others • The Blame Poem

  38. Observation and Reflection What are you seeing? JUST THE FACTS! What are you thinking about the results? What’s ‘causing’ these results? • More females are proficient than males. • Over the past three years, the percent of students reaching proficiency has increased. • The percent of students below basic has remained constant over three years. • Students don’t arrive at ‘my grade level’ as prepared as they should be. • Support programs are lacking.

  39. Root Cause Analysis (Paul Preuss) • Definition – the deepest underlying cause, or causes, of positive or negative symptoms within any process that if dissolved would result in elimination, or substantial reduction, of the symptom. • Root cause analysis eliminates patching and wasted effort. • Root cause analysis conserves scarce resources. • Root cause analysis induces discussion and reflection.

  40. How do you know you’ve ‘found’ the root cause? • You run into a dead end asking what caused the proposed root cause. • Everyone agrees that this is a root cause. • The cause is logical, makes sense, and provides clarity to the problem. • The cause is something that you can influence and control. • If the cause is dissolved, there is realistic hope that the problem can be reduced or prevented in the future.

  41. “School improvement teams and others using root cause analysis often wonder when to stop seeking cause and make the decision that sufficient data and effort have been used to arrive at a reasonable root. This is often a judgment call that will improve with experience. Often, the lack of data and the pressures of time frustrate the effort and force it to halt at a level below the surface symptom, but perhaps not as deep as it must ultimately go.” (Preuss 2003)

  42. Root Cause Analysis – prerequisites • Key Indicators of Student Success • Measures of each indicator • Desired Ideal Condition of the indicator (e.g., 56% proficient or better) • Gap between the desired ideal condition and the present condition • Is this gap a priority issue? • Goal statement • Search for Root Cause • Possible strategies for improvement
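The gap step is simple arithmetic; a tiny sketch using the 56% figure from the slide and a placeholder present condition:

```python
# Gap between the desired ideal condition and the present condition.
desired_pct_proficient = 56.0   # desired ideal condition (from the slide's example)
current_pct_proficient = 41.5   # present condition (placeholder value)

gap = desired_pct_proficient - current_pct_proficient
print(f"Gap to close: {gap:.1f} percentage points")
```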

  43. Root Cause Processes • Questioning the Data • The Diagnostic Tree • The Five Whys • Force Field Analysis • Throughout each process, reflect back on your list of impacts and indicators

  44. Questioning Data • “What do you see?” • “What questions do you have about what you see?” Questioning the Data a la Dr. Shula: • What do you see? – JUST THE FACTS • What are you thinking/feeling/believing about what you see? • What other data or data analysis might shed more light on the issue?

  45. The Diagnostic Tree • The “Red Flag” event or priority issue • Location Level • Hypotheses Level • PSSA Math scores are below AYP target • Location – incoming 9th graders from X Middle Schools • Hypotheses – Is this related to Student Demographics? Curriculum? Instruction? System Processes? Organizational Culture? External Factors?

  46. The Five Whys • Why? • Why? • Why? • Why? • Why? • Team: Why do we have so many class tardies? • Students: Because we do not have enough time. • Team: Why don’t you have enough time to get from one class to another? • Students: Because 4 minutes isn’t enough time to get from one end of the building to the other and go to a locker or rest room. • Team: Why only 4 minutes? • Principal: Because we wanted to reduce the time that students were in the halls. • Team: Why did we want to reduce the hall time? • Principal: Because we wanted to reduce disciplinary problems. • Team: Why did we want to reduce disciplinary problems? • Principal: We wanted to improve school safety and climate.

  47. Force Field Analysis • Driving Forces and Restraining Forces • Driving Forces apply pressure to move in a direction of change • Restraining Forces apply pressure to remain in place • Either the driving forces have to be increased or the restraining forces have to be decreased.

  48. Pulling it together with your Final Report

  49. The Final Report – PACTA PIL Program • Brief introduction • Three year analysis of reading and math scores • Strengths • Deficiencies • Root Causes for each CTE Program • Action Plans to address the Root Cause • Timeline for Implementing & Monitoring

  50. Sharing & Reporting Out New insights? Additional data/information to be gathered and examined? New theories? Next steps
