May 8, 2013: Emily Rouge, LeeAnn Sell, & Stephanie Schmalensee








  1. May 8, 2013: Emily Rouge, LeeAnn Sell, & Stephanie Schmalensee: Planning with Data

  2. Workshop Objectives • Consider aspects of change and ways to stimulate a successful change initiative • Read and interpret program quality data • Outline the strengths and weaknesses of your program • Create an effective improvement plan for your organization, based on data

  3. Agenda 9:15-10:00 The change process and leading change 10:00-10:15 BREAK 10:15-11:30 Review aggregate and site-level data 11:30-1:00 LUNCH 1:00-2:00 Developing Program Improvement Plans 2:00-2:30 What’s Next in the QIS / Reporting 2:30 Closing/Evaluations

  4. No One Said Change Was Easy! “Stepping onto a brand-new path is difficult, but not more difficult than remaining in a situation which is not nurturing to the whole woman.” ― Maya Angelou “I alone cannot change the world, but I can cast a stone across the waters to create many ripples.” ― Mother Teresa

  5. Change Paired Activity

  6. Resistance to Change – Table group activity • Discuss your concerns about making changes to your program with others at your table • How do YOU feel about making changes in your program? • What kind of resistance do you anticipate? • What can you do to counter it?

  7. Youth Program Quality Intervention (YPQI)

  8. Where does change happen? What does this have to do with Program Quality? • Policy Context: Kentucky Department of Education • Organizational Setting: Program Directors • Instructional Setting: Site Managers/Staff, Youth

  9. Break

  10. Data Folders: 1. Center Profiles 2. PQA Self-Assessments 3. Site Visits (CEEP) – Cycle 9 Grantees 4. External Assessments (Some Cycle 9)

  11. During this section of the session, you will: • Review your program data – in comparison to statewide data • Discuss your site’s strengths and challenges • Observe the picture your data paints

  12. 1. Center Profiles • Each site that offered programming during the 2011-2012 school year has a Center Profile • Data elements included in analyses: • Student participation • Outcomes for regular attendees (grades, federal teacher surveys) • Outcomes for regular attendees who struggle academically (grades)

  13. 2011-2012 Center Profile Data: Student Participation

  14. 2011-2012 Center Profile Data: Academic Outcomes

  15. 2011-2012 Center Profile Data: Federal Teacher Survey Outcomes

  16. 2011-2012 Center Profile Data: Academic Outcomes for Struggling Students

  17. 2. Self-Assessment Report • All sites that entered YPQA results into Scores Reporter will have self-assessment reports. If you completed both a Youth PQA and a School-Age PQA, you will have both sets of results.

  18. Quality Construct: The Pyramid of Program Quality

  19. Self Assessment Data • Each site has self-assessment results from data entered into Scores Reporter Keep in mind… • Observation scores represent a snapshot – this has limitations and value. • These are aggregate scores from multiple observations. • The overall story is more important than the individual numbers. • What you do with the data matters most.
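The aggregation described above is a simple average of each scale across multiple observations. A minimal sketch of how that works (the scale names and scores here are hypothetical placeholders, not actual Kentucky data):

```python
# Hypothetical PQA-style self-assessment scores (1-5 scale) from three
# separate observations of the same site. Scale names and values are
# illustrative only, not actual Kentucky aggregate data.
observations = [
    {"Safe Environment": 5, "Supportive Environment": 4, "Interaction": 3},
    {"Safe Environment": 5, "Supportive Environment": 3, "Interaction": 2},
    {"Safe Environment": 4, "Supportive Environment": 4, "Interaction": 3},
]

def aggregate(observations):
    """Average each scale's score across all observations."""
    totals = {}
    for obs in observations:
        for scale, score in obs.items():
            totals.setdefault(scale, []).append(score)
    return {scale: sum(scores) / len(scores) for scale, scores in totals.items()}

# Each value is a site-level average; as the slide notes, the overall
# pattern across scales matters more than any single number.
print(aggregate(observations))
```

This is why a single observation's score is only a snapshot: the reported figure blends several observations, and one low session can be offset by others.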

  20. Self Assessment: Kentucky Aggregate Data

  21. Scores of All Fives • Not useful for identifying areas of improvement. • If you did receive all fives, ask yourself whether this holds true in every activity and with every staff member.

  22. Extreme Variation • Might signify a misunderstanding of what the tool is measuring. • The descriptors specify exactly what behaviors or evidence to look for.

  23. Self-Assessments: Becoming More Objective • Scores typically trend toward this distribution once raters are more comfortable with the process and the tool. • Program self-assessment scores may be lower in subsequent years as raters become more honest and more critical while developing their reliability.

  24. 3. Site Visit Reports (CEEP) • 28 visits to KY 21st CCLC programs between February 18 and April 18 • Site Visit Activities Included: site coordinator interview; school-day teacher interview; standardized observation protocol for academic and enrichment activities • Rating System: 12 items rated on a scale of 1 to 4 (1 = Must Address and Improve, 2 = Some Progress Made, 3 = Satisfactory, 4 = Excellent); 48 possible points
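The rating arithmetic above (12 items, each rated 1 to 4, for a maximum of 12 × 4 = 48 points) can be sketched as follows; the item names and ratings are hypothetical placeholders, not results from the actual CEEP visits:

```python
# CEEP-style site visit scoring sketch: 12 items, each rated 1-4,
# for a maximum of 12 * 4 = 48 points. Items and ratings below are
# hypothetical illustrations, not actual visit results.
RATING_LABELS = {
    1: "Must Address and Improve",
    2: "Some Progress Made",
    3: "Satisfactory",
    4: "Excellent",
}

# Placeholder ratings: start all 12 items at "Satisfactory" (3),
# then vary two of them for illustration.
ratings = {f"Item {i}": 3 for i in range(1, 13)}
ratings["Item 5"] = 1
ratings["Item 9"] = 4

total = sum(ratings.values())
max_points = 4 * len(ratings)  # 48 possible points
print(f"Total: {total} of {max_points} possible points")

# Flag any item rated 1, the "Must Address and Improve" level.
for item, rating in ratings.items():
    if rating == 1:
        print(f"{item}: {RATING_LABELS[rating]}")
```

Reading a site's report this way makes the two uses of the scale explicit: the total situates the program overall, while individual item ratings point to specific areas to address.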

  25. Site Visits (CEEP): Purpose of 2013 Site Visits • High School Programs: activities promote academic growth, remediation, and development; links to the regular school day; participants contribute ideas, make choices, and have positive experiences; establish partnerships and employ successful recruitment strategies • Elementary/Middle School Programs: activities geared toward rigorous academic enrichment; links to the regular school day; individual support and opportunities for positive interactions for youth; relationships with schools, parents, and other community constituents

  26. Elementary and Middle School Site Visit Results Areas of Strength • Shares School Resources (3.65) • Community Based Partners (3.52) • School Personnel Involved (3.48)

  27. Elementary and Middle School Site Visit Results Potential areas for Growth • Supplemental Academic Enrichment (3.05) • Positive Interactions with Peers (3.10) • Active Learning (3.10) • Homework Help (3.10)

  28. High School Site Visit Results Types of Activities Offered (N=7)

  29. High School Site Visit Results Areas of Strength • School Personnel Involved (3.71) • Program is Well Integrated with the School and Shares resources (3.71) • Engagement with Community Based Organizations and Parents (3.71)

  30. High School Site Visit Results Areas of Potential Growth • Goal Setting/Career Development (3.0) • Links to the School Day (3.0) • Intentional Recruitment and Retention Strategies (3.0)

  31. 4. External Assessment Report • Some Cycle Nine sites that were part of the YPQA process in 2012 will have External Assessment results, which appear alongside self-assessment results.

  32. External Assessment Data • External assessments were conducted for 14 sites during the spring 2013 site visits. Keep in mind… • External assessment scores are typically lower than self-assessment scores. • Observation scores represent a snapshot – this has limitations and value. • The overall story is more important than the individual numbers. • What you do with the data matters most.

  33. Identifying Successes and Challenges Using the worksheet provided in your folder, determine your program’s strengths and weaknesses based on the data provided. Questions to ask yourself: What was one of your site’s biggest successes/strengths? What were some surprises? (Positive or Negative) Were there obstacles that you encountered?

  34. Reviewing Program Successes and Challenges • Were you able to identify program successes? • Were you able to identify program challenges? • Were data consistent across data elements?

  35. Putting it all together: Create the story of your data… • What is the message or story of your data? What do the numbers tell you? • In what ways is this story accurate? • What’s missing from the data? What important things about program quality do not come through? • Where are the gaps between what you WANT to provide and what the data says you ARE providing?

  36. Anybody hungry?? Break for Lunch: 11:30 am – 1:00 pm

  37. Completing the Program Improvement Plan Two copies of the Program Improvement Plan template have been included in your folder. Additional copies are available if you need them.

  38. Completing the Improvement Plan • Step 1: Enter District Name and Program Site • Step 2: Using the data from your folder and worksheet activity, develop one goal • Justify this goal by listing the related data element(s) that identified this area as needing to be strengthened.

  39. Completing the Improvement Plan: GOALS • When developing goals, remember: • Goals should be broad statements…but not too broad! • Examples of good goals: • Purposefully connect the afterschool program to the school day. • Provide opportunities for youth to reflect on their experiences in the afterschool program. • Provide activities geared toward improving reading skills. • Examples of not-so-good goals: • Improve academic performance • Increase student engagement

  40. Completing the Improvement Plan: OBJECTIVES • Step 3: Develop objectives for the first goal

  41. Program Improvement Plan objectives should each be SMART: • Specific • Measurable • Attainable/Action oriented • Relevant • Timelined

  42. SMART Objectives: • S – Specific Who, what, when, how much? • M – Measurable Can you prove it happened? • A – Attainable/Action-Oriented Does it use action verbs and explain what people will actually do? • R – Relevant Is it relevant to the goal, and is it possible given the program activities and circumstances? • T – Timelined What is the timeframe, and does it fit within the expected parameters?

  43. SMART Objectives…are SPECIFIC Not very specific… • Struggling students will demonstrate improved math achievement Getting better… • Struggling students will demonstrate improved math achievement by increasing their math grade Even better… • 50% of struggling students will demonstrate improved math achievement by increasing their math grade Ideally, you have this level of specificity… • By May 2014, 50% of struggling students participating in 30 days or more will increase their math grade between the first and final grading period.

  44. SMART Objectives…are Measurable Which of the following objectives is most measurable? • Staff will use student achievement data to plan topics for tutoring sessions. • By May 2014, staff will hold at least three quarterly review sessions with teachers to discuss student achievement data and plan tutoring topics. • At least half of tutoring sessions provided will be based on individual student needs. • Staff members will increase their awareness of students’ individual academic needs.

  45. SMART Objectives…are Attainable Which of the following objectives is most likely to be attained? • All program activities will be planned and implemented with student input. • By May 2014, all program activities will involve opportunities for students to share their work with other participants. • All students will have opportunities for reflection during program activities on a daily basis. • By May 2014, students will have opportunities for reflection during at least two program activities per week.

  46. SMART Objectives…are Relevant Which of the following objectives are most relevant to the following goal? GOAL: Increase the proportion of program activities provided that are directly aligned with academic standards. OBJECTIVES: • By May 2014, all program activities will involve opportunities for students to share their work with other participants. • By November 2013, literacy-based activities will be provided to students on a minimum of three days per week. • By May 2014, students will participate in at least four activities per week (outside of homework help) that are intentionally linked to Kentucky state standards.

  47. SMART Objectives…are Timelined Which timeline seems most useful? • By spring 2012, all program activities will involve opportunities for students to share their work with other participants. • Next year, staff will hold quarterly review sessions with teachers to discuss student achievement data and plan tutoring topics. • Beginning in October 2011, staff will hold quarterly review sessions with teachers to discuss student achievement data and plan tutoring topics. • From December 2011 through April 2012, students will have weekly opportunities to reflect on program activities.

  48. Completing the Improvement Plan: OBJECTIVES • Step 4: Double-check each objective to ensure it meets the criteria for SMART objectives.

  49. Completing the Improvement Plan: MEASURING PROGRESS • Step 5: Indicate when progress will be measured and what will be done to measure progress

  50. Completing the Improvement Plan: Activities • Step 6: Copy each objective to the chart on page 2 • Step 7: List 3 activities that will be conducted in order to meet the objectives • Activities should be specific and include timelines
