Using Data To Tell Your Story


Presentation Transcript


  1. Using Data To Tell Your Story

  2. Vision, Mission, and Goals: Framing Evaluation and Assessment • What is your vision? (What is your over-arching, driving concern?) • What is your mission? (What role do you see yourself playing in the realization of your vision?) • What goals do you have? (What are some short-term way-markers you need to reach in order to fulfill your mission?)

  3. Importance of Evaluation • What gets measured gets done. • If you don’t measure results, you can’t tell success from failure. • If you don’t recognize failure, you can’t correct it. • If you can’t see success, you can’t reward it. • If you can’t see success, you can’t learn from it.

  4. Why collect data? • Condition of funding • Support claims of program effectiveness • Provide information that can improve program • Engage staff • Demonstrate elements of a QUALITY program

  5. Creating an Evaluation Plan—Questions to Answer • What are our program’s goals? • Are some activities more effective than others? • What changes in knowledge, attitudes, or behaviors will result from program activities? • Do our funders require any specific information? • Are some activities more popular than others? • As a result of the time and effort everyone is devoting to the program, what differences have we made?

  6. Where do we start? The Logic Model

  7. What Does a Logic Model Do? • Summarizes key elements of a program • Identifies the rationale behind the elements • Articulates the short- and long-term outcomes and how they can be measured • Shows the cause-and-effect relationships between a program and its outcomes Moving Towards Success: A Framework for After-School Programs (Collaborative Communications Group, 2005, p. 3).

  8. Setting Goals • S.M.A.R.T. Goals • Specific—Ask the 6 “Ws” • Measurable—Establish indicators for measuring progress toward your goals • Attainable—Set goals that you can build the capacity for and actually reach • Realistic—Be willing and able to work toward the goal, and truly believe it can be accomplished • Timely—Set a time frame

  9. Measuring Our Goals • Quantitative: surveys; collection of demographic information about participants in the program; information reports (grades, test scores, comparisons of crime statistics, detention reports) • Qualitative: anecdotal success stories; focus groups; interviews; observations; self-assessment; other documents (newsletters, meeting minutes, and other sources of information)

  10. Types of Data • Make an effort to include information about each of the following categories: • Program delivery (e.g., attendance, activities, etc.) • Key stakeholder satisfaction (e.g., students, parents, teachers) • Program outcomes (both short-term and long-term)

  11. Data AND Emotion • You have both. Use both. • Data sources: • School district data or the 21st CCLC Profile and Performance Information Collection System (PPICS) • Local evaluation data • Emotional stories: • Success stories • Everyday stories

  12. Evaluation • Evaluation is the process of analyzing data to assess what works and what does not work in achieving goals. • Data has no meaning on its own. Meaning is a result of human interaction with data.

  13. Continuous Program Improvement • How do we set up a process for our entire staff to become engaged in looking at data to improve our program?

  14. Example: The “Data Wise” Framework Boudett, K.P., City, E.A., and Murnane, R.J. (2005). Data Wise: A Step-by-Step Guide to Using Assessment Results to Improve Teaching and Learning. Cambridge, MA: Harvard Education Press.

  15. The Data Wise Improvement Process • Prepare • Organize for Collaborative Work • Build Assessment Literacy • Inquire • Create Data Overview • Dig into Student Data • Examine Instruction • Act • Develop Action Plan • Plan to Assess Progress • Act and Assess

  16. Telling Your Story: The Stats • Choose the BEST stats for your program • Place particular emphasis on 3-4 strong data points each year and intersperse these with narrative stories • The power of outcome information can be greatly strengthened if you can make comparisons with students who did not attend your program
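  To make the comparison point above concrete, here is a minimal Python sketch using hypothetical numbers (the grade-change figures and group labels are illustrative only, not drawn from any real program); a local evaluator would substitute actual records from the data sources listed earlier:

      # Illustrative sketch with hypothetical data: compare an outcome
      # measure for program attendees vs. students who did not attend.
      from statistics import mean

      # Hypothetical grade-point changes over one school year
      attendees = [0.4, 0.1, 0.3, 0.5, 0.2, 0.6]       # students in the program
      non_attendees = [0.1, 0.0, -0.2, 0.2, 0.1, 0.0]  # comparison group

      attendee_avg = mean(attendees)
      comparison_avg = mean(non_attendees)

      print(f"Average change, attendees:     {attendee_avg:+.2f}")
      print(f"Average change, non-attendees: {comparison_avg:+.2f}")
      print(f"Difference favoring program:   {attendee_avg - comparison_avg:+.2f}")

  Even a simple side-by-side average like this gives an outcome statistic far more weight than reporting the attendees' results alone.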

  17. Telling Your Story: The Heart • Tell stories from a few different perspectives, including students, parents, and staff • Vignettes can be told through an interview format, an outstanding quote, or a story • Photos paired with vignettes are especially powerful, particularly when the photos are taken by the kids themselves

  18. Telling Your Story: Visually
