
Principles and Strategies of KEYS 2.0 Data Analysis and Interpretation, GAE Training, January 2009


Presentation Transcript


  1. Principles and Strategies of KEYS 2.0 Data Analysis and Interpretation, GAE Training, January 2009. Jacques Nacson, Senior Policy Analyst, NEA New Products and Programs, jnacson@nea.org

  2. Objectives • To examine how KEYS fits within the context of continuous school improvement • To review key statistical terms • To review basic research concepts • To consider basic principles of data analysis • To examine and learn to interpret the KEYS 2.0 school and district reports

  3. Overview of Action Research Model

  4. Statistical Terms • Mean, Median, Mode • 90th Percentile • Standard deviation • Factor analysis • Regression analysis • Correlation analysis • Statistical significance (.05 level, .01 level)
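The terms on this slide can be made concrete with a few lines of code. Below is a minimal sketch in Python using NumPy and SciPy; the ratings and achievement scores are invented purely for illustration and are not KEYS data.

```python
import numpy as np
from scipy import stats

# Hypothetical 1-5 survey responses for one indicator (invented values)
ratings = np.array([3, 4, 4, 5, 2, 4, 3, 5, 4, 3])

print("Mean:              ", ratings.mean())
print("Median:            ", np.median(ratings))
print("Mode:              ", stats.mode(ratings, keepdims=False).mode)
print("90th percentile:   ", np.percentile(ratings, 90))
print("Standard deviation:", ratings.std(ddof=1))

# Correlation and statistical significance (.05 level, .01 level)
achievement = np.array([72, 80, 78, 91, 65, 83, 74, 95, 85, 70])  # invented
r, p = stats.pearsonr(ratings, achievement)
print(f"r = {r:.2f}, p = {p:.3f}, significant at .05 level: {p < 0.05}")
```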

  5. Data Collection: “How much information do we need?” • At the MOST: Is the information collected compelling enough to convince any skeptic? • At the LEAST: Will the information collected at least create cognitive dissonance for the resisters?

  6. Triangulation of Data • Compensates for imperfections of data-gathering instruments. • When multiple measures yield the same results, confidence in the results increases. • When multiple measures fail to yield the same results, important follow-up questions arise.
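One way to operationalize triangulation is sketched below: check whether two instruments measuring the same construct tell the same story. The two measures, their scores, and the convergence threshold are all hypothetical, chosen only for illustration.

```python
import numpy as np
from scipy import stats

# Two hypothetical measures of the same construct (invented values):
# a survey indicator and classroom-observation scores for six schools
survey = np.array([3.2, 4.1, 2.8, 4.5, 3.9, 2.5])
observation = np.array([3.0, 4.3, 2.6, 4.4, 4.0, 2.9])

r, p = stats.pearsonr(survey, observation)
if r > 0.7:  # arbitrary convergence threshold for illustration
    print(f"Measures converge (r = {r:.2f}): confidence in the results increases.")
else:
    print(f"Measures diverge (r = {r:.2f}): important follow-up questions arise.")
```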

  7. Data Collection: “What must we keep in mind?” 1. Do the instruments and methods we plan to use measure what we claim they do? (Validity) 2. Do the instruments and methods we plan to use measure the phenomena consistently across respondents, items, and occasions? (Reliability)
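Validity usually requires judgment and external evidence, but reliability can be estimated numerically. Below is a minimal sketch of one common internal-consistency check, Cronbach's alpha; this is not a method prescribed by KEYS, and the response matrix is invented.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of ratings for one indicator."""
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    k = items.shape[1]
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Five hypothetical respondents answering four items on the same indicator
responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 4, 5],
    [2, 3, 2, 2],
    [4, 4, 5, 4],
])
print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")  # > 0.7 is a common rule of thumb
```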

  8. Data Gathering Techniques • Interviews • Checklists • Diaries • Logs • Questionnaires • Audio or video tapes • Photographs • Consultative advice

  9. General Principles of Data Interpretation Before Looking at Your KEYS 2.0 Results

  10. General Principles to Consider Specifically Related to Your KEYS 2.0 Data

  11. Understanding the Graph for Each KEY • Labels: e.g., “KEY 1. Shared Understanding and Commitment to High Goals – Respondents Provide Direct Instruction to Students” • Vertical axis: the indicators (groups of questions that measure the same concept) • Horizontal axis: the measure of quality on a 5-point scale – left side Disagree (low value), right side Agree (high value)

  12. Understanding the Graphs for Each KEY • Labels: School Average, All Schools Average, Standard Deviation, 90th Percentile Score • Data points: school average (black), all-schools average (red), 90th-percentile score (yellow) • Length of the horizontal bar (blue/purple): one standard deviation above and one below the school average (a measure of agreement or consensus) • The goals for your school, in terms of continuous improvement for each indicator: move the school average continuously toward the right side (Agree, the high-quality value for that indicator) while reducing the standard deviation (narrowing the horizontal bar, meaning greater agreement among respondents)
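To make the layout described above concrete, here is a minimal sketch that draws one such graph with matplotlib. The indicator names and every value are hypothetical; only the color scheme follows the legend described on the slide.

```python
import matplotlib.pyplot as plt

# Hypothetical indicators and scores for one KEY (invented values)
indicators = ["Indicator 1.1", "Indicator 1.2", "Indicator 1.3"]
school_avg = [3.8, 3.2, 4.1]       # black point
all_schools_avg = [3.5, 3.6, 3.9]  # red point
pct90 = [4.4, 4.3, 4.6]            # yellow point
std_dev = [0.6, 1.1, 0.4]          # half-length of the blue/purple bar

fig, ax = plt.subplots()
for i in range(len(indicators)):
    # Bar spans one standard deviation below and above the school average
    ax.plot([school_avg[i] - std_dev[i], school_avg[i] + std_dev[i]], [i, i],
            color="purple", linewidth=6, alpha=0.4)
    ax.plot(school_avg[i], i, "ko", label="School average" if i == 0 else None)
    ax.plot(all_schools_avg[i], i, "ro", label="All schools average" if i == 0 else None)
    ax.plot(pct90[i], i, "y*", markersize=12, label="90th percentile" if i == 0 else None)

ax.set_yticks(range(len(indicators)))
ax.set_yticklabels(indicators)
ax.set_xlim(1, 5)  # 5-point scale: Disagree (left, low) to Agree (right, high)
ax.set_xlabel("1 = Disagree     5 = Agree")
ax.legend()
plt.show()
```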

  13. KEYS Vocabulary • Hierarchical organization: KEYS → Indicators → Items • Rating: the degree to which the respondents believe that the indicator accurately describes the school; the average of all respondents on the questions that make up each indicator: 1 = strong belief that the indicator does not describe the school; 2 = some belief that the indicator does not describe the school; 3 = neutral; 4 = some belief that the indicator describes the school; 5 = strong belief that the indicator describes the school

  14. KEYS Vocabulary (Cont'd) • Consensus: level of agreement among respondents on the rating for each indicator • Low consensus = wide band • High consensus = narrow band
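Given the hierarchy and the 1-5 scale defined above, a rating and its consensus band could be computed from item responses roughly as sketched below. The indicator names, responses, and band threshold are all invented; this is not the actual KEYS scoring code.

```python
import numpy as np

# Rows = respondents, columns = the items that make up each indicator
# (invented names and values; not actual KEYS data)
responses = {
    "1.1 Clear learning goals": np.array([[4, 5, 4], [3, 4, 3], [5, 5, 4]]),
    "1.2 Shared commitment":    np.array([[2, 3, 2], [4, 3, 3], [3, 2, 3]]),
}

for indicator, items in responses.items():
    rating = items.mean()                    # 1-5 scale defined above
    spread = items.mean(axis=1).std(ddof=1)  # variation across respondents
    band = "narrow (high consensus)" if spread < 0.5 else "wide (low consensus)"
    print(f"{indicator}: rating = {rating:.2f}, band is {band}")
```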

  15. Demo School Report with Links to Resources

  16. Demo School Report with Links to Resources

  17. Demo School Report with Links to Resources (Drilling Down)

  18. Demo School Report with Links to Resources (Drilling Down)

  19. Demo School Report with Links to Resources (Drilling Down further)

  20. Example of a District Report (Aggregate Scores for Key 6)

  21. Example of a District Report: Distribution of School Scores for Indicator 6.5 (Interventions)

  22. Example of a District Report: Aggregate Scores for Key 5 (Resources)

  23. Example of a District Report: Distribution of School Scores for Indicator 5.4 (Safe & Healthy Learning Environment)

  24. PROCESSING THE DATA • Looking at your group's assigned KEY graph, answer these questions: • On which indicators is there the greatest agreement (the shortest bar)? • On which indicators is there the least agreement (the longest bar)? • Why might that variability of perspective exist?

  25. PROCESSING THE DATA – Cont'd • How might you come to greater consensus on this? • Which of the KEYS indicators have both a high mean score and a short bar? (Strengths) • Which of the indicators have both a low mean score and a wide degree of variability? (Areas for Improvement) • Which items have the highest correlation with student achievement?
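The screening questions on these two slides can be run mechanically once per-indicator means and standard deviations are in hand. The sketch below uses hypothetical statistics and arbitrary cutoff values, chosen only to illustrate the high-mean/short-bar versus low-mean/wide-bar logic.

```python
# Hypothetical per-indicator statistics: (mean rating, standard deviation)
indicators = {
    "1.1": (4.2, 0.5),
    "1.2": (2.7, 1.2),
    "1.3": (3.9, 0.6),
    "1.4": (2.9, 1.0),
}

# Arbitrary cutoffs for illustration only
strengths   = [k for k, (m, s) in indicators.items() if m >= 4.0 and s <= 0.7]
improvement = [k for k, (m, s) in indicators.items() if m <= 3.0 and s >= 1.0]

print("Strengths (high mean, short bar):", strengths)
print("Areas for improvement (low mean, wide bar):", improvement)
```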

  26. Analyzing and Interpreting KEYS 2.0 Data – The Logic Sequence 1. Are the results surprising, or do they confirm what you know or believe? 2. If the results are surprising, what might have caused the difference? 3. How might you go about checking further to determine the validity of the findings? 4. If you determine that the findings are valid and the issue is important and relevant to your particular situation, what might be the reasons for such findings? 5. What is the most likely cause? The underlying or root cause? How did you arrive at this conclusion?

  27. Analyzing and Interpreting KEYS 2.0 Data – The Logic Sequence – Cont'd 6. What are some possible actions/solutions that you and your colleagues might take to alter or reverse and improve the condition? 7. What is the best possible solution given your particular context or situation? 8. What are the steps that you must plan and take in order to implement the best possible solution? 9. What are the resources and the skills needed to implement the solution successfully? 10. How would you know if your actions/solutions were successful in ameliorating the condition?

  28. Focus Group Questions • Do the KEYS data support or refute other data previously collected? • What professional development opportunities are indicated by the data? • What barriers to achieving previously stated goals surfaced during your discussion? • What other learning does your focus group wish to share with the larger group?

  29. Steps a School Might Take Once KEYS Preliminary Analyses Are Completed • GAPS: Decide on one indicator or a group of indicators where gap(s) exist. • RELEVANCE: Reflect with the “team” on the relevance, importance, and priority of the selection. (Get feedback from stakeholders.) • DATA COLLECTION–VALIDATION: Consider the need to collect additional data to validate the KEYS findings. • DATA COLLECTION–DIAGNOSIS AND REFINEMENT: Examine the “root” cause of the problem.

  30. Steps a School Might Take Once KEYS Preliminary Analyses Are Completed • THEORY OF ACTION: Identify and select the most appropriate solution for your context. (Get feedback from stakeholders.) • ACTION PLANNING: Set SMART goals and develop specific action/project plans. (Get feedback and commitments from stakeholders.) • IMPLEMENTATION: Action plans must be implemented for improvement to occur. • DATA COLLECTION–EVALUATION: Both process and product evaluations are necessary for learning to happen. • BACK TO STEP 1: Repeat the cycle.
