
MOHealthWINS Vision for Missouri


Presentation Transcript


  1. MOHealthWINS: A Transformative Opportunity for Missouri Data Collection and Performance Measures
  MOHealthWINS Summit, March 13-14, 2012
  J. Cosgrove, Cosgrove & Associates

  2. MOHealthWINS Vision for Missouri
  Connecting target populations to employment opportunities in the State's growing health care industry.
  MOHealthWINS' four priorities:
  • Accelerate Progress for Low-Skilled and Other Workers
  • Improve Retention & Achievement Rates To Reduce Time To Completion
  • Build Programs That Meet Industry Needs, Including Developing Career Pathways
  • Strengthen Online & Technology-Enabled Learning

  3. Cosgrove & Associates Data Collection and Evaluation
  Vision: Ensure the systematic and timely collection of data required to meet DOL reporting and accountability requirements, and the Consortium's and individual colleges' research and evaluation information needs.
  • Cosgrove & Associates will partner with individual colleges to ensure the collection and analysis of all data required to meet U.S. Department of Labor accountability requirements.
  • Cosgrove & Associates will help the Consortium and individual colleges move from ideas and opinions to an understanding of program effectiveness that is precise, predictive, valid and reliable.

  4. Survey Feedback
  What You Told Us:
  • A list of EVERYTHING we need to track and report on.
  • We would like to see a uniform plan for collecting the data.
  • A plan for the uses of the data after it is collected.
  • An organized system for sending the collected data in.
  • Be clear about what data we need to collect and in what format so we can determine the extent to which we can use our ERP systems or must develop new data collection and reporting systems.
  • It must satisfy not only the reporting requirements of DOL around "trainees" but also track how well the new instructional strategies are working.
  • We are looking forward to having real-time data that we can analyze to support our continuous improvement throughout the grant. We also appreciate the flexibility that this will support as we make changes in and enhancements to our programming as the work progresses.
  • Provide us the technical assistance and recommendations that will help us continuously improve.
  • Support our endeavors and provide guidance.

  5. MOHealthWINS Data Collection & Performance Measures: What Will I Take Away From This Session?
  • Identification of DOL Primary Research and Outcome Questions
  • Description of Data Collection, Reporting, Evaluation Workplan and Timeline
  • Required Data Elements and Data Collection Process
  • Value of IDID
  • DOL Evaluation Framework: Participant and Comparison Cohorts
  • You Are NOT ALONE: Next Steps and Who To Call For Help
  • How To Fill Out Your Bracket (MIZ….ZOU) and Get Free Drinks At The Hotel Bar!!!

  6. Conceptual Framework for Implementation & Outcome Evaluation
  Primary Questions:
  • Are individual college strategies and statewide strategies being implemented as designed in a timely manner?
  • Are program participants who enrolled with low academic skills improving those skills in a timely manner? How does the progress of grant participants compare to the progress of similar students who are not involved in the grant strategies?
  • Are grant participants being retained at the appropriate targeted rate? How does the grant participant retention rate compare to that of similar students who are not involved in the grant strategies?
  • Are grant participants completing the desired degree/certificate award in a timely manner? How does the grant participant completion rate compare to that of similar students who are not involved in the grant strategies?
  • Are grant program completers securing employment in occupations targeted by the Consortium? How does the grant participant employment rate compare to that of similar students who are not involved in the grant strategies?
  • Are employers satisfied with the overall employment preparation of grant program completers?
  • Are individual colleges demonstrating the capacity to use performance tracking and program evaluation data to continuously improve grant-designated strategies and programs?

  7. Data Collection, Reporting and Program Evaluation
  Project Phases and Steps: research, reporting and evaluation accomplished in 3 phases.
  Phase 1 Timeframe: 2/20/12 to 9/15/12
  • Review DOL and Consortium reporting needs and research/evaluation questions
  • Examine data/information needs ***** We Are Here *****
  • Train campus users in regard to DOL required data elements and student cohorts
  • Develop statewide data system
  • Pilot data collection process and tracking system
  • Train campus users on data system
  • Begin data collection process
  Phase 2 Initial Timeframe: 8/1/12 to 12/1/12; however, Phase 2 will be ongoing for the life of the project
  • Data collection start-up and program implementation monitoring
  • Participant and cohort development and description
  • DOL reporting and Consortium research/evaluation information sharing
  Phase 3 Timeframe: 1/1/13 through 9/30/14
  • Continuous sharing of research/evaluation information with individual colleges and the Consortium. The goal of Phase 3 is to build the capacity of individual colleges and the Consortium to use research/evaluation for continuous improvement processes.

  8. TWO SIDES TO ALL DATA: THE IDID MODEL
  IDID Framework: Behind every number is a student and a life to be changed!
  • INQUIRE: What do we need to report and what do we want to know?
  • DISCOVER: Data collection
  • INTERPRET: Which programs/strategies worked and why did they work?
  • DEVELOP ACTIONS: Based on thoughtful interpretation of data, what actions need to occur to ensure continuous program improvement?

  9. The Value of Moving From Data Collection To Thoughtful Interpretation of Information
  Source: Mark Milliron, President, Catalytic Conversations

  10. Data Collection Model: Statewide Integration
  Multiple points of participant entry; data returned to campuses for program improvement.
  • Colleges must be able to identify grant participants by program (credit and non-credit)
  • The Statewide Consortium is the reporting unit for DOL Performance & Accountability Measures
  • Data will be used at the college level for continuous improvement
  The data flow cycle:
  1. Colleges identify participants and submit student unit record data to C&A
  2. C&A cleans data and builds the State data file
  3. C&A submits the State data file to MERIC for multiple system matching
  4. MERIC returns the State file to C&A
  5. C&A completes DOL quarterly and annual reporting requirements (DOL Participant & Performance Outcomes)
  6. C&A provides data analysis by program (Consortium and campus level)
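To make the hand-off concrete, here is a minimal sketch of step 2 of the cycle above: combining campus unit-record extracts into one State data file. The field names and CSV layout are illustrative assumptions; the authoritative layout will be the Phase 1 data element dictionary.

```python
import pandas as pd

# Hypothetical column names; the real data element dictionary
# (a Phase 1 deliverable) defines the authoritative layout.
REQUIRED = ["ssn", "campus_code", "program_code", "term_code", "credit_flag"]

def build_state_file(campus_files: list[str]) -> pd.DataFrame:
    """Combine campus unit-record extracts into one consortium file."""
    frames = []
    for path in campus_files:
        df = pd.read_csv(path, dtype=str)
        missing = set(REQUIRED) - set(df.columns)
        if missing:
            raise ValueError(f"{path} is missing required fields: {missing}")
        frames.append(df)
    state = pd.concat(frames, ignore_index=True)
    # Basic cleaning: strip stray whitespace, drop exact duplicate records
    state = state.apply(lambda col: col.str.strip()).drop_duplicates()
    return state
```

Validating required fields at intake keeps malformed campus records from propagating into the MERIC match and the DOL reports downstream.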

  11. Data Collection Process: USDOL Data Requirements
  I. Initial Point of Contact
  • Social Security Number
  • Name and Contact Information
  • U.S. Citizenship
  • Employment Status At Initial Enrollment
  • Wages At Initial Enrollment
  • Gender
  • Ethnicity/Race
  • Age
  • Disability Status
  • Veteran Status
  • TAA Eligible
  II. College Student System (Credit and Non-Credit)
  • Social Security Number
  • TAA Eligible and Grant Participant
  • Name and Contact Information
  • U.S. Citizenship
  • Employment Status At Initial Enrollment
  • Wages At Initial Enrollment
  • Gender
  • Ethnicity/Race
  • Age
  • Disability Status
  • Veteran Status
  • Pell Eligibility
  • Student Status (full- or part-time)
  • Highest Education Level Completed Upon Entry
  • Educational Goal (degree or certificate)
  • Basic Skill Deficiency: Reading Assessment Score
  • Developmental Reading Level (One Level Below College Level, Two Levels Below College Level, or Three or More Levels Below College Level)
  • Basic Skill Deficiency: English Assessment Score
  • Developmental English Level (One Level Below College Level, Two Levels Below College Level, or Three or More Levels Below College Level)
  • Basic Skill Deficiency: Mathematics Assessment Score
  • Developmental Mathematics Level (One Level Below College Level, Two Levels Below College Level, or Three or More Levels Below College Level)
  • Campus Code
  • Program Code
  • Term Code and Start Date
  • Credit or Non-Credit Code
  • Entering Student Status
  • Term Credit Hours Attempted
  • Term GPA
  • Term Credit Hours Completed
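One way to read the element list above is as a single typed record per participant. The sketch below paraphrases a subset of the elements as a Python dataclass; the field names and types are illustrative assumptions, not the official data dictionary.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ParticipantRecord:
    """Illustrative subset of the USDOL intake and student-system elements."""
    ssn: str                      # unique match key for MERIC system matching
    name: str
    us_citizen: bool
    employed_at_entry: bool
    wages_at_entry: Optional[float]
    gender: str
    ethnicity_race: str
    age: int
    disability_status: bool
    veteran_status: bool
    taa_eligible: bool
    # Student-system elements (credit and non-credit)
    pell_eligible: Optional[bool] = None
    full_time: Optional[bool] = None
    dev_reading_level: Optional[int] = None   # 1, 2, or 3+ levels below college
    dev_english_level: Optional[int] = None
    dev_math_level: Optional[int] = None
    campus_code: Optional[str] = None
    program_code: Optional[str] = None
    term_code: Optional[str] = None
    credit: Optional[bool] = None             # credit vs. non-credit code
    term_hours_attempted: Optional[float] = None
    term_hours_completed: Optional[float] = None
    term_gpa: Optional[float] = None
```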

  12. Data Collection Process: USDOL Data Requirements (Continued)
  III. Tracking and Performance Outcome Data
  • Each college will track implementation of programs and related strategies
  • Developmental Skill Improvement (course grades, re-test on assessment instrument, etc.)
  • Term-to-term retention
  • Program completion (degree or certificate)
  • Non-Credit to Credit program movement
  • Transfer or Continuing Education upon program completion
  • Employment Status upon program completion (1st, 2nd, and 3rd quarters)
  • Wage data upon program completion (1st, 2nd, and 3rd quarters)
  IV. Action
  Implementation progress and performance measures will be shared with the Consortium and colleges on a quarterly and annual basis. Based on thoughtful interpretation of data, colleges will undertake steps to ensure continuous program improvement.
  Use Data & Thoughtful Interpretation For Continuous Improvement
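As a sketch of how the Section III outcomes reduce to reportable rates, the function below computes headline tracking rates from a participant-level table. The boolean column names are assumed for illustration, not the DOL element names.

```python
import pandas as pd

def outcome_rates(df: pd.DataFrame) -> dict:
    """Compute headline tracking rates from a participant-level file.

    Assumes boolean columns 'retained_next_term', 'completed_award',
    and 'employed_q1' (illustrative names).
    """
    return {
        "n_participants": len(df),
        "retention_rate": df["retained_next_term"].mean(),
        "completion_rate": df["completed_award"].mean(),
        "employment_rate_q1": df["employed_q1"].mean(),
    }
```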

  13. USDOL Evaluation Framework: Participant & Comparison Cohorts
  • Grant Participant Universe: the Participant Cohort is a subset of the total grant participant universe
  • Non-TAACCCT-Funded Universe: the Comparison Cohort is a subset of the non-grant participant universe
  • Matched: Participant and Comparison Cohorts are matched on key demographic variables (age and gender at a minimum)
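A minimal sketch of the matching rule stated above (age and gender at a minimum): pair each participant with one unused non-grant student of the same gender and a close age. This is illustrative nearest-age matching under assumed column names, not the evaluator's prescribed procedure.

```python
import pandas as pd

def match_comparison(participants: pd.DataFrame,
                     pool: pd.DataFrame,
                     max_age_gap: int = 2) -> pd.DataFrame:
    """Match each participant to one unused non-grant student,
    exact on gender and within max_age_gap years on age."""
    available = pool.copy()
    matched_ids = []
    for _, p in participants.iterrows():
        candidates = available[
            (available["gender"] == p["gender"])
            & ((available["age"] - p["age"]).abs() <= max_age_gap)
        ]
        if candidates.empty:
            continue  # unmatched participant; flag for review in practice
        best = (candidates["age"] - p["age"]).abs().idxmin()
        matched_ids.append(best)
        available = available.drop(index=best)  # draw without replacement
    return pool.loc[matched_ids]
```

Drawing without replacement keeps the two cohorts the same size (see slide 16), provided the non-grant pool is large enough; unmatched participants would need manual review.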

  14. Why Is DOL Requiring Participant and Comparison Cohorts?
  • Meets the goal of "Continuous Improvement"
  • DOL hopes to learn from the cohort information:
    • Did the TAACCCT program designs/updates/changes that were proposed and implemented have a positive effect on students who went through the new/updated/revised program, as compared with students who did not?
    • Education retention and completion
    • Job placement, retention and earnings

  15. Participant Cohort To Comparison Cohort Analysis Will Focus On Key Tracking and Outcome Variables
  • Program Completion
  • Retained in Program
  • Retained in Other Program
  • Credit Hours Completed
  • Developmental Skill Improvement
  • Earned Credentials
  • Further Education After Graduation
  • Employment After Graduation
  • Employment Retention
  • Earnings
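To illustrate the participant-to-comparison analysis, the sketch below tabulates outcome rates side by side for the two matched cohorts. The outcome column names are assumptions standing in for the variables listed above.

```python
import pandas as pd

# Illustrative stand-ins for the tracking and outcome variables above
OUTCOMES = ["completed_program", "retained", "credential_earned",
            "employed_after_grad", "employment_retained"]

def cohort_comparison(participant: pd.DataFrame,
                      comparison: pd.DataFrame) -> pd.DataFrame:
    """Side-by-side outcome rates for the matched cohorts."""
    rows = []
    for var in OUTCOMES:
        p_rate = participant[var].mean()
        c_rate = comparison[var].mean()
        rows.append({"outcome": var,
                     "participant_rate": p_rate,
                     "comparison_rate": c_rate,
                     "difference": p_rate - c_rate})
    return pd.DataFrame(rows)
```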

  16. Defining and Building Participant & Comparison Cohorts
  • Remember, the DOL reporting unit is the Consortium.
  • Each program of study should have its own Participant Cohort and be matched to an appropriate Comparison Cohort (non-grant participants).
  • Programs of study may be combined (resulting in only one Participant Cohort) if occupational outlook and/or educational requirements are similar.
  • Combining programs may be a way to meet the matching requirements for a Comparison Cohort or avoid some of the problems of an invalid Comparison Cohort.
  • The number of students in the Comparison Cohort must be the same as the number of students in the Participant Cohort.
  • College student systems will be used to capture demographic and educational characteristics for the Participant and Comparison Cohorts.

  17. YIKES…THERE IS MORE TO THIS THAN I IMAGINED!!!!
  • DON'T FREAK OUT!
  • REMEMBER: A SIGNIFICANT PORTION OF THIS GRANT FOCUSES ON THE USE OF DATA AND INFORMATION TO CONTINUOUSLY IMPROVE OUR PROGRAMS AND STRATEGIES!
  • WE ARE HERE TO HELP YOU AND WILL BE WORKING WITH EACH COLLEGE TO BUILD DATA SETS AND, WHEN APPROPRIATE, PARTICIPANT AND COMPARISON COHORTS.

  18. Next Steps: Campus Visits, Statewide Meetings, Database Construction, and Initial Data Entry
  Phase 1 Timeframe: 2/20/12 to 9/15/12
  • Review DOL and Consortium reporting needs and research/evaluation questions.
  • Meet with individual campus MOHealthWINS teams.
  • Develop Statewide Data Collection Task Force; the Task Force will examine data/information needs and the campus and agency steps needed to provide required data.
    RESPONSIBILITY: Cosgrove & Associates, Campus Teams, and Task Force Representatives
  • Train campus users in regard to DOL required data elements, and participant and comparison cohorts.
    RESPONSIBILITY: Cosgrove & Associates
  • Develop statewide data system and data element dictionary.
    RESPONSIBILITY: Cosgrove & Associates and State Agency Data Partners
  • Pilot data collection process and tracking system with three campuses.
    RESPONSIBILITY: Cosgrove & Associates, State Agency Data Partners and Pilot Institutions
  • Train campus users on data system and outline data reporting deadlines.
    RESPONSIBILITY: Cosgrove & Associates
  • Begin data collection process.
    RESPONSIBILITY: Individual Campuses

  19. Einstein and Berra: Final Thoughts on Data Collection, Evaluation and Program Improvement
  • Data don't drive.
  • The important thing is not to stop questioning.
  • Data collection and research are 90% mental, and the other half is physical.
  • I wish I had an answer for that, because I am tired of answering that question.
  • Finally, you don't make the pig fatter by weighing it every day.

  20. Questions and Contact Information
  Ensure the systematic and timely collection of data required to meet DOL reporting and accountability requirements, and the Consortium's and individual colleges' research and evaluation information needs.
  Contact Information:
  John Cosgrove, jcosgrove27@gmail.com, 314.913.6159
  Maggie Cosgrove, magcosgrove@gmail.com, 314.610.2799
