
Michigan College Access Network Developing an Evaluation Plan

Michigan College Access Network Developing an Evaluation Plan. Nathan Daun-Barnett, PhD April 13, 2011. Outline for Evaluation Presentation. Introduction of Programs Understanding the Barriers to College Access Developing a Theory of Change Utilizing a Logic Model Building a Data Plan.


Presentation Transcript


  1. Michigan College Access Network: Developing an Evaluation Plan. Nathan Daun-Barnett, PhD. April 13, 2011

  2. Outline for Evaluation Presentation • Introduction of Programs • Understanding the Barriers to College Access • Developing a Theory of Change • Utilizing a Logic Model • Building a Data Plan

  3. What We Know About College Access

  4. Levels of College Access Intervention • State Policy: MiCAP, Know How 2 Go, Social Marketing • Community: LCANs, Promise Zones, College Goal Sunday, College Access Centers • Institution: School Practices, MCAC, College Outreach • Individual/Family: Preparation, Navigating the Process, Affordability, Finding Support

  5. Academic Preparation • What do we know? • Better-prepared students go to college at higher rates and are more likely to succeed • High demand for remediation limits chances for success • Approaches to the challenge • Rigorous High School Curriculum • Expanded Dual Enrollment Opportunities • ACT/MME • After School Tutoring & Mentoring

  6. College Affordability • What do we know? • Low-income students are less likely to attend • Low-SES students are also the most price-sensitive in their college choices • What do we do? • Need-based Aid (Pell) • Merit-based Aid (former MI Promise) • Tuition Policies & Promise Zones • GI Bill, Veterans' Benefits & TIP

  7. Student Aspirations & Support • What do we Know? • Parents, peers and counselors influence aspirations and plans for college • Aspirations are high for all, but plans differ by race and SES • What do we Do? • MI Know How 2 Go • GEAR UP programs • College Positive Volunteers • NCAC College Advisors

  8. Navigating the Process • What do we Know? • The process has grown increasingly complex • Efforts to facilitate the admissions process increase the likelihood of applying and receiving aid • What can we Do? • MiCAP Web Portal • College Goal Sunday/Tax Preparation • College Visits • College Success Centers

  9. Theory of Change

  10. Develop a Theory of Change • Identify the range of activities for your program • Link each activity to the barriers you believe it addresses, including academic preparation, college affordability, social capital (aspirations and plans), and navigating the process

  11. Your Planned Work → Your Intended Results: Resources/Inputs → Activities → Outputs → Outcomes → Impact • Certain resources are needed to operate your program • If you have access to those resources, then you can use them to accomplish your planned activities • If you accomplish your planned activities, then you will hopefully deliver the amount of product and/or service that you intended • If you accomplish your planned activities to the extent you intended, then your participants will benefit in certain ways • If these benefits to your participants are achieved, then certain changes in organizations, communities, or systems might be expected to occur
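The if-then chain above can be sketched as an ordered structure. This is a minimal illustration, not part of the presentation; the stage names come from the slide, while the example entries in parentheses are hypothetical.

```python
# A minimal sketch of the logic model's if-then chain as an ordered list.
# Stage names are from the slide; example entries are invented.
logic_model = [
    ("Resources/Inputs", "volunteer recruits, training materials"),
    ("Activities", "campus visits, FAFSA workshops"),
    ("Outputs", "number of students served"),
    ("Outcomes", "more students apply and receive aid"),
    ("Impact", "higher college enrollment in the community"),
]

# Each stage is the premise of the next stage's "if ... then" statement.
chain = " -> ".join(stage for stage, _ in logic_model)
print(chain)
```

The ordering matters: planned work (inputs and activities) precedes intended results (outputs, outcomes, impact), which is why the slide distinguishes the two halves.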

  12. Logic Model - College Positive Volunteer

  13. Activities

  14. Outputs

  15. Reporting Outputs • These are important formative measures, and they are part of what we need to report to the federal government under the CACG (College Access Challenge Grant) • What are the anticipated outputs of your activities? • What are your expected goals for each of those metrics? (this serves as a point of comparison or baseline for the same measures one year later)
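The goal-versus-actual comparison the slide describes can be sketched in a few lines. The metric names and numbers below are hypothetical, chosen only to show the baseline comparison one year later.

```python
# Hypothetical sketch: comparing reported outputs against goals set a
# year earlier. Metric names and values are invented for illustration.
goals = {"students_advised": 200, "fafsa_workshops": 12, "college_visits": 8}
actuals = {"students_advised": 184, "fafsa_workshops": 14, "college_visits": 8}

def output_report(goals, actuals):
    """Return {metric: (goal, actual, percent_of_goal)} for each goal metric."""
    return {
        m: (g, actuals.get(m, 0), round(100 * actuals.get(m, 0) / g, 1))
        for m, g in goals.items()
    }

report = output_report(goals, actuals)
for metric, (goal, actual, pct) in report.items():
    print(f"{metric}: {actual}/{goal} ({pct}% of goal)")
```

Recording the goal alongside the actual is what turns an output count into a formative measure: the same table can be re-run each year against the original baseline.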

  16. Formative Outcomes

  17. Summative Outcomes • The question we need to ask is “compared to what?” • Approach 1 - Compared to self at an earlier point. • Approach 2 - Compared to other similar students at same point.

  18. Impact In order to measure impact, we must ask: "Did our program improve the likelihood that high school participants went to college?"

  19. How do we Measure Impact? Approach 1 - randomly assign students to the program Approach 2 - compare program students to a similar group of students (matched sample) Approach 3 - statistically control for the factors accounting for the differences between groups Approach 4 - statistically control for selection bias
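Approach 2 (a matched-sample comparison) can be sketched with standard-library Python. This is an illustration only: the records, the matching key (GPA band plus income band), and the field names are all invented, and a real analysis would use far more careful matching.

```python
# Illustrative sketch of Approach 2: compare participants to
# non-participants matched on observable traits. Data and the matching
# key (GPA band + income band) are hypothetical.
from collections import defaultdict

# (participated, gpa_band, income_band, enrolled_in_college)
students = [
    (True,  "high", "low", True),
    (True,  "mid",  "low", True),
    (True,  "mid",  "mid", False),
    (False, "high", "low", False),
    (False, "mid",  "low", False),
    (False, "mid",  "mid", False),
]

def matched_enrollment_gap(students):
    """Average enrollment-rate difference between participants and matched peers."""
    cells = defaultdict(lambda: {"p": [], "c": []})
    for participated, gpa, income, enrolled in students:
        cells[(gpa, income)]["p" if participated else "c"].append(enrolled)
    gaps = []
    for groups in cells.values():
        if groups["p"] and groups["c"]:  # only cells with a match on both sides
            gaps.append(
                sum(groups["p"]) / len(groups["p"])
                - sum(groups["c"]) / len(groups["c"])
            )
    return sum(gaps) / len(gaps) if gaps else None
```

The point of the matching step is the slide's "compared to what?" question: within each cell, participants and non-participants look alike on the matched traits, so the remaining enrollment gap is a (rough) estimate of the program's contribution.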

  20. For more information, contact: Nathan Daun-Barnett nbarnett@buffalo.edu 716-645-1096

  21. Developing a Data Plan Identifying Sources of Data • National Student Clearinghouse (enrollment, persistence, transfer, and completion) • School grade reports (transcripts) to capture grades, course completions, academic progress, attendance and behavior • ACT data, including test scores for the ACT, WorkKeys, and MME, as well as the student survey • MiCAP usage data (useful if part of intervention) • Student surveys to measure learning and behavior outcomes • Administrative records of program usage
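A data plan that draws on several sources ultimately needs those sources joined on a shared student identifier. The sketch below shows that join step in plain Python; the source names echo the slide, but every field, ID, and value is hypothetical.

```python
# Hypothetical sketch of a data-plan step: joining per-student records
# from several sources on a shared student ID. Fields and values are invented.
clearinghouse = {"s01": {"enrolled": True}, "s02": {"enrolled": False}}
transcripts   = {"s01": {"gpa": 3.4}, "s02": {"gpa": 2.1}, "s03": {"gpa": 3.0}}
program_usage = {"s01": {"sessions": 5}, "s03": {"sessions": 2}}

def merge_records(*sources):
    """Combine per-student dicts from each source into one record per ID."""
    merged = {}
    for source in sources:
        for sid, fields in source.items():
            merged.setdefault(sid, {}).update(fields)
    return merged

records = merge_records(clearinghouse, transcripts, program_usage)
```

Note that the merge keeps students who appear in only some sources (here, "s03" has no Clearinghouse record); deciding how to handle such partial records is itself part of the data plan.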
