Best Practices in Measuring Success & Demonstrating Outcomes for Microenterprise Development Programs

Presentation Transcript


  1. Best Practices in Measuring Success & Demonstrating Outcomes for Microenterprise Development Programs Facilitated by: Marian Doub, Associate, Friedman Associates Jason Friedman, Principal, Friedman Associates Guest Practitioners: • Nancy Swift, Executive Director, Jefferson Economic Development Institute (JEDI), Mt. Shasta, CA • Alex Forrester, Chief Operations Officer, Rising Tide Capital, Jersey City, NJ Hosted by Little Dixie Community Action Agency and funded in part by the U.S. Small Business Administration’s Program for Investment in Microentrepreneurs (PRIME).

  2. Introduction to Friedman Associates & Webinar Instructions • As a community-based organization that helps low-wealth individuals and communities build wealth, create jobs and start small businesses, your work is essential to the nation’s economic recovery. • The mission of Friedman Associates is to help you achieve your vision for a sustainable and economically vibrant community – and demonstrate the results that lead to increased funding and long-term success. • Areas of specialization include product development and staff training in microfinance and small business lending; business development services; systems for client tracking and program performance; and strategic planning, board development and fund development strategies.

  3. Presenter: Marian Doub • One of the nation's top specialists in integrated systems for monitoring and evaluating microenterprise development programs. Certified as a MicroTest trainer by the Aspen Institute. • Research and Evaluation Manager for Women’s Initiative for Self Employment (S.F. & Oakland, CA) from 1998-2004. • Developed practical systems for promoting best practices and innovation with dozens of microenterprise and community economic development providers and intermediaries (Aspen Institute, MicroTest, LISC, NeighborWorks America). • MA in Urban and Environmental Policy from Tufts University’s Department of Urban and Environmental Policy & Planning in Medford, MA.

  4. Guest Presenter: Nancy Swift, ED & Program Director, JEDI • A 19-year veteran of the field who has served as a practitioner, advocate and visionary. • The Jefferson Economic Development Institute is an award-winning microenterprise and asset development corporation located under the Mt. Shasta volcano in the northernmost frontier region of California. • Serves approximately 350 people annually who are primarily low income and most likely to be women, Native American, African American or have a disability. • Since 1996, has assisted over 4,500 people to create 1,365 jobs or new businesses. • Nearly 50% of businesses increased their revenues by 50% after working with JEDI for an average of 1.5 years.

  5. Guest Presenter: Alex Forrester, COO, Rising Tide Capital (RTC) • Co-founded RTC, based in Jersey City, NJ, in 2004 with fellow Harvard classmate Alfa Demmellash. • Serves as Chief Operations Officer, with primary oversight of financial management, institutional fundraising, grants management, outcome measurement, and technology. • Core programs: the Community Business Academy (40-hour training course) and Business Acceleration Services (year-round coaching and seminars). • 280 graduates of the CBA since Dec 2006: 100 currently in business; 128 in the planning stages. • CEO Alfa Demmellash was selected in 2009 as a CNN Hero and recognized by President Obama in a White House speech in June 2009.

  6. Our Agenda for Today • Measuring Success: why now more than ever? • Practical Paths for Measuring Success: • Most Basic/Pre-Standard • Basics + Industry Standards • Integrated Systems • Resources, Process & Products • Strengths and Weaknesses • Examples from the field • National standards, benchmarks • Q&A and Action Planning

  7. What’s Going On in the Big Picture for MDOs? – External Factors • The pressure is on to ‘make the case’ for programs that support micro- and small businesses in competitive funding and policy environments. • Demand for accountability and evidence of track records and outcome results (what happens during and after services) is also at an all-time high. • Strategic demand for greater MDO scale and results. • Increasingly, MDOs must prove and improve program results by tracking outcomes. • Early adopters of robust program performance & outcomes systems are now accepted leaders in the field.

  8. What’s Going On in the Big Picture? – Internal Factors • Weak or minimal data and knowledge management systems severely limit MDO attempts to assess and improve performance, attract funding, and demonstrate their relevance to the community. • Where do I start? What can I expect? How do I plan for and implement these systems? • Limited resources for building, sustaining and optimizing data & knowledge management systems—very few capacity-building resources exist. • Where do I turn for resources?

  9. What is Measuring Success? • Regular and systematic use of information about program performance and outcomes to prove and improve results. • The regular, systematic tracking of the extent to which program participants experience the benefits or changes intended by the Mission (United Way, 2002). • The Measuring Success process fosters continuous learning and evidence-based change to prove and improve program results during data definition, collection, entry, storage, and use.

  10. Measuring Success answers 3 kinds of questions:* • How is our program performing? • How are our clients doing? • Are we achieving our Mission? *Thank you to the Aspen Institute’s MicroTest Program for material on this page.

  11. Measuring Success Uses 3 Types of Monitoring and Evaluation Data: • Program Performance questions can be answered with data that an ME program needs to collect and maintain in order to function: • In a client contact database, in loan portfolio management systems, in accounting systems, etc. • Client Outcomes questions can only be answered by going outside the program and surveying clients who have received substantial services and have had time to put what they learned/received into use. • Program Impact measures how strongly the outcomes are related to the program experience using a control group, statistical tests, large sample sizes, or data gathered at set points in time over a long period. *Thank you to the Aspen Institute’s MicroTest Program for material on this page.

  12. Standard Client Outcomes Indicators • The most commonly required indicators of success are outcomes—they occur a year or more after a loan or training—and prove our role in economic development and recovery: • New businesses (start-ups) (HUD/CDBG, SBA, MT) • Jobs created and retained (HUD/CDBG, SBA, HHS, MT) • Annual business revenue increases (HUD/CDBG, SBA, HHS, MT) • Existing businesses stay in business (survive) & thrive • Business profitability • Owners improve their personal and household financial stability and security • Serving distressed, underserved communities: women, people of color, the un- and under-employed, etc.
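
The arithmetic behind these indicators is straightforward once baseline and follow-up data are matched per client. A minimal sketch in Python (all field names and figures are hypothetical, not drawn from any particular MDO system):

    # Minimal sketch: deriving standard outcome indicators from matched
    # intake (baseline) and follow-up survey records.
    clients = [
        {"in_biz_at_intake": False, "in_biz_now": True,
         "jobs_at_intake": 0, "jobs_now": 2,
         "revenue_at_intake": 0, "revenue_now": 48000},
        {"in_biz_at_intake": True, "in_biz_now": True,
         "jobs_at_intake": 1, "jobs_now": 1,
         "revenue_at_intake": 30000, "revenue_now": 42000},
        {"in_biz_at_intake": True, "in_biz_now": False,
         "jobs_at_intake": 2, "jobs_now": 0,
         "revenue_at_intake": 55000, "revenue_now": 0},
    ]

    start_ups = sum(1 for c in clients
                    if not c["in_biz_at_intake"] and c["in_biz_now"])
    jobs_created = sum(max(c["jobs_now"] - c["jobs_at_intake"], 0)
                       for c in clients)
    existing = [c for c in clients if c["in_biz_at_intake"]]
    survival = sum(c["in_biz_now"] for c in existing) / len(existing)
    revenue_up = sum(1 for c in clients
                     if c["revenue_now"] > c["revenue_at_intake"])

    print(f"New businesses (start-ups): {start_ups}")          # 1
    print(f"Jobs created: {jobs_created}")                     # 2
    print(f"Existing-business survival rate: {survival:.0%}")  # 50%
    print(f"Clients with revenue growth: {revenue_up}")        # 2

The same handful of sums, run against real matched records, produces the figures that funders such as HUD/CDBG and SBA ask programs to report.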

  13. Why Measure Success? • Strategic visioning and decision making • Program planning (innovation, goal setting) • Fundraising (proposals and reports) • Program management (decisions, work flow, customer service) • Monitoring and evaluation (are we meeting needs and achieving our Mission?) • Communicating results to clients, Board, community, policy makers, and other supporters

  14. What is Your Path to Measuring Success? • Why & How Much Do You Measure Success? • How Do You Best Know and Use Your Results? • Gather and organize information that can help you improve program management and decision-making • Make the case well • Improve work-flow • What do you need to know vs. want to know? • What is your return on investment/value proposition for Measuring Success?

  15. 3 Common Paths for MDOs: Most Basic/Pre-Standard • Most Basic: Use a data management system (database, data collection tools) for client contact information, demographics, and basic program activity • Fulfills basic reporting and program management requirements. • Databases: Excel spreadsheets, Customer Relationship Management (CRM) systems, loan fund management software.
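
Even the most basic path benefits from a consistent structure. A rough sketch of a flat, spreadsheet-style client log written out as CSV (column names are illustrative, not a prescribed format):

    import csv

    # Hypothetical columns for a flat client log: contact information,
    # demographics, and basic program activity are enough for basic
    # reporting on who was served and what they received.
    FIELDS = ["client_id", "name", "phone", "gender", "race_ethnicity",
              "household_income", "service_type", "service_date", "hours"]

    rows = [
        {"client_id": "C001", "name": "A. Rivera", "phone": "555-0101",
         "gender": "F", "race_ethnicity": "Latina",
         "household_income": 24000, "service_type": "training",
         "service_date": "2010-03-15", "hours": 4},
    ]

    with open("client_activity.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        writer.writerows(rows)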

  16. 3 Common Paths for MDOs: Basic Industry Standard • Basics + Industry Standards: Data management system plus adoption of MicroTest Program Performance and Outcomes Standards and Tools • Meets the standards for the MDO industry and provides outcome monitoring results on an annual basis. • MicroTest membership includes tools and customized reports.

  17. 3 Common Paths for MDOs: Integrated Measuring Success • Use of Mission-driven outcomes throughout program services and the data management system. • Provides Mission-driven, just-in-time information about program performance and outcomes to continuously improve results for clients and staff. • MIS/Database must manage historical, relational data—changes over time for clients and their businesses (see the sketch below). • Produces MicroTest results. • Resource-intensive to transition. • Efficiency and effectiveness improve as data integrity and use (analysis) improve.
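
To picture the "historical, relational" requirement, here is one sketch (hypothetical table and column names, not a prescribed schema): keep the client separate from dated snapshots of the business, so changes over time remain queryable.

    import sqlite3

    # Sketch: one client table plus dated business snapshots, so intake,
    # update-form, and survey data for the same client can be compared
    # over time.
    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE client (
            client_id INTEGER PRIMARY KEY,
            name TEXT NOT NULL
        );
        CREATE TABLE business_snapshot (
            snapshot_id INTEGER PRIMARY KEY,
            client_id INTEGER REFERENCES client(client_id),
            recorded_on TEXT NOT NULL,    -- intake, update, or survey date
            in_business INTEGER NOT NULL, -- 0 or 1
            annual_revenue REAL,
            jobs INTEGER
        );
    """)
    con.execute("INSERT INTO client VALUES (1, 'A. Rivera')")
    con.executemany(
        "INSERT INTO business_snapshot VALUES (NULL, 1, ?, ?, ?, ?)",
        [("2009-01-10", 0, 0, 0), ("2010-01-12", 1, 48000, 2)],
    )
    # One client's business history, oldest record first:
    for row in con.execute(
        "SELECT recorded_on, in_business, annual_revenue, jobs "
        "FROM business_snapshot WHERE client_id = 1 ORDER BY recorded_on"
    ):
        print(row)

A flat spreadsheet overwrites last year's revenue with this year's; a structure like this keeps both, which is what makes longitudinal outcome reporting possible.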

  18. What are the Pros and Cons of Each Approach?

  19. Data Integrity

  20. Data Use

  21. Start Up Resources

  22. Integrated Measuring Success: The JEDI and RTC Experiences • What is your return on investment/value proposition for deciding to integrate Measuring Success (outcome tracking) throughout your organization? • Why & how much do you hope to Measure Success? • What were your systems like when you started? • What have you done so far? Where are you in the process? • What has changed for your organization?

  23. Basic Industry Standard Best Practice Resource • MicroTest Performance and Outcomes • www.microtest.org

  24. What is MicroTest? • An initiative of the Aspen Institute’s FIELD Program. • A management tool that empowers microenterprise practitioners to gauge and improve the performance of their program and the outcomes of their clients. • Practitioner-built tools and protocols for collecting and using data to answer the 3 important questions. • Uses standard indicators and metrics to document and define standard and top performance for the MDO field in the U.S., in aggregate and in peer groups. • An active peer group of microenterprise development programs monitoring performance and outcomes.

  25. MicroTest is…

  26. MT Performance Workbook Basics
  What is it? The MT Performance Workbook is a set of linked Excel worksheets that gathers key information on your microenterprise program’s training and credit activities and provides immediate feedback on the costs, efficiency and sustainability of those activities. The integrated custom report also allows you to see how your program is changing over time, how it compares to other similar microenterprise programs, and how it compares to “top performance” in the industry.
  Why do people use it? The MT Performance Workbook provides information crucial to adapting and refining program services and assembling winning grant proposals. Programs that complete the workbook also cite an expanded data collection and analysis capacity within their organizations as a key reason for participating.
  How is the MicroTest Performance Workbook unique? MT defined the set of standard measures accepted by the microenterprise industry, which allows you to hone in on your organization’s performance and discuss it using terms and definitions the industry agrees on. TA from MT staff helps the data really mean something and be used in a productive way for the program. FIELD - The Aspen Institute

  27. MicroTest Program Performance Custom Report • Interactive features of the custom report allow you to further personalize the document for your program. • Look at your program’s progress over time using trend data for all 50 MT measures. • Compare your program’s performance to those MT programs achieving Top Performance for key measures. • Compare your program’s performance to your peers. FIELD - The Aspen Institute

  28. JEDI ‘data treasure’: MicroTest Performance Report • What do you need to know vs. want to know?

  29. MT Outcomes Workbook Basics (adapted from http://fieldus.org/Microtest/OutcomesDetails.html)
  What is it? The MicroTest Outcomes Workbook is a series of linked Excel worksheets—including intake (baseline) and survey (outcomes)—with an accompanying Instruction Guide. MicroTest staff provide data cleaning, analysis, a custom report, and technical assistance. The analysis and custom report include: non-response bias, dashboard, overview, and longitudinal analysis of results.
  Why do programs use it? • Designed to answer key questions about clients’ business and household outcomes: Are the clients in business? Are the businesses growing? Creating jobs? Does the business contribute income to the household? Are clients moving out of poverty? • The indicators are few and focused on key questions managers must constantly answer with respect to program effectiveness. The data collection process and analysis are simple. • The approach is a way for programs to monitor outcomes; it is not outcomes assessment or evaluation. • The MicroTest Outcomes Workbook and Instruction Guide are updated every year based on members’ feedback.
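
To make the non-response bias analysis concrete, here is a generic sketch (invented numbers, and not MicroTest's actual procedure): compare respondents with non-respondents on a baseline characteristic to judge whether the surveyed group resembles everyone served.

    # Generic sketch of a non-response bias check: do survey respondents
    # resemble non-respondents at baseline? All numbers are invented.
    intake = [
        {"client_id": 1, "baseline_revenue": 0, "responded": True},
        {"client_id": 2, "baseline_revenue": 30000, "responded": False},
        {"client_id": 3, "baseline_revenue": 12000, "responded": True},
        {"client_id": 4, "baseline_revenue": 5000, "responded": False},
    ]

    def mean(values):
        return sum(values) / len(values)

    resp = [c["baseline_revenue"] for c in intake if c["responded"]]
    nonresp = [c["baseline_revenue"] for c in intake if not c["responded"]]
    print(f"Respondents' mean baseline revenue: ${mean(resp):,.0f}")
    print(f"Non-respondents' mean baseline revenue: ${mean(nonresp):,.0f}")
    # A large gap warns that survey-based outcome results may not
    # generalize to all clients served.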

  30. JEDI ‘data treasure’: MicroTest Outcomes Report

  31. Integrated Measuring Success (Path Three) Best Practice • Process & Products

  32. Measuring Success: Four Practical Steps for Building Integrated Systems • Step One: Name and Define Success. What does your Mission-driven success look like? • Purpose: Mission-driven goals & framework for describing program and longer-term benefits (measures of success) • Tools: Theory of Change, Logic Model, Stakeholder Interviews, Data Treasure Hunt, Benchmarks Literature Review

  33. Name Your Program Success on your terms…key outcomes indicators • Mission: JEDI increases the economic well-being of people and communities through business development and local wealth creation. JEDI succeeds when entrepreneurs succeed in one or more years to: • Start businesses; • Strengthen, formalize, and expand businesses; • Establish and maintain profitable businesses; • Create and/or retain employment for themselves and others; and • Improve household financial security.

  34. Name Your Program Success on your terms…two tools others use

  35. JEDI’s key outcomes indicators: JEDI knows businesses are successful when, in 2 to 6 years: • Income from the business contributes to household financial self-sufficiency and security; • Business revenue allows owners and other employees to increase purchasing of goods and services (household & business-to-business spending) and increases the income tax base; • Businesses provide local communities with needed goods and services; • Businesses attract regional markets and investment; • Businesses provide local communities with cultural and social assets; and • Owners give back and reinvest in the local community.

  36. JEDI’s key outcomes indicators defined: JEDI SUCCEEDS WHEN CLIENTS SUCCEED IN ONE OR MORE YEARS TO…

  37.–39. RTC ‘data treasure’: Quarterly Dashboard Report (mock up/draft) • What do you need to know vs. want to know? RTC Quarterly Dashboard (mock data)

  40. Measuring Success: Four Practical Steps for Building Integrated Systems • Step Two: Assess and Align Internal Capacity to Manage the Data You Need (see the audit sketch below) • Purpose: Design & resource your MIS—data collection, entry, storage, use—to measure and produce Mission-driven success. • Tools: Monitoring & Evaluation Plan; Data Fields Inventory, Audit and Alignment; data collection tool templates and pilot • 1. Assess and select outcome tracking resources & method(s); 2. Inventory/audit baseline and outcome fields/questions in databases, data collection tools, and reports; 3. Revise forms, databases, reports, and processes as needed
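
A tiny sketch of the inventory/audit idea in Step Two (field lists are hypothetical): list the fields each form collects and flag outcome questions with no baseline counterpart, since those can never demonstrate change over time.

    # Sketch of a data-fields audit across forms. The point is to catch
    # baseline and outcome questions that don't line up.
    intake_fields = {"annual_revenue", "jobs", "in_business",
                     "household_income"}
    survey_fields = {"annual_revenue", "jobs", "in_business",
                     "profitability"}

    print("Outcome fields with no baseline:",
          sorted(survey_fields - intake_fields))  # ['profitability']
    print("Baseline fields never re-asked:",
          sorted(intake_fields - survey_fields))  # ['household_income']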

  41. Data Collection for Measuring Success—A Good Intake Tool… • Sets the baseline questions for measuring long-term outcomes—the questions are asked and tracked throughout the system in almost exactly the same way at intake, on update forms, and on surveys (see the sketch below). • Encourages program staff and clients to assess and update progress on a regular basis—it is useful for more than data. • Asks for 1-2 other contact people to help stay in touch. • Records the data event, collection, and entry dates as well as staff names. • ‘Translates’ evaluation & metrics into a language everyone can use. • Communicates clear guidelines for information use.
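
One way to honor the "asked in almost exactly the same way" rule is to define each question once and reuse that definition on every form, stamping each answer with its collection metadata. A sketch (question IDs and wording are hypothetical):

    # Sketch: define each question once, reuse it across intake, update,
    # and survey forms, and record who collected each answer and when.
    QUESTIONS = {
        "annual_revenue": "What was your gross business revenue over "
                          "the last 12 months?",
        "jobs": "How many people, including you, does the business employ?",
    }

    def record_answer(form, question_id, answer, staff, date):
        return {"form": form, "question_id": question_id,
                "question": QUESTIONS[question_id], "answer": answer,
                "staff": staff, "recorded_on": date}

    print(record_answer("intake", "annual_revenue", 0,
                        "M. Staff", "2009-01-10"))
    print(record_answer("survey", "annual_revenue", 48000,
                        "volunteer", "2010-01-12"))

Because both records share a question_id, baseline and outcome answers remain directly comparable.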

  42. Data Collection for Measuring Success—RTC Program Applications/Intake Forms • Baseline • Goals • Scope of Work/Terms of Service Agreements

  43. Data Collection for Measuring Success—In Action • RTC One-on-One Coaching form • RTC Outcome Update form (for staff) • JEDI Outcome Update form (for staff) • JEDI Client Action Plan • Encourage Staff and Clients to achieve & report Mission-driven results

  44. Data Collection for Measuring Success—A Good Survey Tool is… • Administered no more than once a year, by phone, web-based form, or mail-in. Phone is usually most successful. • Uses incentives—a raffle of something useful to business owners—for those who complete the survey. • Administered by trained staff or volunteers who are not direct service providers. • Looked forward to once the ‘check-in’ practice is established. • Does not take the place of program evaluation or experimental research.

  45. Data Collection for Measuring Success: JEDI Phone Interview Survey Tool

  46. Data Collection for Measuring Success: RTC Online Survey Tool

  47. Measuring Success: Four Practical Steps for Building Integrated Systems • Step Three: Redesign and Implement the Data Management System • Purpose: Mobilize and implement MIS and human resources to support measuring success. • Tools: Database Needs Assessment; Database Product Assessment & Selection; Database re-engineering (configuration) & transition (conversion of existing data) • MIS for Microenterprise: A Practical Approach to Managing Information Successfully • http://fieldus.org/Publications/MISManual.pdf • The goal is to increase the efficiency and effectiveness of internal systems—eliminate parallel data sources, duplicate data entry, time-intensive reports, etc. • Dedicate enough resources—time, staff, money—to do this project well; this is a long-term investment. • Document!

  48. Step 3: Select MIS Products for Tracking Program Data—Including Outcomes. Promising options: • VistaShare Outcome Tracker; • WebCATS (Client Activity Tracking System), used primarily by SBA grantees; • The Exceptional Assistant, by Common Goals Software; • Salesforce (with other platforms for outcome tracking, such as MicroTest Outcomes or Success Measures); • Efforts to Outcomes, by Social Solutions; • Money In, Money Out, Technical Assistance (MIMOTA), by Villagesoft; • Portfol; • Applied Business Software; • Others?

  49. Measuring Success: Four Practical Steps for Building Integrated Systems • Step Four: Use and Sustain the Results & System • Purpose: Use Mission-driven goals, framework, and information to prove and improve program success. • Tools: Adjust staffing plan/job descriptions to include data analysis, collection, entry, management/coordination, and reporting; report/use design, pilot, and plan; staff training & TA/support (ongoing); operations manuals • Program Management: scorecards, dashboards, regular reports (a toy dashboard sketch follows below) • Public Relations/Fundraising Materials • Learning Circles: highlight and explore strategic issues using results with staff, clients, Board, and communities • And Many More…
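
As a toy illustration of the dashboard idea (metric names and numbers are invented; this is not RTC's or JEDI's actual report logic), a quarterly rollup can compare actuals against goals:

    # Toy quarterly dashboard rollup: actuals vs. goals for a few
    # key metrics.
    goals = {"clients_served": 90, "businesses_started": 12,
             "jobs_created": 20}
    actuals = {"clients_served": 84, "businesses_started": 14,
               "jobs_created": 17}

    print(f"{'Metric':<20}{'Goal':>6}{'Actual':>8}{'% of goal':>10}")
    for metric, goal in goals.items():
        pct = actuals[metric] / goal
        print(f"{metric:<20}{goal:>6}{actuals[metric]:>8}{pct:>10.0%}")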

  50. Why Do You Measure Success? JEDI ‘data treasure’: Fact Sheet 2009
