
Training Analytics


Presentation Transcript


  1. Training Analytics Stephanie Lewis, Cytovance Biologics Inc.

  2. Objectives Our goal is to bring our shared experience to bear in helping each other find ways to track and provide training analytics. • Identify the 5 levels of evaluation. • Identify the three steps of running training analytics. • Apply key skills to case studies.

  3. Definitions Circle or underline the words that jump out at you in each definition. Program Evaluation – a systematic method for collecting, analyzing, and using data to answer questions about projects, policies, and programs, particularly about their effectiveness and efficiency. Learning Analytics – the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.

  4. Definitions Program Evaluation – a systematic method for collecting, analyzing, and using data to answer questions about projects, policies, and programs, particularly about their effectiveness and efficiency. Learning Analytics – the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.

  5. Steps in Running Analytics • Step 1 = Plan for analytics • Find out what stakeholders want (needs analysis) • Agree upon the objectives (outcomes of the training) • Step 2 = Collect the data • Step 3 = Report the data to stakeholders

  6. Case Study – Planning • In groups of 3–4, read the case study and answer the following questions. • Who are your stakeholders, and what do they want from the program? • What are some key objectives that should be used? • Who do you need to “sell” on the program? How would you sell it? • What would you include in the costs of creating this program? • What would you include in the benefits of the program? • Briefly, what kind of analytics do you think you would provide for this training?

  7. Planning: What Do Stakeholders Want? Generally, stakeholders want different things.

  8. Planning: Needs Analysis • Who are the stakeholders? What do they want/expect? • Why this program? Is training the best solution for this? (Could they do the “skill” if their life depended on it?) • What will likely be the participants' reaction to this program? • What cost challenges are we facing? • What are the primary issues driving the program (cost reduction, time reduction, performance, etc.)? • What are the learning objectives? • What results will the program need to meet? What else is involved in meeting the results? • What should our evaluation strategy be?

  9. Analysis Levels • Level 0 – Inputs into the process (costs) • Level 1 – Participant reactions (did participants like the program?) • Level 2 – Knowledge/learning (principles, facts, processes, procedures, techniques) • Level 3 – Behavior/application (use of skills, on-the-job change in behavior) • Level 4 – Business impact (business measure improved) • Level 5 – Return on investment (monetary benefits of the program)

  10. Program Objectives • _______ • _______ • _______ • _______ • _______ • _______ • _______ • _______ High-Level Program Objectives and Levels • Increase general science knowledge by 20%. • Achieve a 20% ROI one year after program. • Use six step process in 70% of customer complaints. • Achieve 85% on product exam. • Decrease amount of time to autonomy from 6 months to 3 months. • Decrease preventable Incident Rates of new hires by 30%. • Reduce costs of errors by 20%. • __________________________________________________________________

  11. When to Run Level 3, 4, or 5 Analytics When programs… • Are believed to have a big impact • Have analytics requested by a stakeholder • Cover controversial topics or outcomes • Are big investments that may be scrutinized • Could have a big impact on culture • ____________________________________________ • ____________________________________________ • ____________________________________________

  12. Planning: Agreed Objectives • Example from the case study: • Created a cost/benefit analysis. • Interviewed team members to find out what they wanted. • Designed a basic body of knowledge and skills that managers from every department had been teaching as one-offs to each new hire. • Got managers to agree on objectives.

  13. Steps in Running Analytics • Step 1 = Plan for analytics • Find out what stakeholders want (needs analysis) • Agree upon the objectives (outcomes of the training) • Step 2 = Collect the data • Step 3 = Report the data to stakeholders

  14. Step 2 – Data Collection • Levels 1–3 are straightforward to collect, usually via surveys or evaluations. • Level 1 – reaction evaluation • Level 2 – pre- and post-test • Level 3 – application evaluation • What data do I have to achieve the desired level of analysis (especially Levels 4–5)? • To show a reduction in errors, we must have a means of measuring against current tracking. • To show an increase in knowledge, we must establish a baseline of knowledge. • How can we translate soft data into hard data? • Does a specific leadership behavior drive a specific cost? • From exit interviews, what percent of turnover is due to management communication practices? Translate that turnover into dollars.
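The turnover-to-dollars translation in the last bullet can be sketched numerically. A minimal Python example; every figure below (exit count, attribution percentage, replacement cost) is an invented placeholder, not a value from the program:

```python
# Hypothetical sketch: translating soft exit-interview data into hard dollars.
# All figures are illustrative assumptions, not values from the program.

def turnover_cost(total_exits: int,
                  pct_due_to_communication: float,
                  replacement_cost_per_exit: float) -> float:
    """Estimate the annual cost of turnover attributable to one cause."""
    attributable_exits = total_exits * pct_due_to_communication
    return attributable_exits * replacement_cost_per_exit

# Example: 40 exits per year, exit interviews attribute 25% to management
# communication practices, and each replacement costs an assumed $15,000.
cost = turnover_cost(total_exits=40,
                     pct_due_to_communication=0.25,
                     replacement_cost_per_exit=15_000)
print(f"${cost:,.0f}")  # $150,000
```

The point is not the specific numbers but the chain: a soft percentage from interviews becomes a hard dollar figure once it is multiplied by a defensible per-exit cost.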

  15. Step 2 – Data Collection HARD DATA (easier to measure and assign $): • OUTPUTS: new accounts, completion rates, loans approved, tasks completed, _______________ • COSTS/REVENUE: budget variances, cost by account, program costs, unit cost, profits, sales increase, _______________ • TIME: overtime, on-time results, time to proficiency, efficiency, _______________ • QUALITY: error rates, rework, waste/scrap, incidents, _______________ SOFT DATA (harder to measure and assign $): • BEHAVIOR: competency, following safety rules, communication improvements, _______________ • ENGAGEMENT/SATISFACTION: customer complaints, employee/customer satisfaction, loyalty, diversity, _______________ • DEVELOPMENT: capabilities, promotions, readiness, leadership, _______________ • MARKETING: innovation, brand/reputation, awards, _______________ Insights taken from the ROI Institute.

  16. Step 2 – Data Collection Methods • Level 1 – surveys, questionnaires, ___________ • Level 2 – surveys, tests, program activities, ______________ • Level 3 – surveys, on-the-job observation, interviews, focus groups, actions followed up, ___________ • Level 4 – business metrics, questionnaires, action plans followed up, performance contracting, follow-up sessions, performance records, _________________ • Level 5 – translate the data into dollars

  17. Data Collection Planning Activity You are planning a Meeting Management training. You want to run a Level 5 analysis on the training program, so you are planning how to do that using the planning sheet provided. Let's walk through Level 1 together, and then with a partner you will work through the rest.

  18. Program Data Planning Sheet: Training on Meeting Management Insights from ROI Institute

  19. Program Data Planning Sheet: Training on Meeting Management Insights from ROI Institute

  20. Translate Business Impact to $$$$
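A common way to make this translation is the ROI Institute's standard pair of formulas: benefit-cost ratio (benefits ÷ costs) and ROI percent ((benefits − costs) ÷ costs × 100). A minimal sketch; the dollar amounts are invented for illustration:

```python
# Illustrative Level 5 calculation: converting business impact to ROI.
# Formulas: BCR = benefits / costs; ROI % = (benefits - costs) / costs * 100.
# The dollar amounts below are assumptions, not figures from the program.

def benefit_cost_ratio(benefits: float, costs: float) -> float:
    """Dollars returned per dollar spent on the program."""
    return benefits / costs

def roi_percent(benefits: float, costs: float) -> float:
    """Net program benefits expressed as a percentage of program costs."""
    return (benefits - costs) / costs * 100

# Example: a program costing $50,000 that yields $120,000 in monetary
# benefits (e.g., an error reduction already translated to dollars).
print(benefit_cost_ratio(120_000, 50_000))  # 2.4
print(roi_percent(120_000, 50_000))         # 140.0
```

Note that only benefits already converted to dollars (the hard-data side of the Step 2 table) belong in the numerator; soft data must first be translated, as in the turnover example.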

  21. Steps in Running Analytics • Step 1 = Plan for analytics • Find out what stakeholders want (needs analysis) • Agree upon the objectives (outcomes of the training) • Step 2 = Collect the data • Step 3 = Report the data to stakeholders • Make sense of the data • Report to stakeholders what is appropriate

  22. Running Tally Sheet

  23. Percentage Sheet
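A running tally and percentage sheet like those above can be computed directly from raw counts. A minimal sketch, assuming (hypothetically) Level 1 reaction responses on a 1–5 scale; the counts are invented:

```python
# Hypothetical sketch of a running tally -> percentage sheet, assuming
# Level 1 reaction evaluations scored on a 1-5 scale. Counts are invented.
from collections import Counter

def percentage_sheet(tally: Counter) -> dict:
    """Convert raw response counts into percentages per rating."""
    total = sum(tally.values())
    return {rating: round(100 * count / total, 1)
            for rating, count in sorted(tally.items())}

# Running tally of responses to "the training met its objectives"
tally = Counter({5: 18, 4: 9, 3: 2, 2: 1})
print(percentage_sheet(tally))  # {2: 3.3, 3: 6.7, 4: 30.0, 5: 60.0}
```

Keeping the tally as raw counts and deriving percentages on demand means the sheet stays correct as new class results are added.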

  24. Step 3 – Reporting Data to Stakeholders • Email a report to all stakeholders monthly. • Provide pilot feedback. • Send an email with key data after each class or monthly. • Have a 30-day follow-up survey from managers. • Run stats on key performance indicators. • Provide data for what you “sold” to the stakeholders. • ________________________ • ________________________

  25. Testimonial Example (if time allows)

  26. Testimonial Key points: • Developing a manufacturing training program from scratch • Buy-or-build model • Cost/benefit analysis • Developing a plan for program analytics

  27. Benefits of Buying

  28. Comparison of Cost: Buy vs. Build Buy: attend these workshops in the public offerings at $2,300 per person. It is our opinion that it would take 4–8 years before a program built in-house even begins to add as much value as partnering with a provider. *Estimates of costs: • E-learning development ($___ per class x 3 classes) = $____. • SME time ($__ x 240 hours x 3 classes) = $___________, not including lost time in manufacturing. • In-class development costs considered: developer ($___ x 1000 hours) = $_________; SME time ($__ x 1000 hours) = $_________.
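The buy-vs-build arithmetic can be sketched in code. Since the slide's dollar figures are blanked out, every rate and hour count below is a placeholder assumption, except the $2,300 public-seat price, which is from the slide:

```python
# Hypothetical buy-vs-build cost comparison. The slide's actual rates are
# blanked out, so the rates and headcount here are invented placeholders;
# only the $2,300 public-workshop price comes from the presentation.

def build_cost(dev_rate: float, dev_hours: float,
               sme_rate: float, sme_hours: float) -> float:
    """Total in-house development cost: developer time plus SME time."""
    return dev_rate * dev_hours + sme_rate * sme_hours

def buy_cost(price_per_person: float, people: int) -> float:
    """Total cost of sending employees to a public offering."""
    return price_per_person * people

# Assumed rates of $50/hr (developer) and $40/hr (SME), 1000 hours each,
# vs. buying 12 public seats at $2,300 per person.
print(build_cost(50, 1000, 40, 1000))  # 90000
print(buy_cost(2_300, 12))             # 27600
```

With any plausible fill-ins, the comparison hinges on how many people you train: buying wins at small headcounts, while building amortizes its fixed development cost as volume grows.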

  29. Part A Data Collection • Pre- and post-test • Training effectiveness evaluation Part B Data Collection • Reduced incident rate (IR) of new hires • Translate those IRs into cost savings • Time to autonomy (manager-reported data, pre and post) • Translate into costs • Evaluate “barriers” or other factors affecting trained employees' use of these skills on the floor
