
Multiple Vantage Points for Employment-Related Feedback


Presentation Transcript


  1. SUCCEED: Multiple Vantage Points for Employment-Related Feedback. Share the Future IV Conference, March 18, 2003. Joseph Hoey and Jack Marr, Georgia Tech. Southeastern University and College Coalition for Engineering Education (SUCCEED)

  2. Workshop Coverage • Process for integrating information from multiple sources • Role of employer feedback in overall assessment process • Longitudinal strategy to measure value added • Methodological considerations in using assessment data • Interpreting assessment data • Making it work at Southern University

  3. Preliminary Questions • Going into this workshop, what are your concerns about employer feedback? • How would you use feedback data? What process would you personally find most useful?

  4. Relating College and the World of Work • How can the skills relevant to the world of work be assessed against the skills our students gain through their programs of study? • What do our students do that relates to employment? • Internships? • Major-related activities?

  5. Relating College and the World of Work • How can we connect academic evaluation with employment-related evaluation? • What are the similarities? • How can these be tied together to achieve some consistency? • Can indirect information and information derived from students or alumni play a role? • Can general education assessment be connected to the world of work?

  6. Georgia Tech Employer Feedback Model

  7. Multiple Sources of Career-Related Performance Evaluation • Co-op Employers • Recruiters • Employers of Alumni/Alumnae

  8. Co-op Employers • All students are evaluated every term • Can potentially evaluate “value added” • Shows “what’s happening now”

  9. Recruiters • First interface with post-graduation employment • Substantial filter • Limited information and contact • Biased sample: students and recruiters

  10. Alumni(ae) • Another filter • In the workplace or graduate school • “Real world” perspective • Very large sample needed to yield department-level data • Undeliverable addresses are a problem • Sampling and non-response bias
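
To make the sample-size point concrete, here is a minimal sketch of the standard margin-of-error calculation with a finite-population correction. The 95% confidence level, 5% margin, and department alumni counts are illustrative assumptions, not figures from the workshop.

```python
import math

def required_n(population, margin=0.05, z=1.96, p=0.5):
    # Sample size for estimating a proportion at the given margin of
    # error, shrunk by the finite-population correction.
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# Hypothetical department alumni pools: even a small department
# needs most of its alumni to respond for a tight estimate.
for dept, alumni in [("Dept A", 400), ("Dept B", 120), ("Dept C", 60)]:
    print(f"{dept}: need {required_n(alumni)} usable responses out of {alumni}")
```

With typical alumni response rates well under 50%, numbers like these show why department-level reporting requires a very large outgoing sample and why undeliverable addresses and non-response bias are serious concerns.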

  11. Employers of Alumni/Alumnae • Another filter • More opportunity to evaluate • Clearer perspective on what’s important • Greater investment • Other biases, e.g., sampling depends on the employee’s permission

  12. Methodological Issues • Survey Methods • Sampling • Response Rate • Sources of Bias • Consistency

  13. “Triangulation” [diagram: triangulation among co-op employer, recruiter, and alumni(ae)/employer feedback]

  14. “Triangulation” (at best) [variant of the preceding triangulation diagram]

  15. Case Study: Comparing and Seeking Continuity in Results • Subject: Computer Information Systems at Very Humid University (VHU) • Just completed first round of assessment studies • Now looking at the data to figure out what they have, what it means, and how they might want to rethink their assessment process to improve its usefulness • You are there as consultants

  16. Case Study: Comparing and Seeking Continuity in Results • How would you use the data you have to assess the student outcomes stated? • To what extent can you use the data you have to assess the outcomes listed? Do you see any problems? • What changes to assessment methods do you recommend for Computer Info Systems at VHU? • Do you have any other recommendations?

  17. Employer Feedback: Development of Process • Reworked survey of recruiters to include items relevant to Criteria 2000 • Reworked evaluation instrument completed by employers (supervisors) of co-op students to include items relevant to Criteria 2000 • Created Alumni and Employer instruments

  18. Process Logistics • Project funded by SUCCEED • Negotiations with process owners • Recruiter and Co-op data collected and entered by office of origin; analyzed by Office of Assessment • Alumni and Employer surveys collected by Office of Assessment

  19. Findings: Co-op • All ratings moderately high or better, with some variability in spring 2001 • Lifelong learning and technical skills rated highest • Written and oral communication skills rated lowest

  20. Findings: Co-op • Disaggregated co-op employer evaluations by department and by student class level within department • Breakdown allows clear demonstration of student knowledge and skill gain through the undergraduate experience
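
As a rough illustration of the disaggregation described above, the sketch below groups co-op ratings by department and class level. The column names, departments, and values are hypothetical, not Georgia Tech data.

```python
import pandas as pd

# One row per completed co-op evaluation (hypothetical layout).
evals = pd.DataFrame({
    "department":  ["ME", "ME", "ME", "EE", "EE", "EE"],
    "class_level": ["Sophomore", "Junior", "Senior"] * 2,
    "rating":      [3.8, 4.1, 4.5, 3.6, 4.0, 4.4],   # 1-5 scale
})

# Mean employer rating by department and class level; ratings that
# rise across class levels are one way to show knowledge/skill gain.
print(evals.groupby(["department", "class_level"])["rating"].mean().unstack())
```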

  21. Findings: Recruiters • Importance: highest ratings on teamwork, problem solving, ability to apply knowledge, communication • Preparation: highest ratings on using necessary techniques and skills for practice, problem solving • Largest “performance gaps” over time: teamwork and communication skills

  22. Results: 2000-01 Bachelor’s Alumni(ae) Survey • Alumni were asked to rate a set of skills, abilities, and attributes generally expected of a Georgia Tech graduate, first rating the importance of each item relative to their personal employment experience since graduation, and then rating each item relative to how well their education had prepared them.

  23. Alumni Results: Importance vs. Preparation • There were 6 specific skill areas for which there was a greater than 0.50 difference between mean ratings for importance and mean ratings for preparation: The ability to… • communicate orally, • communicate in writing, • function on teams, • use computing technology in communications, • engage in lifelong learning / self-critique, and • exercise leadership skills.
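
The 0.50 screening rule above is straightforward to automate. A minimal sketch follows; the item names and mean ratings are made up for illustration and are not the actual survey results.

```python
import pandas as pd

# Hypothetical mean ratings on a 5-point scale.
items = pd.DataFrame({
    "item":        ["communicate orally", "communicate in writing",
                    "function on teams"],
    "importance":  [4.6, 4.5, 4.8],
    "preparation": [3.9, 3.8, 4.2],
})
items["gap"] = items["importance"] - items["preparation"]

# Flag skill areas where importance outruns preparation by > 0.50.
print(items[items["gap"] > 0.50].sort_values("gap", ascending=False))
```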

  24. Hands-On Activity • Divide into small groups. • Discuss: How could you structure and unify employer feedback at Southern U.? Each group should put together ideas on: • what information is needed • what methods would be appropriate to use • who would have ownership/need to be involved • who would collect, enter, analyze, report data • how best to communicate and use results

  25. Hands-On Activity • Presentation and Discussion (10 minutes): Small groups present summarized ideas; larger group discusses.

  26. What Have Previous Participants Thought?* • What information is needed • What methods would be appropriate to use • Who would have ownership/need to be involved • Who would collect, enter, analyze, report data • How best to communicate and use results *ASEE 2001, Albuquerque

  27. What Information Is Needed • Criterion 3, A–K knowledge, skills, and abilities: importance and preparation • How do our students compare with others? • How do our students compare by levels? • Degree level • Field level • Are we preparing students appropriately? • What is important • Where are we succeeding • Where are we falling short • What is our minimum acceptable performance? Measure the “low end” of graduates

  28. What Methods Would Be Appropriate to Use • Surveys • Interviews • Telephone • Personal • Advisory committees

  29. Who Would Have Ownership/Need to Be Involved • Students • Faculty • Recruiters/employers • Employees (grad/co-op) • Parents • Institution

  30. Who Would Collect, Enter, Analyze, Report Data • Involve team (students, faculty, alumni) • Collect and analyze (survey specialists) • Report (faculty, editors, specialists) • Distribute based on type • Co-op/Career Planning and Placement • Dean’s office • Computer Services • Institutional Research • Individual departments

  31. How Best to Communicate and Use Results • Share survey data results and summary information with • Students • Faculty • Industry Advisory Boards • Employers • Alumni • Format data results and summary information with • Statistical analyses • Bar graphs • Trends • Tracking—year-by-year and by class • Disaggregate to department level
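
For the year-by-year tracking suggested above, a simple trend chart per department is often enough. This sketch uses matplotlib; the department names and rating values are placeholders, not reported data.

```python
import matplotlib.pyplot as plt

years = [1999, 2000, 2001, 2002]
# Hypothetical mean employer ratings by department.
trends = {"Dept A": [3.9, 4.0, 4.1, 4.2], "Dept B": [3.7, 3.9, 3.8, 4.1]}

# One line per department makes year-over-year movement easy to see.
for dept, series in trends.items():
    plt.plot(years, series, marker="o", label=dept)
plt.xticks(years)
plt.xlabel("Survey year")
plt.ylabel("Mean employer rating (1-5)")
plt.title("Employer feedback tracked year-by-year, by department")
plt.legend()
plt.show()
```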
