
How do you solve a problem like Impact? Summary of survey findings




Presentation Transcript


  1. How do you solve a problem like Impact? Summary of survey findings Paul Redmond Head of the Careers & Employability Service, University of Liverpool

  2. Overview • 46 completers (82% were heads of service, with between 4–7 years' service). • The majority (85%) were located in a student-services-related department • 11% in a teaching-related department • 3.8% in an external-relations department • Spread of university types (post-1992, Russell Group, etc.)

  3. Reporting structure • 70% of heads report to the Head of Student Services • 20% report to a PVC • Academic deans and heads of T&L: 6.8% each

  4. Defining your Service’s purpose • Providing services to students – nearly all cite ‘employability’ services to students as the primary goal, e.g. • “Improving the employability of students and graduates and providing a central point of contact to support CEIAG across the campus …”

  5. Defining your Service’s purpose • Providing services to students – nearly all cite ‘employability’ services to students as the primary goal, e.g. • “Improving the employability of students and graduates and providing a central point of contact to support CEIAG across the campus as well as delivering centrally …”

  6. Rank your main stakeholders

  7. How do you or your HEI measure the responsiveness, profile and visibility of your service with the above groups? • Mixture of internal and external feedback measures • Student numbers / record of users • Referrals from academics • Profile with senior managers • DLHE – institutional comparators (ePI) • Student satisfaction / user surveys (various) • League tables • Involvement of CES in key projects / initiatives • Access to ‘prized’ networks • Employer evaluations • Hits on website • ‘Profile and visibility’ … • Matrix

  8. How do you record and track usage of your Service? • Central recording systems • On-line systems (e.g. Interfase) • ‘Numbers attending events’ records • Mixture of qualitative and quantitative research – headcounts to ‘mystery shoppers’ • We don’t! • Basic headcounts • Monthly MIS surveys / reports • Interview stats • Hits on website • Twitter traffic …

  9. How do you measure your Service’s impact on students?

  10. How do you measure your impact with employers?

  11. Other ways in which you measure your impact on employers • None (several) • Numbers of vacancies filled / quantity and quality of applications • Employer advisory board • Repeat business • Direct feedback from employers in relation to vacancies and internships

  12. Do you or your institution measure the impact of your service or your INSTITUTION’S REPUTATION AND SUCCESS on:

  13. How your careers service impacts on your institution’s reputation and success • Provides comparative data on graduate employment (shows where the HEI stands in league tables) • Success in generating funded projects • Income generation • League tables / DLHE – “Seen as our most important function by senior management, sadly.” • KPIs in reports to management • Head produces a full-cost recovery study to gauge VfM • Contribution to student retention • “Return on Investment … not sure about how this might be specifically measured …” • “Nothing too explicit …” • “None.”

  14. How highly are you perceived in your HEI?

  15. What KPIs and / or targets does your Service have? • DLHE – top listed KPI • Employability P.I. • Numbers of finalists / graduates seen • Increased numbers using the Service • Student satisfaction surveys • Listed in ‘Top 10’ surveys • Personal recommendation • Number of placements achieved • Matrix • Mixture of externally imposed and internally verified

  16. Final comments … • “I am still at a loss as to how to establish meaningful and real impact measures for most of the work we are engaged in, despite working at the problem for many years! Fortunately, I am rarely required to justify our impact.” • “Be careful of using DLHE as a KPI or measure of impact.” • “Our work does not necessarily have immediate effect – it may take many weeks, months or years for a client to really take on advice. How do we measure the impact of guidance?” • “Collective approach from AGCAS in addressing this is very much welcomed.”

  17. Reflection

  18. The Impact Paradox: By overwhelmingly prioritising services to students, which produce intangible outcomes, careers services find it difficult to measure impact. The services that can be measured are often not those viewed as key priorities by services. • Location, Location, Location: How crucial is institutional positioning? • Across the sector, wide variations exist (and varying levels of measurement sophistication). • But impact measurement is becoming a widely accepted issue of importance for most careers services. Nevertheless, methodologies remain limited.
