
Ethics of Data: Driving Institutional Practices

Join Professor Jacqueline Stevenson from Sheffield Hallam University as she examines the (un)ethical approaches to data collection in educational institutions and discusses the importance of analytical maturity and developing an analytical ecosystem. Learn about involving staff and students, key principles, and the use of Participatory Action Research to effect institutional change.





Presentation Transcript


  1. On the ethics of data: how are approaches to data collection driving institutional practices Professor Jacqueline Stevenson, Sheffield Hallam University Twitter: ProfJStevenson

  2. Overview of talk • A bit about me • (Un)ethical data collection approaches • Analytical maturity • Developing an analytical ecosystem • Involving staff and students • Key principles

  3. Torture the data, and it will confess to anything Ronald Coase, winner of the Nobel Prize in Economics

  4. (Un)ethical data collection • Lack of transparency • Ignoring data that doesn't fit with what the HEI wants to do • Students involved only to 'rubber stamp' decisions • Lack of a data communication plan • Data being 'made' to perform purely political work (reputation, competition)

  5. (Un)ethical data collection • Power/dominance over the questions asked • Collating 'pointless' data • Validity of data • Valorising only certain forms of data • Under-use of data (data silos) • Confirmation vs. curiosity-driven approaches • Lack of predictive modelling
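The confirmation-driven risk above (and Coase's "torture the data" warning on the next slide) can be illustrated with a small, purely hypothetical sketch: screen enough unrelated metrics against an outcome and one will appear "predictive" by chance alone. All names and numbers here are invented for illustration; nothing in the talk prescribes this code.

```python
import math
import random


def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)


random.seed(42)
n_students, n_metrics = 30, 200

# Outcome and metrics are pure noise: there is no real relationship at all.
outcome = [random.gauss(0, 1) for _ in range(n_students)]
metrics = {f"metric_{i}": [random.gauss(0, 1) for _ in range(n_students)]
           for i in range(n_metrics)}

# Screen every metric against the outcome and keep the "best" one.
correlations = {name: pearson_r(vals, outcome) for name, vals in metrics.items()}
best = max(correlations, key=lambda k: abs(correlations[k]))
print(best, round(correlations[best], 2))  # some noise metric will look predictive
```

With 200 comparisons the strongest spurious correlation is almost always well clear of zero, which is exactly why confirmation-driven trawls through institutional data need pre-registered questions rather than post-hoc selection.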

  6. Analytical maturity. Challenged: • Level 1. Individual level - individuals own and control data and use it to tackle day-to-day functional issues; firefighting mode, project to project; little or no support or technology for a culture of evidence. • Level 2. Departmental level - departments take control of their information and start to produce performance reports and metrics for their function; systems are isolated into information silos and not well aligned at the institution level. Foundational: • Level 3. Enterprise level - the institution integrates information from across functional areas into an institution-wide information environment with clear support from leadership; reporting and analysis are effective and accurate, and used to make decisions; clear internal information chain. Progressive: • Level 4. Optimisation level - quality data and advanced analytical capabilities are used to optimise outcomes across the institution, leading to tangible improvements in key functions and metrics. • Level 5. Innovation level - data supports new ways to achieve priorities and enhance success. SAS Organization Maturity Model: https://www.sas.com/content/dam/SAS/en_us/doc/whitepaper1/increasing-student-success-with-big-data-in-education-108483.pdf
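For readers mapping their own institution onto the model above, the five levels and their Challenged/Foundational/Progressive groupings can be encoded as a simple lookup. This is only an illustrative sketch of the slide's content; the names `MATURITY_LEVELS` and `describe` are my own, not part of the SAS model.

```python
# The five SAS maturity levels from the slide, with their stage grouping.
MATURITY_LEVELS = {
    1: ("Challenged", "Individual level: individuals own and control data; firefighting mode"),
    2: ("Challenged", "Departmental level: per-function metrics, but isolated information silos"),
    3: ("Foundational", "Enterprise level: institution-wide information environment; data-informed decisions"),
    4: ("Progressive", "Optimisation level: quality data and advanced analytics optimise outcomes"),
    5: ("Progressive", "Innovation level: data supports new ways to achieve priorities"),
}


def describe(level: int) -> str:
    """Return a one-line summary of a maturity level, e.g. for a self-assessment report."""
    stage, summary = MATURITY_LEVELS[level]
    return f"Level {level} ({stage}): {summary}"


print(describe(3))
```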

  7. Developing an analytical ecosystem • Is the system collaboratively planned by all stakeholders? • How does it feed into annual planning rounds? • What level of granularity is supported? • Do we understand the 'why'? How is qualitative data being fed in? • How is data being used to inform change (and how are these decisions being made, and by whom)? • How is data being used to assess 'what works' (and how is 'what works' understood in relation to sub-groups)?

  8. Involving staff and students • What access do staff and students have to data? • Are they able to fully interrogate and understand it? • What is the culture within which staff and students engage in discussions of data? • How transparent is the data? What is 'hidden', and why? • Are staff and/or students empowered to act on findings? • How are recommendations for action disseminated up and down? • How are decisions about acting/not acting on findings made, and by whom? How can this be challenged? • How is the communication loop closed?

  9. Key principles: institutions • Accept, interrogate and act on data • Responsibility for reviewing data + implementing change to be devolved • Institutional data PLUS formal research and informal dialogue with staff and students. • Monitor student behaviour/performance (not characteristics used to label students ‘at risk’) • Need action plans ready • Evaluate using qualitative and survey methods PLUS institutional data • Effective approaches to be shared with colleagues, especially in cognate disciplines • Students as partners • Thomas and Jones https://www.birmingham.ac.uk/Documents/college-eps/college/stem/using-data-he-strem-transition.pdf

  10. Key principles: students • Individual approaches* • Build students’ capacity to access, analyse, and use data • Enable them to: • use data to identify their strengths, weaknesses, and patterns to improve their work • analyse their own progress • use data to set goals and reflect on their progress over time • See https://www.srhe.ac.uk/downloads/reports-2016/LizBennet-scoping2016.pdf • Develop Participatory Action Research (PAR) approaches to effect institutional change • PAR is driven by participants and based on their own concerns • It is therefore a form of action research built on research and action with people rather than simply for people. *Adapted from https://www.kqed.org/mindshift/53426/four-research-based-strategies-to-ignite-intrinsic-motivation-in-students Durham University offers a very helpful guide on PAR and how to develop a PAR approach: https://www.dur.ac.uk/resources/beacon/PARtoolkit.pdf

  11. The whole enterprise of teaching managers is steeped in the ethic of data-driven analytical support. The problem is, the data is only available about the past. So the way we’ve taught managers to make decisions and consultants to analyze problems condemns them to taking action when it’s too late. Clayton M. Christensen, management professor at Harvard

  12. Concluding thoughts • What is our real purpose in gathering data, engaging with students, and thinking about success? • And is it really at the heart of students' best interests? • And if it is not, what should we be doing differently?
