
Evaluation & Dissemination

Martin Oliver & Grainne Conole


An overview of the afternoon

  • A recap of EFFECTS & of the topics for today

  • An introduction to evaluation and dissemination

  • An emphasis on using these as strategic and political tools

  • Asking you to think about your experiences and understanding, and using these to generate ideas, principles, etc. (A file with this information will be circulated afterwards.)

  • Handing out some papers & these slides at the end


Setting the scene

  • In 1997, professional development of academics was put on the national agenda

  • A group of people got together to think about how to develop academics to use new technologies…

  • 1998: “EFFECTS (Effective Frameworks for Embedding C&IT with Targeted Support)” project starts with TLTP funding


Setting the scene

  • Committed to a number of values, including

    • Local implementation (not “one size fits all”)

    • Scholarship (don’t just do it, think about it)

    • Consultation and discussion (this isn’t just about us)

  • Reflected in the project outcomes

    • Strong commitment to evaluation

    • Strong commitment to research

    • Track record of dissemination

  • Also incorporated into learning outcomes


Setting the scene

  • We wanted staff who were informed, not just trained

  • We wanted to help people develop (not do things to them)

  • A slogan emerged from one project meeting (and made it onto the cover of the final evaluation report)

    “EFFECTS – not so much a framework as an agenda for change”

  • This sense of political action was important – a theme for today’s seminar


Setting the scene

  • Two EFFECTS outcomes for this afternoon:

    Outcome 5: Evaluated impact of the interventions

    • This will include evidence that you have:

    • Evaluated the impact of the incorporation of technology on students and colleagues. Maintained an awareness of external changes and made adaptations as necessary.

      Outcome 6: Disseminated the findings of the evaluation

    • This will include evidence that you have:

    • Provided feedback for students and colleagues and disseminated experience and findings to the department or more widely.


Before we give you information…

  • We want to start by getting you thinking about evaluation and dissemination

  • First, think about different times when you’ve been involved in evaluation (either doing it or having it done to you!)

  • Then, write down:

    • A brief description of the best experience you’ve had with evaluation

    • A brief description of the worst experience you’ve had with evaluation

  • We’ll spend about five minutes on this


But what is evaluation?

  • It’s not one “thing” – so there’s no single definition

  • However, a useful starting point is: “The process of making judgements about the worth (costs and values) of something”

  • Also used to describe…

    • Descriptive studies

    • Intervention studies (e.g. formative evaluation)

    • Empirical research

    • Monitoring

    • Quality Assurance processes


But what is evaluation?

  • What do these things have in common?

    • Evaluators. “Evaluation is what evaluators do.” (A community of practice kind of definition)

    • Empiricism. Evaluation involves judgements about data.

    • For the most part, judgement (the “value” in “evaluate”)


But what is evaluation?

  • And what do these activities look like?

    • Data collection of various types

      • Interviews, surveys, finance spreadsheets, focus group transcripts, documents, tallies of promotions, emails…

    • No inherent reason why it has to be done a particular way

      • Evaluation doesn’t have to involve interviews, or randomised controlled trials, or…

      • Choice of method depends on the person, the situation, the audience and so on


Utilization-focused evaluation

  • Evaluation in EFFECTS followed a particular tradition: utilization-focused evaluation (Patton, 1997)

  • A philosophy that arose from:

    • The realisation that no-one read evaluation reports

    • The feeling that the qualitative/quantitative paradigm war was never going to be ‘solved’

  • Argues that evaluations should be judged against how well they help people to do things

    …and not on whether they are ‘good’ technically (e.g. ‘valid’), of a particular type (e.g. ‘experimental’) or by particular types of people (e.g. external evaluators)


Utilization-focused evaluation

  • Communication is key to this kind of evaluation

    • No matter how ‘good’ the study, it’s useless if the people who need the information don’t get it in time

  • Closely related to ideas of democratic and emancipatory evaluation

    • We need to understand how other people understand this situation

    • We need to get groups (e.g. policy makers, academics) talking to each other

    • Evaluation can provide a framework in which this can happen


Now back to you!

  • Spend a short while thinking:

    • Has this made you think of any other experiences of evaluation?

    • Do these ideas help you to make sense of your experience of evaluation at all?

  • And spend a short while talking:

    • With a convenient group of people, compare your experiences

    • Why were these good or bad? (What feature or quality made it a good or bad experience?)

  • We’ll gather some suggestions from you after a few minutes


But we want more…

  • Think about any involvement you’ve had with project dissemination

  • As before, write down:

    • A description of the best experience you’ve had

    • A description of the worst experience you’ve had

  • Take a few minutes to do this on your own


And what’s dissemination?

  • Sounds like a stupid question

  • But… if we’ve rejected the idea we can ‘transmit’ learning to students, why do we persist in thinking we can just ‘transmit’ findings to peers?

  • This is not about volume (how many people, how many papers, how loud you talk); it’s about making meaning


And what’s dissemination?

  • So how can we make sure our messages are meaningful to people?

  • In EFFECTS, we tried:

    • Giving out drafts to see what people thought, e.g. at workshop sessions at ALT-C, so other people’s voices were represented too

    • Running workshops so that the ideas could be discussed, not just presented

    • Working with people (“partner sites”) so that experiences could be shared and jointly interpreted

    • Building relationships with people so that they learnt how to interpret the kinds of things we offered them


And what’s dissemination?

  • We also need to think about what ‘counts’ as dissemination

    • Publishing a journal paper?

    • Giving a workshop?

    • Producing a leaflet?

    • A project team meeting?

    • Chatting to a colleague over coffee?

    • Moaning to your partner about work?

  • Formal project evaluation favours ‘obvious’ forms, but studies of organisational change suggest that ‘invisible’ forms might be more effective

    • Whether they’re more powerful or not, they’re different


And what’s dissemination?

  • A rhetorical question: which method of dissemination would you find most meaningful?

    • A paper about evaluating EFFECTS?

    • A presentation about evaluating EFFECTS?

    • Chatting with people who evaluated EFFECTS about their experience?

    • Being asked to take ideas from EFFECTS, relate them to what you do and compare this with others?

  • We’re offering all of these because different people might respond differently to each


Back to you - again

  • Spend a short while thinking:

    • Has this made you think of any other experiences of dissemination?

    • Do these ideas help you to make sense of your experience of dissemination at all?

  • And spend a short while talking:

    • With a convenient group of people, compare your experiences

    • Why were these good or bad? (What feature or quality made it a good or bad experience?)

  • We’ll gather some suggestions from you after a few minutes


Drawing this first part together

  • Developing some principles

    • We’ve collected qualities of good and bad evaluation, and of good and bad dissemination

    • How can we use this understanding to guide what we do?

  • You volunteer some principles for evaluation and then for dissemination, and we’ll record them


A tool for planning

  • The Evaluation Toolkit – an online, step-by-step guide to help people plan evaluation & dissemination activities

  • Provides ‘layered’ guidance on the steps, process, associated resources & issues for each stage

  • Consists of three stages: planning, advising and presenting

  • Available online at http://www.ltss.bris.ac.uk/jcalt/


Evaluation Planner

  • Five steps:

    • What are you evaluating?

    • Reasons

    • Context

    • Who is it for?

    • Devising the question


Working through this

  • What are you evaluating?

    • Different kinds of evaluation, e.g. of a web site, a project, a strategy, a teaching innovation

  • Reasons – why are you evaluating this?

    • Validation, monitoring, research, justification, improving, selecting, to provide evidence

  • Context

    • Scope and constraints of your evaluation

  • Who is it for?

    • Identifying key stakeholders, their needs and interests

    • Students, managers, funders, colleagues


Working through this

  • Devising the question

    • Using the previous steps, brainstorm different ways of formulating questions

    • Try to devise a range of question types (e.g. comparisons, contrasts, explorations, quantities, negatives)

  • Keep it simple

    • Focus on key stakeholders and key questions

    • Easy to get out of hand – no more than 3 stakeholders recommended!


Evaluation Adviser

  • Two (big) steps:

    • Data capture

      • Choosing the methods to use

      • Describing how you’re going to use this in practice (when, what with, under what constraints)

    • Data analysis

      • Choosing the methods to use

      • Describing how you’re going to use this in practice (when, what with, under what constraints)


Working through this

  • Data capture methods

    • Mapping your evaluation questions to appropriate methods

    • Take account of your own level of expertise and available time

    • Be aware of what each approach was designed to do (don’t use stats on a group of four people or try to interview 200…)

    • Use a variety of methods to build a coherent picture (triangulation)


Working through this

  • Five common methods

    • Focus groups

      • A quick way of getting a range of views/ideas, good for exploration

      • Not necessarily representative; can go off-topic, and individuals can dominate them

    • Interviews

      • A way to understand people’s experiences of things; provides in-depth picture of individual views

      • Time consuming


Working through this

  • Surveys

    • Good broad overview of issues

    • Can be time consuming to analyse; need to be careful when devising questions

  • Usage logs

    • Readily available

    • Need to be careful when interpreting what these mean

  • Experiments

    • Good to compare two things

    • Difficult to do controlled studies in educational settings; can raise ethical issues


Working through this

  • Data analysis

    • Need to map methods to types of data

    • Be aware of your expertise and time

    • As before, be aware of what each approach can and can’t do


Working through this

  • Four common examples

    • Grounded theory

      • Doesn’t presuppose particular outcomes

      • Takes ages to do well, requires iterative data collection

    • Statistical analysis

      • Can use standard methods to analyse things quickly

      • You need to know what you’re doing

    • Narrative case study

      • Gives a rich, contextual picture

      • Isn’t generalisable

    • Pre-determined list of categories

      • Builds on previous research

      • May not map to this particular situation


Evaluation Presenter

  • Two steps:

    • Closing the loop (reflecting on the process)

    • Presentation tools

      • Selecting the tools to use

      • Describing how you’re going to use these in practice


Working through this

  • Six common presentation tools:

    • Journal article

      • Academic credibility

      • Long lead-time, might only reach a narrow group of people

    • Newsletters

      • Quick

      • Disposable

    • Email lists

      • Quicker & targeted to particular groups

      • May not be read


Working through this

  • Committee reports

    • Specific stakeholders; can be used for political mileage

    • Can provoke counter-politics

  • Verbal presentation

    • Quick and easy to do, targeted to particular audiences (responsive)

    • Transient

  • Workshops

    • Allows you to work through issues in detail

    • Time consuming, reaching only small groups


Planning a study

  • Organise yourselves into small groups

    • Decide whose study to focus on

    • Look through the summary plan to see how it’s been described

    • Work through the steps of the toolkit, making notes about how your own study might look (about 15 minutes)

    • Choose whether to try and work through the whole plan or whether to spend most of the time discussing particular sections

    • (You’ll need to come back to this at the end of the session)

    • Two or three groups to volunteer to describe interesting features of their plans


Thinking strategically about evaluation

  • Think through individually

    • Key barriers and enablers – individual, departmental, institutional, and external

    • Who are the key people and committees to target? Think of key people (internal and external), committees, etc. – where does the power lie?

  • Share experiences – in pairs and with group


Thinking strategically about evaluation

  • Example drivers:

    • Quality audit, institutional audit, learning & teaching strategies, operational plans, new appointments of key people, external drivers (e.g. funding)

  • Current examples:

    • The new academy (with Liz Beaty in place)

    • Learning and teaching strategies

    • Using research initiatives as a Trojan horse

    • Beware: different strategies might work at different times; also, some will work in some institutions and not others!


Thinking strategically about dissemination

  • Think about different ways of disseminating

    • What formats to use

    • Who to target

  • Now think about when it would be most effective to do dissemination

    • Draw up relevant lifecycles (e.g. academic & other internal lifecycles, relevant external events)

    • Consider how these can be targeted and used

  • Make a personal list for the study you’ve got in mind

  • Share your timetable in pairs, then with the group


Thinking strategically about dissemination

  • Some things to consider:

    • These might be the same stakeholders as for evaluation, but they might not be!

    • Critical times of the year: when to and when not to disseminate

    • Start of the year, exams, as part of other development activities/events

    • Think ahead of time, and work in things such as using external speakers

    • Not just about presenting: e.g. input into strategic plans, operational plans, etc.

    • Indirect dissemination through others can be very effective!


Thinking personally about evaluation

  • Leaving aside what evaluation can do for your project…

  • …what can evaluation do for you?

  • At a personal level

    • What do you hope to learn?

    • What might you gain?

    • Who do you hope to persuade?

    • What problems might you cause?

  • Spend five minutes writing down a list – this is one you don’t have to share!


Thinking personally about evaluation

  • Evaluation is (should be) a learning experience

  • It’s a chance for you to make connections with people

  • It’s a chance to associate with (or criticise – constructively!) a project

  • It can be a chance to build goodwill by giving good advice or helping solve problems for the project team

  • Reporting findings gets your name in front of funders/policy people/managers/committees

  • You might be able to publish something based on the study


Thinking personally about evaluation

  • These aren’t things that are often talked about

  • If you’re an evaluator, you have power and opportunity – so be honest about it!

  • The potential problem: being professional, and conflicts of interest

    • Would any of these personal aspirations prevent you from doing your role well?

    • Would any affect timeliness, usefulness, how informative the evaluation was, etc.?

    • Which can you pursue, and which might you have to give up in order to do the best job for your clients?


Thinking personally about evaluation

  • So, back to your personal lists:

    • Think creatively about what your evaluation might enable you to do. Are there other people this could help you meet or influence, for example?

    • Think about the tensions between your personal aspirations and what you might call your professional duty – where might conflicts arise?

    • Revise your list of personal aspirations in light of these exercises

    • Are there any examples of things that haven’t been mentioned that you’d be willing to share? (Call them out!)


Being timely

  • You’ve established aims for your project and for yourself

  • You’ve thought about how you’re going to gather the data

  • You’ve thought about who wants to know what

  • So when’s all this going to happen?


Being timely

  • Evaluation is time consuming

    • If you’re lucky, you’ll have a useful plan by the end of today. Some plans take days of discussion time – particularly those that are politically sensitive.

    • Gathering and analysing data takes longer than you think (e.g. 1 hour of interview can take 4 hours of transcription before analysis starts)

    • Writing can be time consuming, especially in teams


Being timely

  • It’s not just the quantity of time, though…

  • Do you need data from students?

    • When will they be able and willing to provide it?

    • Do they disappear just before exams, never to return?

  • Do you need data from staff?

    • Are they busy all term and/or absent all summer?

  • When do you have free time to analyse all this?

    • Can you set aside time as part of your job?

    • Do you need to make time by giving up other things?

  • When does it have to be done by?

    • Which committee will you report to, and when does it meet?


Being timely

  • Are there opportunities or problems on the horizon that might influence what you do?

    • People are often more interested in evaluations just before a quality audit

    • Documents might be useful in gaining ‘points’ for your department if you can provide evidence of furthering strategic priorities or fitting with the learning & teaching strategy

    • You might find a controversial course proposal is helped if you can append an evaluation of potential students’ needs; so can you evaluate these in time?

    • Evidence of success might help in terms of gaining access to funding (internally or externally)


Being timely

  • Think about your study in terms of time; for example:

    • Is the volume of work you’ve planned realistic?

    • Is the timing of work you’ve planned practical?

    • Are there going to be any enforced delays?

    • Are there any important or immovable deadlines?

    • Do you need to catch particular people before they get busy/go on leave/leave, or want to wait until someone new is in post?

  • Use this to sketch out a plan for your study


How do I know impact when I see it?

  • Lots of talk about the process of judging, but so far not much talk about judgement itself

  • What counts as evidence of ‘impact’? (What do we mean by ‘impact’ anyhow?) And when you come to it, what exactly is e-learning, or staff development, or teaching…?

  • We can’t make judgements without making assumptions – so let’s be honest about what we’re assuming


“I was proceeding across campus in an orderly fashion when…”

  • Another task for you to do

  • Imagine you’re a detective, and you have been dispatched to an institution where the awful crime of ‘staff development’ appears to have taken place

  • Your task is to make a case to prove that a particular person or project is responsible for doing this to staff

  • What evidence would you look for? How would you use this to argue guilt?

  • Spend five minutes on your own planning your investigation and case


“I was proceeding across campus in an orderly fashion when…”

  • Now get together in convenient groups

  • Spend a few minutes

    • Each of you present your case

    • The job of the listeners is to point out weaknesses in the case (“Have you checked their alibi? What if they were covering up for someone else? Do they really know what they’re saying?”)


The problem with evidence

  • It’s not always easy to be convincing; e.g.

    • Documenting that things have happened doesn’t tell you why they happened

    • Documenting people’s reasons only gives you their (partial) perspective on a situation

    • Measuring things (e.g. exam performance) tells you nothing about things you might not have measured (e.g. learning). (An aside: think about the rhetoric of models – if they tell you that the world works a certain way, they stop you from looking at things that don’t work that way)


The problem with evidence

  • The importance of triangulation

    • Any kind of evidence (interviews, surveys, etc.) only gives you part of the story

    • Any source of evidence (learners, academics, managers, etc.) only gives you part of the story

    • Comparing and synthesising across partial accounts gives you a fuller (but never full!) story

  • The importance of modest claims

    • “Staff perceived that the workshops changed their lives”

    • “This study demonstrated that re-training staff improved retention. However, it may be that our model is too simplistic, and factors such as the cost of education also had a role to play, even though we could not consider this here.”


Judging things

  • We’d like to hear:

    • Some examples of convincing cases (and why you thought they were convincing)

    • Some examples of unconvincing cases (and why you thought they were unconvincing)


Drawing it all together

  • By way of a recap, we’ve covered:

    • Definitions

    • Principles

    • Strategy (people and politics)

    • Personal politics

    • Timeliness

    • Judgement

  • Are there any other topics you want to raise for discussion at this point?


Drawing it all together

  • Time to make use of all this

    • Go back to the study plans you worked on

    • Discuss how you would change these in light of this afternoon’s work


So, what have you learnt?

  • We’d like you to pause, then share in groups:

    • Any major changes (in terms of focus and approach) in the study you were thinking about

    • Anything you’ve learnt that you didn’t expect

    • Any revelations you’ve had about your personal situation, and how to develop it

  • We’ll then ask groups to share some examples with everyone


The outputs from today

  • What do we think you should have got from this?

    • Formal stuff: papers, overheads

    • Generated stuff: experiences of evaluation, experiences of dissemination, principles for these, a list of personal aspirations

    • Personal stuff: the plans (for projects and for personal aspirations) that you’ve produced

    • Intangible stuff: the contacts over coffee or from group work, the discussions that you’ve had, the concepts you’ve acquired and will take away


The outputs from today

  • What evidence do we have (at least in theory!) that you’ve learnt something from this?

    • The things you just told us you’d learnt!

    • Revisions of your lists: evidence you’ve changed your understanding and beliefs

    • Production of outputs: our co-construction of understanding (e.g. of principles of good evaluation)

  • So could we claim that this session has developed staff…?

    • …and on that note we’ll call a halt!

