Putting Statistics into Practice - Strategies for effective management

Presentation Transcript


  1. Putting Statistics into Practice - Strategies for effective management J Eric Davies & Claire Creaser

  2. Options for measuring & managing J Eric Davies

  3. Outline • Mission, vision, aims, objectives • Range of data types • Applications of data • Methods of acquiring data • General principles

  4. Measuring with meaning • Every organisation, no matter what its mission or scope, needs three kinds of performance metrics: • to measure its success in mobilizing its resources, • its staff’s effectiveness on the job, and • its progress in fulfilling its mission. • McKinsey Quarterly, 2001, no. 2

  5. Mission with meaning STRATEGIC FOCUS - WHAT? • Organisation? NOW / LATER • Service? NOW / LATER • Direction? NOW / LATER • Mission / vision • Aims / objectives

  6. Criteria AIMS/OBJECTIVES • Specific, Measurable, Acceptable, Realistic, Time-bound {SMART} • Consistent, Unambiguous, Testing, Empowering {CUTE}

  7. How (im)possible is your mission? MISSION • Statement of purpose and functions – why service exists, what it does, who it serves VISION • Statement of desired future state – where service wants to be

  8. How (im)possible is your mission? MISSION/VISION STATEMENTS ~ • MEANING • CREDIBILITY • ACCEPTABILITY • TESTABILITY

  9. How (im)possible is your mission? MISSION/VISION STATEMENTS ~ Strathclyde University – Glasgow • A place of useful learning [1796] • The place of useful learning [2000]

  10. Here’s one I prepared earlier! MISSION/VISION STATEMENTS ~ • Public Library - The mission of the Library is to serve as a cathedral of human knowledge— an accessible database of knowledge that serves as the community's memory—and as an information and knowledge safety net, while providing materials, programs, and services to the people of the community.

  11. Digging deep for evidence What Kind of Evidence [Information]? • Statistics and Performance Indicators [Quantitative + Qualitative] • Social Measures [Soft Indicators] ‘DISTANCE TRAVELLED’

  12. Who cares {or should do}? AUDIENCE: [stakeholders] • Funders • Managers/Staff • Users • Community • Vendors • Global

  13. All kinds of measuring • Inputs, Outputs {SERVICE DOMAIN} • Outcomes, Impacts {USER RESPONSES}

  14. Social dimensions Examples of ‘Soft’ Indicators:- • Attitudinal • Personal • Practical • Key Work Skills DISTANCE TRAVELLED

  15. Social dimensions • Personal development - individual self-confidence, self-awareness, creativity, new skills and abilities. • Social cohesion - impact on group/community identity. • Community involvement and empowerment. • Health - people feeling better, happier, etc.

  16. How does the evidence add up? APPLICATIONS ~ SERVICES & PROJECTS • Policies • Strategies • Tactics • Processes and Operations • Advocacy

  17. What does the evidence answer? APPLICATIONS ~ SERVICES • How have we done? • How are we doing now? • How can we do better? • Where are we going? • How do we get there? • How are we making a difference? • How do we get the resources?

  18. What does the evidence answer? APPLICATIONS ~ PROJECTS • Did we achieve what we were seeking to achieve? • Did we do what we said we would do? • How did we do it? • What did we use? • What did we get out? • What worked and what didn’t work? • What could we do differently? • What can we apply continuously? • What difference did it make that we did it? • Who benefited?

  19. Managing and measuring Framework for Performance Measurement:- • integration • user satisfaction • effectiveness (delivery) • efficiency • economy Follett Report – academic libraries

  20. Managing and measuring Three E’s • Economy in acquisition of resources • Efficiency in the use of resources • Effectiveness in the achievement of objectives UK Treasury FMI [1980s]; Sizer [1980s]

  21. Comparing and changing • BENCHMARKING • Motorola + D.E.C. + Xerox • To make changes that lead to quantum and continuous improvements in products, processes and services that result in total customer satisfaction and competitive advantage

  22. Comparing and changing BENCHMARKING – • Evaluate the level of performance of various services within an institution • Overall level of institution performance • Compare against published standards • Compare performance over time • Compare with other institutions

  23. Finding out Gathering Evidence - • What do you need to know? • Where is the information? • Who has the information? • How will you get it? • How accurate is it / do you need it to be? • How will you interpret it? • How will you act on it? • How will you present it?

  24. Gathering evidence Techniques/Tools/Options for Gathering Data:- • MIS / Transaction Logs • Databases / Publications • Surveys: questionnaire, telephone, interview • Focus Groups / Graffiti boards • Observation / Diaries / Logs • Press/Media Coverage

  25. Gathering evidence TOOLS:- • What Outcomes, Dimensions, Performance to be measured? • reliable + valid • meaningful and precise

  26. Gathering evidence Options for Gathering Data - • … if the only tool you have is a hammer, everything starts looking like a nail. F.W. Huibregston - Partner: McKinsey’s

  27. Changing times; changing evidence UPDATING EVIDENCE • Service Evolution • New/Discontinued services - methods - technologies - clients • Diminishing Variance • Improvement - Gaming - Deception

  28. Manager beware!! OVERDOING IT: • If you know everything, you know nothing. George Johnson: Fire in the Mind [1996] • … a world that never measures or counts is really beyond our control. The trouble is that we’re in danger of doing little else. David Boyle: RSA Lecture [2001]

  29. How much evidence? ... data is not information. Information is data endowed with relevance and purpose. A company must decide what information it needs to operate its affairs, otherwise it will drown in data Peter Drucker - Managing for the Future.

  30. Making sense of measuring Sumsion’s Law of Statistical Dullness ~ In comparative statistics the great majority of results are inherently close to the average and consequently dull. {Sumsion} LIRN 2001 (79) p.3.

  31. Evidence for yesterday Statistics, being essentially historical, can only provide information after the event. {Sumsion} Int. Encyclopedia of Lib. and Info. Sci. (1997) p.432

  32. Measuring and managing Information is a precondition for identifying choices, reducing uncertainty about their implications and facilitating their implementation. Center for Transnational Corporations: CTC Reporter 14, Winter 1983, p.34

  33. Managing and measuring; comparing and changing • LISU: We’ve got the measure of information! • A skilled team of experienced Managers, Statisticians and Administrators all adding value to statistical data and providing authoritative and reliable information to support managers in culture, information and related environments.

  34. Library and Information Statistics Unit lisu@lboro.ac.uk

  35. Mission possibilities • Does it have: • Meaning - does it actually mean anything? • Credibility - do you believe it can be achieved? • Acceptability - will all the stakeholders (funders, staff, users) ‘buy in’ to this mission? • Testability - how would you demonstrate you are achieving your mission?

  36. Library and Information Statistics Unit lisu@lboro.ac.uk

  37. Statistics for the faint-hearted Claire Creaser CStat

  38. Introduction to statistics • Basics • What are statistics? • Useful techniques • Sampling • Surveys and sample sizes • Questionnaire design • Analysis • Benchmarking • Presentation of results

  39. What are statistics? • Numbers with context • 1,300 items issued last month • The average price paid for a CD is £12.50 • 25% of staff time is spent re-shelving books • Women borrow twice as many books on average as men • Serials cost three times as much as books • The average spend per user has increased less than general inflation over the last ten years

  40. Where to start • What do you want to know? • Evidence of good management • Value for money • Advocacy • What data to collect? • What do you want to know? • Relevant • Useful • Current

  41. Where do they come from? • Library management systems • Stock statistics, financial data, staff . . . . • Regular surveys • User opinions, condition of stock . . . . • Occasional surveys • Project evaluation

  42. What do they look like? • Categorical • Gender; classmark; membership status • Ordinal • Stock condition; satisfaction ratings • Ratio or interval • Acquisitions; issues; expenditure
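By way of a rough illustration of these three data types, here is a small sketch in Python (pandas assumed, and every column name and value invented) showing one way to mark a column as categorical, ordinal, or numeric.

    import pandas as pd

    # Hypothetical loans table illustrating the three data types on this slide
    loans = pd.DataFrame({
        "membership_status": ["student", "staff", "external", "student"],  # categorical
        "stock_condition": ["poor", "good", "fair", "good"],               # ordinal
        "items_issued": [3, 1, 7, 2],                                      # ratio / interval
    })

    # Categorical: unordered labels
    loans["membership_status"] = loans["membership_status"].astype("category")

    # Ordinal: categories with a meaningful order
    loans["stock_condition"] = pd.Categorical(
        loans["stock_condition"], categories=["poor", "fair", "good"], ordered=True
    )

    print(loans.dtypes)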

  43. What can you do with them? • What do you want to know? • Descriptive statistics • Mean, range, distributions, proportions • graphical presentations • Inference from samples • Estimates, error levels • Advanced techniques • Correlation, regression, analysis of variance
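As a minimal illustration of inference from a sample, the sketch below (standard-library Python, figures invented) turns a survey count into an estimate with an approximate 95 per cent error level, using the usual normal approximation for a proportion.

    import math

    # Hypothetical result: 60 of 240 users surveyed visited the library last week
    n, visited = 240, 60
    p = visited / n                           # the estimate (a proportion)
    moe = 1.96 * math.sqrt(p * (1 - p) / n)   # approximate 95% error level

    print(f"estimate: {p:.0%}, margin of error: ±{moe:.1%}")
    # -> estimate: 25%, margin of error: ±5.5%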

  44. Choosing the right technique • Keep it simple! • Categorical data • Proportions in each category • Comparisons • Ordinal data • Proportions in each category • Medians • Ratio data • Means
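A short Python sketch of the 'keep it simple' advice above, matching the summary statistic to the data type; all of the response values are invented for illustration.

    import statistics
    from collections import Counter

    status = ["student", "staff", "student", "external", "student"]  # categorical
    satisfaction = [4, 5, 3, 4, 2, 5, 4]                             # ordinal, 1 (poor) to 5 (excellent)
    spend = [12.50, 9.99, 15.00, 7.25]                               # ratio (£ per item)

    # Categorical -> proportions in each category
    proportions = {k: v / len(status) for k, v in Counter(status).items()}
    print("proportions:", proportions)

    # Ordinal -> median (a mean of ranked codes is not meaningful)
    print("median satisfaction:", statistics.median(satisfaction))

    # Ratio -> mean
    print("mean spend: £{:.2f}".format(statistics.mean(spend)))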

  45. Sample surveys • Why sample? • Cost • Practicalities • Where to start? • Sampling frame • Sample design • Sample size

  46. Types of sample • Simple random • Systematic • Stratification and clustering • Quota samples • Self-selected
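To make the first three designs concrete, here is a small Python sketch drawing them from an invented frame of 1,000 borrowers split across two sites; quota and self-selected samples are not probability samples, so they are not drawn this way.

    import random

    # Hypothetical sampling frame: 600 central-site and 400 branch borrowers
    frame = [("central", i) for i in range(600)] + [("branch", i) for i in range(600, 1000)]

    # Simple random sample of 50
    simple = random.sample(frame, 50)

    # Systematic sample: every k-th record from a random starting point
    k = len(frame) // 50
    start = random.randrange(k)
    systematic = frame[start::k]

    # Stratified sample: a simple random sample within each stratum (site)
    stratified = []
    for site in ("central", "branch"):
        stratum = [record for record in frame if record[0] == site]
        stratified.extend(random.sample(stratum, 25))

    print(len(simple), len(systematic), len(stratified))  # 50 50 50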

  47. How many? • Less than you think! • Depends on: • Level of detail • Desired margin of error • Expected response rate • Does not depend on population size • Unless small population • 400 will give accuracy of ± 5% • 1,000 for ± 3% • 2,500 for ± 2%
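The rules of thumb above follow from the standard formula for the margin of error of a proportion; the sketch below reproduces them and shows one way to allow for an expected response rate (the 40 per cent figure is purely illustrative).

    import math

    def margin_of_error(n, p=0.5, z=1.96):
        # Approximate 95% margin of error for a proportion from a simple random sample
        return z * math.sqrt(p * (1 - p) / n)

    for n in (400, 1000, 2500):
        print(n, "-> ±{:.1%}".format(margin_of_error(n)))
    # 400 -> ±4.9%, 1000 -> ±3.1%, 2500 -> ±2.0%

    def questionnaires_to_issue(target_moe, expected_response_rate, p=0.5, z=1.96):
        # Invert the formula and inflate for non-response
        n = (z ** 2) * p * (1 - p) / target_moe ** 2
        return math.ceil(n / expected_response_rate)

    print(questionnaires_to_issue(0.05, 0.4))  # ~961 to issue for ±5% at a 40% response rate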

  48. Questionnaire design • Self-completion or interview? • Clear, unambiguous questions • Clear, easy to follow layout • As short as possible • Number of questions • Number of pages • Tick boxes or short answers • Data entry issues

  49. Sampling times • One period, or several? • Periodicity
