
MEASUREMENT

Presentation Transcript


  1. MEASUREMENT

  2. Measurement “If you can’t measure it, you can’t manage it.” Bob Donath, Consultant

  3. Measurement Selecting measurable phenomena Developing a set of mapping rules Applying the mapping rule to each phenomenon

  4. What is to be measured? CONCEPTS/CONSTRUCTS/VARIABLES • Operational Definitions – Dimensions – Elements • Scales

  5. CONCEPT A GENERALIZED IDEA ABOUT A CLASS OF OBJECTS, ATTRIBUTES, OCCURRENCES, OR PROCESSES. (e.g., Technology, Dynamism, Adoption, Learning)

  6. CONSTRUCT • An image or idea specifically invented for a given research and/or theory-building purpose • Higher-level concepts, not directly observable, used for specialised scientific explanation and for thinking about and communicating abstractions • Concepts and constructs are used at the theoretical level

  7. VARIABLE • Used at the empirical level • Accepts numerals or values for the purpose of testing and measurement

  8. OPERATIONAL DEFINITION • A definition for a construct stated in terms of specific criteria for testing or measurement • Specifies what the researcher must do to measure the concept/construct under investigation
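
To make the idea concrete, here is a minimal sketch of how an operational definition might be written down, using a hypothetical construct ("organisational learning") with invented dimensions and questionnaire items; none of these names come from the slides.

```python
# A purely hypothetical operational definition of the construct
# "organisational learning": the construct is broken into dimensions, and each
# dimension into concrete questionnaire items that can actually be measured.
# All dimensions and items here are invented for illustration.
operational_definition = {
    "construct": "organisational learning",
    "scale": "5-point Likert (1 = strongly disagree, 5 = strongly agree)",
    "dimensions": {
        "knowledge acquisition": [
            "Our firm systematically collects information about competitors.",
            "Employees attend external training at least once a year.",
        ],
        "knowledge sharing": [
            "Lessons learned are documented and circulated across departments.",
            "Cross-functional meetings are held to exchange project experience.",
        ],
    },
}

for dimension, question_items in operational_definition["dimensions"].items():
    print(dimension, "->", len(question_items), "items")
```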

  9. SCALE - Definition • A SERIES OF ITEMS • ARRANGED ACCORDING TO VALUE • FOR THE PURPOSE OF QUANTIFICATION • A CONTINUOUS SPECTRUM

  10. SCALE PROPERTIES • UNIQUELY CLASSIFIES • PRESERVES ORDER • DISTANCE/EQUAL INTERVALS • NATURAL ORIGIN (ZERO)

  11. Types of Scales Nominal Ordinal Interval Ratio

  12. NOMINAL SCALE

  13. Nominal Scales • Mutually exclusive and collectively exhaustive categories • Exhibits the classification characteristic only

  14. NOMINAL SCALE PROPERTIES UNIQUELY CLASSIFIES • Male/Female • Academic/Admin • Asian/European • Strategy types • Adopters/Non Adopters
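
As a rough illustration with invented category labels, nominal data support classification and counting only; a frequency table or the mode is meaningful, while an "average category" is not.

```python
from collections import Counter

# Toy nominal data: category labels only (invented). The scale classifies;
# meaningful summaries are frequency counts and the mode, not an average.
staff_category = ["Academic", "Admin", "Academic", "Academic", "Admin"]

counts = Counter(staff_category)
print(counts)                     # Counter({'Academic': 3, 'Admin': 2})
print(counts.most_common(1)[0])   # the mode: ('Academic', 3)
```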

  15. Levels of Measurement • Nominal: Classification • Ordinal • Interval • Ratio

  16. ORDINAL SCALE

  17. Ordinal Scales • Characteristics of the nominal scale plus an indication of order • Implies statements of greater than and less than

  18. ORDINAL SCALE PROPERTIES UNIQUELY CLASSIFIES PRESERVES ORDER • Win, Place & Show • Podium/Grid positions • Ranking
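
A small sketch with invented finishing positions: the order supports statements such as "finished ahead of" and a median, but not the assumption that adjacent ranks are equally far apart.

```python
import statistics

# Toy ordinal data: finishing positions (1 = first), invented for illustration.
# The order is meaningful, so the median is a sensible summary, but the gaps
# between adjacent ranks are not assumed to be equal.
finishing_position = {"Team A": 1, "Team B": 2, "Team C": 3, "Team D": 5, "Team E": 8}

ranks = sorted(finishing_position.values())
print(statistics.median(ranks))   # 3 -- a meaningful "middle" position
# A mean of ranks (3.8 here) would treat the gap between 1st and 2nd as equal
# to the gap between 5th and 8th, which ordinal data does not support.
```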

  19. Levels of Measurement • Nominal: Classification • Ordinal: Classification, Order • Interval • Ratio

  20. INTERVAL SCALE

  21. Interval Scales • Characteristics of nominal and ordinal scales plus the concept of equality of intervals • Equal distances exist between adjacent scale points

  22. INTERVAL SCALE PROPERTIES UNIQUELY CLASSIFIES PRESERVES ORDER EQUAL INTERVALS • Consumer Price Index (Base 100) • Fahrenheit Temperature • Assessment of attitude, beliefs, intention
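
A toy example using Fahrenheit temperatures (values invented): differences between scale points are meaningful, but ratios are not, because the zero point is arbitrary.

```python
# Toy interval data: Fahrenheit temperatures (invented values). Differences are
# meaningful because the intervals are equal, but ratios are not, because
# 0 degrees F is an arbitrary zero rather than a natural origin.
def to_celsius(f):
    return (f - 32.0) * 5.0 / 9.0

cool_day_f, warm_day_f = 40.0, 80.0

print(warm_day_f - cool_day_f)   # 40.0 -- a meaningful difference
print(warm_day_f / cool_day_f)   # 2.0 -- but NOT "twice as hot"

# Re-expressing the same temperatures in Celsius keeps the order and the equal
# intervals but changes the ratio, showing the ratio was never meaningful.
print(to_celsius(warm_day_f) / to_celsius(cool_day_f))   # about 6.0, not 2.0
```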

  23. Levels of Measurement • Nominal: Classification • Ordinal: Classification, Order • Interval: Classification, Order, Distance • Ratio

  24. RATIO SCALE

  25. Ratio Scales • Characteristics of previous scales plus an absolute zero point • Examples • Weight • Height • Number of children

  26. RATIO SCALE PROPERTIES UNIQUELY CLASSIFIES PRESERVES ORDER EQUAL INTERVALS NATURAL ZERO • Weight and distance • Age, years of service
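
A brief sketch with invented weights: because a ratio scale has a natural zero, statements such as "twice as heavy" are meaningful and survive a change of units.

```python
# Toy ratio data: weight (invented values) has a natural zero, so ratios such
# as "twice as heavy" are meaningful and survive a change of units.
weights_kg = {"parcel A": 5.0, "parcel B": 10.0}

print(weights_kg["parcel B"] / weights_kg["parcel A"])   # 2.0 -- twice as heavy

weights_lb = {name: kg * 2.20462 for name, kg in weights_kg.items()}
print(weights_lb["parcel B"] / weights_lb["parcel A"])   # still 2.0 in pounds
```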

  27. Levels of Measurement • Nominal: Classification • Ordinal: Classification, Order • Interval: Classification, Order, Distance • Ratio: Classification, Order, Distance, Natural Origin

  28. Goodness of Measures • Accurately measures the concept one actually sets out to measure • Should be easy and efficient to use

  29. Criteria for Evaluating Measurement Tools • Validity • Reliability • Practicality

  30. Validity The ability of a scale to measure what was intended to be measured.

  31. Validity Determinants • Content • Criterion • Construct

  32. Increasing Content Validity • Question Database • Literature Search • Expert Interviews • Group Interviews

  33. … Goodness of Measures • Validity • Content Validity • Includes Face Validity • How well the dimensions and elements of a concept have been delineated • Professional agreement that a scale logically appears to accurately measure what it is intended to measure

  34. Validity Determinants • Content • Construct

  35. Increasing Construct Validity • New measure of trust • Known measure of trust • Empathy • Credibility

  36. … Goodness of Measures • Construct Validity – when empirical evidence generated by a measure is consistent with the theoretical logic of the concept • Convergent Validity – the measure should “converge” with other similar measures • Discriminant Validity – when the measure has a low correlation with measures of dissimilar concepts
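
One way to probe convergent and discriminant validity is to correlate the new measure with an established measure of the same construct and with a measure of an unrelated one. The sketch below uses invented scores for eight respondents; "shoe_size" stands in for a dissimilar concept.

```python
import numpy as np

# Invented scores for eight respondents; none of these numbers come from the slides.
new_trust   = np.array([4.2, 3.1, 4.8, 2.5, 3.9, 4.5, 2.8, 3.5])
known_trust = np.array([4.0, 3.3, 4.6, 2.7, 3.8, 4.4, 3.0, 3.6])   # established trust scale
shoe_size   = np.array([43, 42, 41, 39, 37, 38, 40, 44], dtype=float)  # dissimilar concept

# Convergent validity: the new measure should correlate highly with an
# established measure of the same construct.
print(np.corrcoef(new_trust, known_trust)[0, 1])   # high, close to +1 here

# Discriminant validity: it should correlate weakly with a measure of a
# dissimilar concept.
print(np.corrcoef(new_trust, shoe_size)[0, 1])     # near zero here
```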

  37. Judging Criterion Validity • Relevance • Freedom from bias • Reliability • Availability

  38. Criterion-Related Validity • Concurrent Validity – when the scale discriminates individuals who are known to be different; a type of criterion validity whereby a new measure correlates with a criterion measure taken at the same time • Predictive Validity – the ability to differentiate among individuals with reference to a future criterion
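
A minimal sketch of the predictive-validity logic, with invented numbers: scores on a selection test at hiring are correlated with a performance rating collected later; for concurrent validity the criterion would simply be measured at the same time.

```python
import numpy as np

# Invented data: selection-test scores at hiring and a supervisor's
# performance rating collected one year later.
test_score_at_hiring = np.array([62, 75, 58, 88, 70, 81, 66, 93], dtype=float)
performance_one_year_later = np.array([3.1, 3.6, 2.9, 4.4, 3.4, 4.0, 3.2, 4.6])

# Predictive validity: a strong correlation with the future criterion suggests
# the test differentiates among individuals on that criterion.
print(np.corrcoef(test_score_at_hiring, performance_one_year_later)[0, 1])

# Concurrent validity uses the same logic, except the criterion measure is
# collected at the same time as the new measure.
```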

  39. VALIDITY

  40. Reliability The degree to which measures are free from random error and therefore yield consistent results.

  41. Reliability Estimates • Stability • Internal Consistency • Equivalence

  42. Reliability Estimates • Stability • Internal Consistency • Equivalence

  43. … Reliability • Stability of Measures • Test-Retest Reliability – the administering of the same scale or measure to the same respondents at two separate points in time • Parallel-Form Reliability - when responses on two comparable sets of measures tapping the same construct are highly correlated
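
A small sketch of test-retest reliability with invented scores: the same respondents answer the same scale twice, and a high correlation between the two administrations indicates stability; parallel-form reliability follows the same pattern with a comparable second instrument.

```python
import numpy as np

# Invented scores: the same six respondents answer the same scale at two
# points in time. A high correlation indicates a stable (reliable) measure.
time_1 = np.array([4.0, 2.5, 3.5, 5.0, 3.0, 4.5])
time_2 = np.array([3.8, 2.7, 3.6, 4.9, 3.1, 4.4])

print(np.corrcoef(time_1, time_2)[0, 1])   # high -> good test-retest reliability

# Parallel-form reliability follows the same pattern, except time_2 would hold
# scores from a comparable (parallel) version of the instrument.
```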

  44. Reliability Estimates • Stability • Internal Consistency • Equivalence

  45. Equivalence • Concerned with variation at one point in time among observers and among samples of items
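
One common way to examine equivalence across observers is inter-rater agreement. The sketch below uses invented classifications from two observers and computes simple percent agreement plus Cohen's kappa, a chance-corrected index chosen here for illustration rather than named in the slides.

```python
from collections import Counter

# Invented classifications of the same ten episodes by two observers.
observer_1 = ["adopt", "adopt", "reject", "adopt", "reject",
              "adopt", "reject", "adopt", "adopt", "reject"]
observer_2 = ["adopt", "adopt", "reject", "reject", "reject",
              "adopt", "reject", "adopt", "adopt", "adopt"]

n = len(observer_1)
p_observed = sum(a == b for a, b in zip(observer_1, observer_2)) / n   # raw agreement

# Cohen's kappa corrects observed agreement for agreement expected by chance,
# using each observer's marginal category frequencies.
c1, c2 = Counter(observer_1), Counter(observer_2)
labels = set(observer_1) | set(observer_2)
p_expected = sum(c1[label] * c2[label] for label in labels) / n ** 2
kappa = (p_observed - p_expected) / (1 - p_expected)

print(p_observed, round(kappa, 2))   # agreement is lower after chance correction
```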

  46. Reliability Estimates • Stability • Internal Consistency • Equivalence

  47. … Reliability • Internal Consistency Measures • Interitem Consistency Reliability - where items that are independent measures of the same concept correlate with one another • Split-Half Reliability - reflects the correlation between the two halves of an instrument
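
The sketch below, with an invented 4-item scale answered by six respondents, computes Cronbach's alpha as one interitem-consistency coefficient and a split-half estimate stepped up to full length with the Spearman-Brown formula.

```python
import numpy as np

# Invented responses to a 4-item scale from six respondents
# (rows = respondents, columns = items).
items = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 3, 4],
    [4, 4, 5, 4],
    [1, 2, 2, 1],
], dtype=float)

k = items.shape[1]

# Cronbach's alpha: a common interitem-consistency coefficient.
item_variances = items.var(axis=0, ddof=1)       # variance of each item
total_variance = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
alpha = (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Split-half reliability: correlate the two halves of the instrument, then
# step the result up to full length with the Spearman-Brown formula.
half_1 = items[:, :2].sum(axis=1)
half_2 = items[:, 2:].sum(axis=1)
r_halves = np.corrcoef(half_1, half_2)[0, 1]
split_half = 2 * r_halves / (1 + r_halves)

print(round(alpha, 2), round(split_half, 2))
```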

  48. RELIABILITY

  49. Understanding Validity and Reliability

  50. Practicality • Economy • Interpretability • Convenience
