
  1. Chapter 4 Process Measurement

  2. Process Metrics • Measurement – the act of quantifying the performance dimensions of products, services, processes, and other business activities. • Measures and indicators – numerical information that results from measurement • Defects/unit • Errors/opportunity • dpmo (defects per million opportunities)
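
To make the relationship among these three metrics concrete, here is a minimal Python sketch with hypothetical counts (500 units, 4 defect opportunities per unit, 30 defects found):

    # Hypothetical counts, for illustration only.
    units = 500
    opportunities_per_unit = 4   # distinct ways each unit can be defective
    defects = 30

    defects_per_unit = defects / units                                   # 0.06
    errors_per_opportunity = defects / (units * opportunities_per_unit)  # 0.015
    dpmo = errors_per_opportunity * 1_000_000                            # 15,000

    print(defects_per_unit, errors_per_opportunity, dpmo)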

  3. Types of Metrics • Discrete metric – something that is countable, such as the number of defects or errors • Continuous metric – something measured on a continuous scale, such as the degree of conformance to specifications

  4. Effective Metrics • SMART: • Simple • Measurable • Actionable (they provide a basis for decision-making) • Related (to customer requirements and to each other) • Timely

  5. Identifying and Selecting Process Metrics • Identify all customers and their requirements and expectations • Define work processes • Define value-adding activities and process outputs • Develop measures for each key process • Evaluate measures for their usefulness

  6. Data Collection • Key Questions • What questions are we trying to answer? • What type of data will we need to answer the question? • Where can we find the data? • Who can provide the data? • How can we collect the data with minimum effort and with minimum chance of error?

  7. Operational Definition • Clear and unambiguous definition of a metric, e.g.: • On-time delivery • Error
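
The point of an operational definition is that anyone applying it gets the same answer. A sketch, assuming "on time" means delivered on or before the promised date (that cutoff is itself part of the definition and must be agreed on up front):

    from datetime import date

    # Operational definition: a delivery is "on time" if and only if it
    # arrives on or before the promised date. No other interpretation.
    def is_on_time(promised: date, delivered: date) -> bool:
        return delivered <= promised

    print(is_on_time(date(2024, 5, 1), date(2024, 5, 1)))  # True
    print(is_on_time(date(2024, 5, 1), date(2024, 5, 2)))  # False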

  8. Process Capability • The range over which the natural variation of a process occurs as determined by the system of common causes • Measured by the proportion of output that can be produced within design specifications

  9. Process Capability Study • Choose a representative machine or process • Define the process conditions • Select a representative operator • Provide the right materials • Specify the gauging or measurement method • Record the measurements • Construct a histogram and compute descriptive statistics: mean and standard deviation • Compare results with specified tolerances
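
The last two steps reduce to simple descriptive statistics. A sketch on hypothetical measurements:

    import statistics

    # Hypothetical measurements recorded during a capability study.
    x = [10.02, 9.98, 10.05, 9.97, 10.01, 10.03, 9.99, 10.00]

    mean = statistics.mean(x)
    sd = statistics.stdev(x)   # sample standard deviation
    print(f"mean = {mean:.4f}, std dev = {sd:.4f}")
    # The natural variation, roughly mean +/- 3*sd, is then compared
    # with the specified tolerances (see the indexes on slide 12).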

  10. Process Capability [figure: four panels (a)–(d), each comparing the natural variation of a process with the specification limits; not reproduced in the transcript]

  11. Process Capability Index The process capability index, Cp (sometimes called the process potential index), is defined as the ratio of the specification width to the natural tolerance of the process. Cp relates the natural variation of the process with the design specifications in a single, quantitative measure.

  12. Calculating Process Capability Indexes • Cp = (UTL - LTL) / 6σ • Cpu = (UTL - μ) / 3σ • Cpl = (μ - LTL) / 3σ • Cpk = min{ Cpl, Cpu }
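
Applied directly, with hypothetical specification limits and process parameters:

    # Hypothetical values: spec limits 9.5-10.5, process mean 10.1, sigma 0.12.
    UTL, LTL = 10.5, 9.5
    mu, sigma = 10.1, 0.12

    Cp  = (UTL - LTL) / (6 * sigma)   # 1.39: the process could fit the specs
    Cpu = (UTL - mu)  / (3 * sigma)   # 1.11
    Cpl = (mu - LTL)  / (3 * sigma)   # 1.67
    Cpk = min(Cpu, Cpl)               # 1.11: limited by the upper side
    print(Cp, Cpu, Cpl, Cpk)

Because this hypothetical process is off-center, Cpk comes out lower than Cp.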

  13. Types of Capability Studies • Peak performance study – how a process performs under ideal conditions • Process characterization study – how a process performs under actual operating conditions • Component variability study – relative contribution of different sources of variation (e.g., process factors, measurement system)

  14. Spreadsheet Template

  15. Dashboards and Scorecards • Dashboard – collection of key operational measures • Graphs, charts, visual aids • Daily information for management and control • Balanced Scorecard – summary of broad performance measures across the organization • Strategic guidance

  16. Check Sheets Check sheets are special types of data collection forms in which the results may be interpreted on the form directly, without additional processing.

  17. Check Sheet • Creates easy-to-understand data • Builds, with each observation, a clearer picture of the facts • Forces agreement on the definition of each condition or event of interest • Makes patterns in the data become obvious quickly
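
A check sheet is easy to mimic in code. A sketch with hypothetical defect categories, printing the familiar tally pattern:

    from collections import Counter

    # Each entry is one observation, classified into an agreed-upon category.
    observations = ["scratch", "dent", "scratch", "misalignment",
                    "scratch", "dent", "scratch"]

    for defect, count in Counter(observations).most_common():
        print(f"{defect:<13} {'x' * count}")
    # scratch       xxxx
    # dent          xx
    # misalignment  x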

  18. Sampling • What is the objective of the study? • What type of sample should be used? • What possible error might result from sampling? • What will the study cost?

  19. Sampling Methods • Simple random sampling • Stratified sampling • Systematic sampling • Cluster sampling • Judgment sampling
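
A sketch of the first three methods on a hypothetical population of 100 items (cluster and judgment sampling depend on context and are omitted):

    import random

    random.seed(1)                    # reproducible illustration
    population = list(range(1, 101))

    # Simple random: every item equally likely to be chosen.
    simple = random.sample(population, 10)

    # Systematic: every k-th item after a random start.
    k = 10
    systematic = population[random.randrange(k)::k]

    # Stratified: sample separately within predefined strata (here, halves).
    strata = [population[:50], population[50:]]
    stratified = [x for s in strata for x in random.sample(s, 5)]

    print(simple, systematic, stratified, sep="\n")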

  20. Selecting a Sampling Plan A good sampling plan should select a sample at the lowest cost that will provide the best possible representation of the population, consistent with the objectives of precision and reliability that have been determined for the study.

  21. Sampling Error • Sampling error (statistical error) • Nonsampling error (systematic error) • Factors to consider: • Sample size • Appropriate sample design
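
Sampling error shrinks as the sample grows. One standard (and here assumed) way to size a sample for estimating a mean to a given margin of error:

    import math

    z = 1.96      # ~95% confidence
    sigma = 4.0   # assumed population standard deviation (hypothetical)
    E = 1.0       # desired margin of error

    n = math.ceil((z * sigma / E) ** 2)
    print(n)      # 62 observations needed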

  22. Data Classification • Type of data • Cross-sectional—data that are collected over a single period of time • Time series—data collected over time • Number of variables • Univariate—data consisting of a single variable • Multivariate—data consisting of two or more (often related) variables

  23. Sample Statistics
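
The slide's formulas are not reproduced in the transcript; as a stand-in, here is a sketch of the usual sample statistics on hypothetical data:

    import statistics

    x = [12.1, 11.8, 12.4, 12.0, 11.9, 12.2]
    print("mean     =", statistics.mean(x))
    print("median   =", statistics.median(x))
    print("std dev  =", statistics.stdev(x))      # (n - 1) denominator
    print("variance =", statistics.variance(x))
    print("range    =", max(x) - min(x))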

  24. Excel Tools for Descriptive Statistics • Tools… Data Analysis… Descriptive Statistics • Tools… Data Analysis… Histogram

  25. Measurement System Evaluation • Whenever variation is observed in measurements, some portion is due to measurement system error. Some errors are systematic (called bias); others are random. The size of the errors relative to the measurement value can significantly affect the quality of the data and resulting decisions.

  26. Metrology – Science of Measurement • Accuracy – closeness of agreement between an observed value and a standard • Precision – closeness of agreement between randomly selected individual measurements
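
The distinction is easiest to see numerically. A sketch with two hypothetical gauges measuring a known 10.00 standard:

    import statistics

    standard = 10.00
    gauge_a = [10.01, 9.99, 10.00, 10.02, 9.98]    # accurate and precise
    gauge_b = [10.30, 10.31, 10.29, 10.30, 10.31]  # precise but biased

    for name, readings in [("A", gauge_a), ("B", gauge_b)]:
        bias = statistics.mean(readings) - standard  # accuracy (systematic)
        spread = statistics.stdev(readings)          # precision (random)
        print(f"gauge {name}: bias = {bias:+.3f}, spread = {spread:.4f}")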

  27. Repeatability and Reproducibility • Repeatability (equipment variation) – variation in multiple measurements by an individual using the same instrument • Reproducibility (operator variation) – variation in the same measuring instrument used by different individuals

  28. Repeatability & Reproducibility Studies • Quantify and evaluate the capability of a measurement system • Select m operators and n parts • Calibrate the measuring instrument • Randomly measure each part by each operator for r trials • Compute key statistics to quantify repeatability and reproducibility

  29. Spreadsheet Template

  30. R&R Evaluation • Under 10% error - OK • 10-30% error - may be OK • over 30% error - unacceptable

  31. Calibration One of the most important functions of metrology is calibration—the comparison of a measurement device or system having a known relationship to national standards against another device or system whose relationship to national standards is unknown.

  32. Benchmarking • Benchmarking – “the search for industry best practices that lead to superior performance.” • Best practices – approaches that produce exceptional results, are usually innovative in terms of the use of technology or human resources, and are recognized by customers or industry experts.

  33. Types of Benchmarking • Competitive benchmarking – studying products, processes, or business performance of competitors in the same industry to compare pricing, technical quality, features, and other quality or performance characteristics of products and services • Process benchmarking – focus on key work processes • Strategic benchmarking – focus on how companies compete and strategies that lead to competitive advantage

  34. Project Review – Measure (1 of 2) • Team members have received any necessary “just-in-time” training • Key metrics for all CTQ (critical-to-quality) characteristics have been defined • The team has determined what aspects of the problem need to be measured, including both process and results measures • Operational definitions of all measurements have been developed • All appropriate sources of data have been investigated, and a data collection plan established before data are collected

  35. Project Review – Measure (2 of 2) • Data collection forms have been tested and validated • Sample sizes required for statistical precision have been identified • Data have been collected in an appropriate fashion, according to plan • The data are accurate and reliable • Measurement systems have been evaluated using R&R studies or other appropriate tools • Process capability has been addressed as appropriate • Benchmarks and best-practice information have been collected
