Presentation to: EUROMET workshop TEMPMEKO 2004

Presentation Transcript



Presentation to: EUROMET workshop TEMPMEKO 2004

Uncertainty in measurements and calibrations - an introduction

Stephanie Bell

UK National Physical Laboratory



Some facts about measurement uncertainty

(“the Four Noble Truths”!)

  • Every measurement is subject to some uncertainty.

  • A measurement result is incomplete without a statement of the uncertainty.

  • When you know the uncertainty in a measurement, then you can judge its fitness for purpose.

  • Understanding measurement uncertainty is the first step to reducing it.



Understanding the terminology

uncertainty of measurement

Rigorous definition: [ISO GUM]

parameter, associated with the result of a measurement, that characterises the dispersion of the values that could reasonably be attributed to the measurand.

NOTES

1. The parameter may be, for example, a standard deviation, or the half-width of an interval having a stated level of confidence

2. Uncertainty of measurement can comprise many components

3. It is understood that the result of the measurement is the best estimate of the value of the measurand and that all components of uncertainty contribute to the dispersion

Simplified definition:

quantified doubt about the result of a measurement



Error versus uncertainty

It is important not to confuse the terms error and uncertainty

Error is the difference between the measured value and the “true value” of the thing being measured

Uncertainty is a quantification of the doubt about the measurement result

In principle, errors can be known and corrected.

But any error whose value we do not know is a source of uncertainty.


Approach to evaluating uncertainty of measurement

Input – formulation – evaluation

  • Inputs to an uncertainty estimate are the list/knowledge of the sources of uncertainty in a measurement [the model]

  • Formulation of an uncertainty estimate - “defining the uncertainty calculation” (e.g. in the form of a spreadsheet)

  • Evaluation - making a calculation (e.g. using the spreadsheet) to get a value of estimated uncertainty for a particular measurement under particular conditions



Eight main steps to evaluating uncertainty (homage to the GUM)

1. Decide what you need to find out from your measurements, and what actual measurements and calculations are needed to produce the final result. (The Model)

2. Carry out the measurements needed.

3. Estimate the uncertainty of each input quantity that feeds into the final result. Express all uncertainties in consistent terms.



Eight main steps to evaluating uncertainty

4. Decide whether the errors of the input quantities are independent of each other. If you think not, then some extra calculations or information are needed (correlation).

5. Calculate the result of your measurement (including any known corrections for things such as calibration).

6. Find the combined standard uncertainty from all the individual parts.



Eight main steps to evaluating uncertainty

7. Express the uncertainty in terms of the size of the uncertainty interval, together with a coverage factor, and state a level of confidence.

8. Record the measurement result and the uncertainty, and state how you got both of these.



“Eight steps” to evaluating uncertainty of measurement

  • Think ahead

  • Measure*

  • Estimate uncertainty components

  • (Consider correlation)

  • Calculate results (inc. known corrections)

  • Find the combined uncertainty

  • Express the result (confidence interval, confidence level, coverage factor)

  • Document it


“The Noble Eightfold Path to Enlightenment”


“Eight steps” to evaluating uncertainty of measurement (recap)



Where do errors and uncertainties come from?

  • The measuring instrument - bias, drift, noise, etc.

  • The condition being measured - which may not be stable.

  • The measurement process - may be problematic

  • (more ….)



Where do errors and uncertainties come from?

  • (cont ….)

  • ‘Imported’ uncertainties - e.g. calibration uncertainty

  • (But the uncertainty due to not calibrating is worse.)

  • Operator skill

  • Sampling issues

  • The environment - temperature, air pressure, etc.

  • … and others


“Eight steps” to evaluating uncertainty of measurement (recap)



Basic statistics on sets of numbers

From repeated measurements you can:

  • Calculate an average or mean – to get a better estimate of the true value

  • Calculate a standard deviation – to show the spread of the readings. This tells you something about the uncertainty of your result
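For instance, here is a minimal Python sketch of these two calculations on a few invented repeat readings (using only the standard-library statistics module):

```python
import statistics

# Ten invented repeat readings of a temperature, in degrees C
readings = [20.05, 20.02, 20.08, 20.03, 20.06, 20.04, 20.07, 20.05, 20.03, 20.06]

mean = statistics.mean(readings)   # best estimate of the value
s = statistics.stdev(readings)     # sample standard deviation: the spread of the readings

print(f"mean = {mean:.3f} degC, s = {s:.4f} degC")
```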



Basic statistics on sets of numbers

Sample standard deviation (of n readings xi with mean x̄): s = √[ Σ (xi - x̄)² / (n - 1) ]

Population standard deviation: σ = √[ Σ (xi - μ)² / n ]



Evaluating uncertainty

The two ways to evaluate individual uncertainty contributions:

  • Type A evaluations - uncertainty estimates using statistics (usually from repeated readings)

  • Type B evaluations - uncertainty estimates from any other information, e.g. from

    • past experience of the measurements

    • calibration certificates

    • manufacturer’s specifications

    • calculations

    • published information

    • and from common sense.

  • Not necessarily “random versus systematic”



Blob charts

“Normal” or Gaussian distribution



Blob charts

Uniform or rectangular distribution

(NB probability uniform, not data - small set!)



Evaluating standard uncertainty

Uncertainties are expressed in terms of equivalent probability*

Uncertainty in a mean value - “standard uncertainty” u

… for a Type A uncertainty evaluation (n repeated readings with standard deviation s): u = s / √n

… for a Type B evaluation of a rectangularly distributed uncertainty (limits ± a): u = a / √3
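A small Python sketch of the two evaluations, assuming the usual formulae u = s/√n (Type A, from n repeated readings) and u = a/√3 (Type B, rectangular limits ± a); the numbers are invented:

```python
import math
import statistics

def type_a_standard_uncertainty(readings):
    """Type A: standard uncertainty of the mean of repeated readings, u = s / sqrt(n)."""
    return statistics.stdev(readings) / math.sqrt(len(readings))

def type_b_rectangular(half_width):
    """Type B: standard uncertainty for a rectangular distribution of half-width a, u = a / sqrt(3)."""
    return half_width / math.sqrt(3)

# Invented numbers: six repeat readings, plus a quoted limit of +/- 0.005
u_repeatability = type_a_standard_uncertainty([20.05, 20.02, 20.08, 20.03, 20.06, 20.04])
u_resolution = type_b_rectangular(0.005)
print(u_repeatability, u_resolution)
```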



Combining standard uncertainties

(Summation in quadrature)

combined uncertainty = √(a² + b² + c² + …), where a, b, c, … are the individual standard uncertainties

This rule applies where the result is found using addition / subtraction.

Other versions of this rule apply

… for multiplication or division

… for more complicated functions

All uncertainties must be in the same units and at the same level of confidence
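As a quick sketch, a few invented standard uncertainties combined in quadrature in Python:

```python
import math

# Invented standard uncertainties, all in the same units
components = [0.004, 0.003, 0.0029]

u_combined = math.sqrt(sum(u ** 2 for u in components))   # root sum of squares
print(u_combined)
```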


“Eight steps” to evaluating uncertainty of measurement (recap)



Correlation … ?!#$¿Ξ!



Correlation

[Chart: repeated readings of the reference (Ref) and the unit under test (UUT), and of their difference (correction = Ref - UUT)]



Correlation

[Chart: the same Ref and UUT readings with their standard deviations s_Ref and s_UUT, and the much smaller scatter s_corr of the difference (correction = Ref - UUT)]



Result:

Difference = Ref - UUT

Simple quadrature summation of independent standard deviations would suggest

s_diff² = s_Ref² + s_UUT²

But clearly s_diff is far smaller than the combination of the other two


Correlation

Covariance:

s(x, y) = [1 / (n - 1)] Σ (xi - x̄)(yi - ȳ)

where x̄ and ȳ are the arithmetic means of the observed values xi and yi, respectively, and n is the number of measurements.
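A Python sketch of the Ref - UUT example above, with invented simultaneous readings: for a difference, the variance is s_Ref² + s_UUT² minus twice the covariance s(Ref, UUT), so a positive covariance shrinks it relative to the simple quadrature sum:

```python
import statistics

# Invented simultaneous readings: both instruments follow the same drift
ref = [20.00, 20.10, 20.20, 20.15, 20.05, 19.95]
uut = [20.30, 20.41, 20.49, 20.46, 20.34, 20.26]

s_ref = statistics.stdev(ref)
s_uut = statistics.stdev(uut)
cov = statistics.covariance(ref, uut)            # s(x, y); requires Python 3.10+

var_ignoring_corr = s_ref ** 2 + s_uut ** 2            # quadrature sum: too big here
var_with_corr = s_ref ** 2 + s_uut ** 2 - 2 * cov      # variance of the difference Ref - UUT

print(var_ignoring_corr, var_with_corr)
```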



Correlation

  • Sometimes we know the cause-and-effect relationship (sometimes not)

  • If a positive correlation is ignored – uncertainty tends to be overestimated

  • If a negative correlation is ignored – uncertainty tends to be underestimated

  • But when can we safely ignore correlation?

  • Answer – when the covariance is much smaller than the component variances (s²xy << the other s² and u² terms)

  • examples


“Eight steps” to evaluating uncertainty of measurement (recap)


Spreadsheet model

The columns are:

Symbol or reference

Source of uncertainty – brief text description of each uncertainty

Value (±) – a number which is the estimate of the uncertainty. The estimate comes from whatever information you have, such as “worst case limits” or “standard deviation”. Units should be shown, e.g. °C, or %rh.

Probability distribution - either rectangular or normal (or, rarely, others)



Divisor - factor to normalise a value to a standard uncertainty - depends on the probability distribution.

ci – sensitivity coefficient – to convert to consistent units, e.g. a measurement in volts whose effect is expressed in terms of relative humidity. (But sometimes ci = 1.)

u – standard uncertainty – a “standardised measure of uncertainty” calculated from the previous columns: u = (value ÷ divisor) × ci.

νi – effective number of degrees of freedom – an indication of the reliability of the uncertainty estimate (the uncertainty in the uncertainty!) - sometimes ignored! Sometimes infinite! (∞)



The rows are:

Title row

One row for each uncertainty

One row for combined standard uncertainty, uc, by summing “in quadrature” and taking square root, i.e., uc = √(u1² + u2² + …)

Final row showing the expanded uncertainty, U = k × uc. Normally a coverage factor k = 2 is used (giving a level of confidence of 95 percent, as long as the number of degrees of freedom is high). The expanded uncertainty is what should finally be reported.
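The same budget logic can be sketched in code rather than a spreadsheet; the rows and numbers below are invented for illustration, and each row carries the columns described above (value, divisor, sensitivity coefficient ci):

```python
import math

# Each row: (source, value, divisor, ci).  The divisor converts the value to a standard
# uncertainty: 1 if it already is one, 2 for an expanded (k = 2) uncertainty,
# sqrt(3) for rectangular limits of half-width a.  All invented numbers.
budget = [
    ("Calibration of reference (certificate, k = 2)",  0.10, 2.0,          1.0),
    ("Resolution of indicator (rectangular, +/- a)",   0.05, math.sqrt(3), 1.0),
    ("Repeatability (already a standard uncertainty)", 0.04, 1.0,          1.0),
]

u_parts = [(value / divisor) * ci for _, value, divisor, ci in budget]  # u = (value / divisor) x ci
u_c = math.sqrt(sum(u ** 2 for u in u_parts))   # combined standard uncertainty (quadrature)
U = 2 * u_c                                     # expanded uncertainty with coverage factor k = 2

print(f"uc = {u_c:.3f}   U (k = 2) = {U:.3f}")
```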





What else you might need to know

Uncertainty in a mean value - “standard uncertainty” u

… for a Type A evaluation: u = s / √n

… for a Type B evaluation of a rectangular distribution (limits ± a): u = a / √3

Divisors respectively √n and √3



Other things you should know

Coverage factor k may be a divisor (e.g. to convert an expanded uncertainty from a calibration certificate back to a standard uncertainty)

1σ, 68% confidence, k=1

2σ, 95% confidence, k=2



Other things you should know … (cont.)

How to express the answer

Write down the measurement result and the uncertainty, and state how you got both of these.

Express the uncertainty in terms of a coverage factor together with a size of the uncertainty interval, and state a level of confidence.



You might record:

Example - How long is a piece of string?

‘The length of the string was 5.027 m ± 0.013 m. The reported expanded uncertainty is based on a standard uncertainty multiplied by a coverage factor k = 2, providing a level of confidence of approximately 95%.

‘The reported length is the mean of 10 repeated measurements of the string laid horizontally. The result is corrected for the estimated effect of the string not lying completely straight when measured. The uncertainty was estimated according to the method in …[a reference].’


“Eight steps” to evaluating uncertainty of measurement (recap)



Coverage factor and degrees of freedom

  • Work with standard uncertainties and then multiply up by the desired value of k

    k = 1 for a confidence level of approximately 68 %

    k = 2 for a confidence level of approximately 95 %

    k = 3 for a confidence level of approximately 99.7 %

  • These approximations hold true if your combined uncertainty has “many degrees of freedom” (is based on many measurements, has many sources)



Effective number of degrees of freedom

  • We said the sample standard deviation s is only an estimate of the population standard deviation σ

  • We make an allowance for the unreliability of statistics on small samples

    • (weighting with Student’s t factor)

    • using the Welch-Satterthwaite equation to combine

  • The effective number of degrees of freedom νeff of a standard deviation of n data points is n - 1

  • The effective number of degrees of freedom νeff of a worst-case estimate of limits of uncertainty is infinite (∞) … less intuitive


Effective number of degrees of freedom

  • Welch-Satterthwaite equation (combining components with standard uncertainties ui and degrees of freedom νi):

    νeff = uc⁴ / Σ (ui⁴ / νi)
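A Python sketch of the combination, assuming the usual form νeff = uc⁴ / Σ(ui⁴ / νi); the component values are invented, and the commented last line shows how a Student-t quantile (e.g. from scipy) could turn νeff into a coverage factor:

```python
import math

# (standard uncertainty, degrees of freedom) for each component -- invented values.
# Worst-case Type B limits are treated as having infinite degrees of freedom.
components = [(0.04, 9), (0.03, math.inf), (0.05, math.inf)]

u_c = math.sqrt(sum(u ** 2 for u, _ in components))
nu_eff = u_c ** 4 / sum(u ** 4 / nu for u, nu in components)   # Welch-Satterthwaite

print(f"uc = {u_c:.4f}   nu_eff = {nu_eff:.1f}")

# Optionally, a coverage factor for ~95 % confidence from the t-distribution:
# from scipy.stats import t; k = t.ppf(0.975, nu_eff)
```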



Uncertainty in the uncertainty!

  • Interestingly, eff can tell us the uncertainty in the uncertainty from

    u(u)  2(eff) -1/2 [ISO GUM]

  • For eff= 500, u(u)  9% of uncertainty estimate

  • For eff= 100, u(u)  20% of uncertainty estimate

  • “Traditionally” we aim for eff> 30 (why)

  • For eff= 30, u(u)  37% of uncertainty estimate



What is an “uncertainty budget” for?

  • To find out what the uncertainty in your measurement or process is

  • To demonstrate that you have considered it

  • To identify the most important (largest) uncertainties so you can work to reduce them

  • To “operate within your budget” (it is actually an “error budget” …. cf “uncertainty analysis”)



What is not a measurement uncertainty

Mistakes are not uncertainties

Tolerances are not uncertainties

Accuracy (or rather inaccuracy) is not the same as uncertainty

Statistical analysis is not the same as uncertainty analysis



A Buddhist quote …

“To follow the Noble Eightfold Path is a matter of practice rather than intellectual knowledge, but to apply the path correctly it has to be properly understood”

