
ELECTRONIC INSTRUMENTATION
EKT 314/4
Chapter 1: Introduction to EI
Zahari Awang Ahmad

Introduction to Electronic Instrumentation




In this chapter, we cover an introduction to Electronic Instrumentation, including the following:

• Instrumentation Element
• Application Field
• Review
• Definition
• Measurement
• Analysis
• Direct Analysis
• Statistical Analysis

Instrumentation is the branch of engineering that deals with measurement and control.

Instrumentation is defined as the art and science of measurement and control.[1]

It serves not only the sciences but also all branches of engineering, medicine, and almost every other human endeavor.

Electronic Instrumentation is the application of measurement technology in electronics-related fields.

[1] http://en.wikipedia.org/wiki/Instrumentation

• Instrument: A device or mechanism used to determine the present value of the quantity under measurement.

• Measurement: The process of determining the amount, degree, or capacity by comparison (direct or indirect) with the accepted standards of the system of units being used.

• Accuracy: The degree of exactness (closeness) of a measurement compared to the expected (desired) value.

• Resolution: The smallest change in a measured variable to which an instrument will respond.

• Precision: A measure of the consistency or repeatability of measurements, i.e. successive readings of the same input do not differ. (Precision is the consistency of the instrument output for a given value of input.)

• Expected value: The design value, i.e. the most probable value that calculations indicate one should expect to measure.

• Error: The deviation of the true value from the desired value.

• Sensitivity: The ratio of the change in output (response) of the instrument to a change of input or measured variable.
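Precision can be quantified for a set of repeated readings. A minimal sketch, assuming the common textbook formula P_n = 1 - |X_n - X̄| / X̄, where X̄ is the mean of all readings (the values below are illustrative, not from the slides):

```python
# Precision of one reading relative to the mean of a set of readings.
# Assumes the common textbook formula: P_n = 1 - |Xn - Xbar| / Xbar.
readings = [98, 101, 102, 97, 101, 100, 103, 98, 106, 99]  # illustrative values

xbar = sum(readings) / len(readings)   # mean of all readings (100.5 here)

def precision(xn, xbar):
    """Precision of reading xn with respect to the mean xbar."""
    return 1 - abs(xn - xbar) / xbar

# The reading farthest from the mean has the lowest precision.
worst = min(precision(x, xbar) for x in readings)
```

For the reading of 106, for example, the precision is 1 - 5.5/100.5 ≈ 0.945, the lowest in the set.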

Measurement

• The process of comparing an unknown quantity with an accepted standard quantity.
• The process of determining the amount, degree, or capacity by comparison (direct or indirect) with the accepted standards of the system of units being used.

• Displacement: Vector representing a change in position of a body or a point with respect to a reference.

• Strain: Relative deformation of elastic, plastic, and fluid materials under applied forces.

• Vibration: Oscillatory motion that can be described in terms of amplitude (size), frequency (rate of oscillation), and phase (timing of the oscillation relative to a fixed time).

• Pressure: Ratio of the force acting on a surface to the area of the surface.

• Flow: Stream of molten or liquefied material that can be measured in terms of speed and quantity.

• Temperature: Measure of the relative warmth or coolness of an object compared to an absolute reference.

• Force: A quantity that changes the motion, size, or shape of a body.

• Torque: The tendency of a force to rotate the body to which it is applied.
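The pressure and torque definitions above reduce to a simple ratio and product; a quick numeric sketch (all values are illustrative, not from the slides):

```python
# Pressure: force acting on a surface divided by the surface area (P = F / A).
force = 200.0      # N, applied perpendicular to the surface (assumed value)
area = 0.25        # m^2 (assumed value)
pressure = force / area        # 800.0 Pa

# Torque: force times the perpendicular distance to the rotation axis (tau = F * r).
lever_arm = 0.25   # m, perpendicular distance from the axis (assumed value)
torque = force * lever_arm     # 50.0 N*m
```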

• International System of Units (abbreviated SI, from the French le Système international d'unités).
• It is the world's most widely used system of measurement, both in everyday commerce and in science.
• The SI was developed in 1960 from the old metre-kilogram-second system.

• Length – meter (m)
• Mass – kilogram (kg)
• Time – second (s)
• Electric current – ampere (A)
• Temperature – kelvin (K)
• Luminous intensity – candela (cd)
• Amount of substance – mole (mol)

• Electric charge – coulomb (C)
• Electric potential difference – volt (V)
• Electric resistance – ohm (Ω)
• Electric capacitance – farad (F)
• Electric inductance – henry (H)
• Energy – joule (J)
• Force – newton (N)
• Magnetic flux – weber (Wb)
• Power – watt (W)
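Each derived unit above can be written as a product of powers of the base units; the volt, for example, is kg·m²·s⁻³·A⁻¹. A small sketch of my own (not from the slides) that represents units as exponent dictionaries and checks such relationships:

```python
# Represent each unit as a {base_unit: exponent} dict and divide units
# by subtracting exponents, to check derived-unit relationships.
from collections import Counter

def div(a, b):
    """Divide two units, each given as a {base_unit: exponent} dict."""
    out = Counter(a)
    out.subtract(b)
    return {u: e for u, e in out.items() if e != 0}

volt = {"kg": 1, "m": 2, "s": -3, "A": -1}   # V in SI base units
watt = {"kg": 1, "m": 2, "s": -3}            # W in SI base units
ampere = {"A": 1}
ohm = {"kg": 1, "m": 2, "s": -3, "A": -2}    # Ohm in SI base units

assert div(volt, ampere) == ohm              # Ohm's law in unit form: V / A = ohm
assert div(watt, ampere) == volt             # P = V * I, so W / A = V
```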

• Error is the amount by which a measurement deviates from the expected value. It can be expressed as:
• Absolute error
• Percentage of error
• Accuracy can be calculated from the error.
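These error measures can be computed directly. A minimal sketch, writing Yn for the expected value and Xn for the measured value, and assuming the common definitions e = |Yn - Xn| and A = 1 - |Yn - Xn| / Yn (the numbers are illustrative, not from the slides):

```python
# Absolute error, percentage error, and accuracy of a single measurement.
expected = 50.0    # Yn: expected (design) value, e.g. in volts (assumed)
measured = 49.0    # Xn: measured value (assumed)

absolute_error = abs(expected - measured)            # e = |Yn - Xn| = 1.0
percent_error = absolute_error / expected * 100      # ~2.0 %
relative_accuracy = 1 - absolute_error / expected    # A = 1 - |Yn - Xn| / Yn
percent_accuracy = 100 * relative_accuracy           # ~98.0 %
```

Note that percentage accuracy is simply 100 % minus the percentage error.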