## ELECTRONIC INSTRUMENTATION EKT 314/4, Chapter 1: Introduction to EI (Zahari Awang Ahmad)


## Introduction to Electronic Instrumentation

This chapter introduces electronic instrumentation, covering the following topics:

- Definition
- Measurement
- Analysis: direct analysis and statistical analysis
- Instrumentation elements
- Application fields
- Review

## Definition of Electronic Instrumentation

Instrumentation is the branch of engineering that deals with measurement and control; it has been defined as "the art and science of measurement and control".[1]

It serves not only the sciences but all branches of engineering, medicine, and almost every other human endeavor.

Electronic instrumentation is the application of measurement technology in electronics-related fields.

[1] http://en.wikipedia.org/wiki/Instrumentation

## Related Definitions

- **Instrument**: a device or mechanism used to determine the present value of the quantity under measurement.
- **Measurement**: the process of determining the amount, degree, or capacity by comparison (direct or indirect) with accepted standards of the system of units being used.
- **Accuracy**: the degree of exactness (closeness) of a measurement compared to the expected (desired) value.

- **Resolution**: the smallest change in a measured variable to which an instrument will respond.
- **Precision**: a measure of the consistency or repeatability of measurements, i.e. how little successive readings differ; the consistency of the instrument output for a given value of input.
- **Expected value**: the design value, i.e. the most probable value that calculations indicate one should expect to measure.

- **Error**: the deviation of the measured value from the true (expected) value.
- **Sensitivity**: the ratio of the change in the instrument's output (response) to the change in its input (the measured variable).
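Two of these definitions translate directly into formulas: precision of a single reading within a set, and sensitivity as an output/input ratio. A minimal Python sketch; the helper names and sample readings are my own, for illustration:

```python
def precision_of_reading(readings, n):
    """Precision of the n-th reading: 1 - |Xn - Xavg| / Xavg,
    where Xavg is the mean of the whole set of readings."""
    x_avg = sum(readings) / len(readings)
    return 1 - abs(readings[n] - x_avg) / x_avg

def sensitivity(delta_output, delta_input):
    """Sensitivity: change in instrument output per unit change in input."""
    return delta_output / delta_input

# Five repeated readings of a nominally 100-unit quantity:
readings = [99, 101, 100, 98, 102]        # mean = 100
p = precision_of_reading(readings, 4)     # 1 - |102 - 100|/100 = 0.98

# A sensor whose output rises 0.4 mV for a 10 degC rise in input:
s = sensitivity(0.4, 10.0)                # mV per degC
```

Note that precision is a property of the set of readings, not of a single number: a precise instrument can still be inaccurate if every reading is offset from the expected value.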

## Measurement

Measurement is the process of comparing an unknown quantity with an accepted standard quantity: determining the amount, degree, or capacity by comparison (direct or indirect) with accepted standards of the system of units being used.

## Measurand

- **Displacement**: a vector representing the change in position of a body or point with respect to a reference.
- **Strain**: the relative deformation of elastic, plastic, and fluid materials under applied forces.
- **Vibration**: oscillatory motion, described in terms of amplitude (size), frequency (rate of oscillation), and phase (timing of the oscillation relative to a fixed time).

- **Pressure**: the ratio of a force acting on a surface to the area of that surface.
- **Flow**: a stream of molten or liquefied material, measured in terms of speed and quantity.
- **Temperature**: a measure of the relative warmth or coolness of an object, referenced to an absolute scale.
- **Force**: a quantity that changes the motion, size, or shape of a body.
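Two of these measurands lend themselves to quick worked examples: pressure as force per unit area, and temperature expressed on an absolute scale. A small Python sketch; the numbers are invented for illustration:

```python
def pressure(force_n, area_m2):
    """Pressure in pascals: force (N) divided by the area (m^2) it acts on."""
    return force_n / area_m2

def celsius_to_kelvin(t_c):
    """Convert a Celsius temperature to the absolute (Kelvin) scale."""
    return t_c + 273.15

# A 100 N force spread over 0.5 m^2 gives a pressure of 200 Pa.
p = pressure(100.0, 0.5)

# Room temperature, 25 degC, is 298.15 K on the absolute scale.
t = celsius_to_kelvin(25.0)
```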

## Units

The International System of Units (abbreviated SI, from the French le Système international d'unités) is the world's most widely used system of measurement, in both everyday commerce and science. The SI was developed in 1960 from the older metre-kilogram-second (MKS) system.

## Base Units

| Quantity | SI base unit |
| --- | --- |
| Length | metre (m) |
| Mass | kilogram (kg) |
| Time | second (s) |
| Electric current | ampere (A) |
| Temperature | kelvin (K) |
| Luminous intensity | candela (cd) |
| Amount of substance | mole (mol) |

## Derived Units

- Electric charge: coulomb (C)
- Electric potential difference: volt (V)
- Electric resistance: ohm (Ω)
- Electric capacitance: farad (F)
- Electric inductance: henry (H)
- Energy: joule (J)
- Force: newton (N)
- Magnetic flux: weber (Wb)
- Power: watt (W)
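Each derived unit can be expressed as a product of powers of the base units; for example, 1 N = 1 kg·m/s² and 1 V = 1 kg·m²/(A·s³). A small Python sketch that composes units as dictionaries of base-unit exponents (this representation is my own, not from the slides):

```python
# Represent a unit as a dict mapping base-unit symbols to exponents.
NEWTON = {"kg": 1, "m": 1, "s": -2}           # force
JOULE  = {"kg": 1, "m": 2, "s": -2}           # energy  = N * m
WATT   = {"kg": 1, "m": 2, "s": -3}           # power   = J / s
VOLT   = {"kg": 1, "m": 2, "s": -3, "A": -1}  # potential = W / A

def multiply(u, v):
    """Multiply two units by adding their base-unit exponents."""
    out = dict(u)
    for base, exp in v.items():
        out[base] = out.get(base, 0) + exp
    return {b: e for b, e in out.items() if e != 0}

def divide(u, v):
    """Divide two units by subtracting the divisor's exponents."""
    return multiply(u, {b: -e for b, e in v.items()})

# Check the familiar chains: energy = force * length, power = energy / time,
# potential difference = power / current.
assert multiply(NEWTON, {"m": 1}) == JOULE
assert divide(JOULE, {"s": 1}) == WATT
assert divide(WATT, {"A": 1}) == VOLT
```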

## Direct Analysis: Terminology

- Error is the degree to which a measurement deviates from the expected value. It can be expressed as:
  - absolute error, or
  - percentage of error.
- Accuracy can be calculated from the error.
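These quantities follow directly from the definitions above: absolute error is the difference between expected and measured values, percentage of error scales it by the expected value, and accuracy is its complement. A minimal Python sketch; the function names and sample readings are illustrative, not from the slides:

```python
def absolute_error(expected, measured):
    """Absolute error e = Yn - Xn: expected value minus measured value."""
    return expected - measured

def percent_error(expected, measured):
    """Percentage of error: |e| relative to the expected value, in percent."""
    return abs(absolute_error(expected, measured)) / expected * 100

def accuracy(expected, measured):
    """Relative accuracy A = 1 - |e| / Yn."""
    return 1 - abs(absolute_error(expected, measured)) / expected

# Example: a voltmeter reads 79.0 V when 80.0 V is expected.
e = absolute_error(80.0, 79.0)    # 1.0 V
pct = percent_error(80.0, 79.0)   # 1.25 %
a = accuracy(80.0, 79.0)          # 0.9875, i.e. 98.75 %
```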
