

Lecture 15

Dimitar Stefanov

slide2

Multifunction control schemes for powered upper-limb prostheses

Some examples

Pattern-coded systems, which use M vectors of unique values and a pattern classifier to produce M prosthetic functions.
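As an illustrative sketch (not taken from any of the systems below), a pattern-coded controller can be reduced to a nearest-prototype classifier: M prototype vectors, one per prosthetic function. All prototype values and function names here are made up.

```python
import numpy as np

# Hypothetical example: M = 3 prosthetic functions, each associated with a
# unique prototype feature vector (the "pattern code").
prototypes = {
    "hand_open":    np.array([0.9, 0.1, 0.1]),
    "hand_close":   np.array([0.1, 0.9, 0.1]),
    "wrist_rotate": np.array([0.1, 0.1, 0.9]),
}

def classify(feature_vec):
    """Return the prosthetic function whose prototype is nearest (Euclidean)."""
    return min(prototypes, key=lambda k: np.linalg.norm(prototypes[k] - feature_vec))

print(classify(np.array([0.85, 0.15, 0.05])))  # nearest to "hand_open"
```

Any classifier with M discrete outputs fits this scheme; the systems below differ mainly in which features they extract and which classifier they use.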

slide3

Prosthetic hand (NTU-Hand)

National Taiwan University (NTU), Dept. of ME, Robotic Lab

http://robot0.me.ntu.edu.tw/english/E_Researches/medical/medical_res.htm

http://robot0.me.ntu.edu.tw/english/E_Researches/medical/prosthesis.htm

  • Developed by Dr. Li-ren Lin and Ji-Da Wu
  • A modular robotic hand
  • The degrees of freedom of the hand range from five to eleven
  • The NTU-Hand II weighs 1300 g
slide4

EMG signal processing

http://robot0.me.ntu.edu.tw/english/E_Researches/medical/EMG.htm

The EMG controller uses two pairs of surface electrodes to acquire the EMG signal from the flexor digitorum superficialis muscle and the extensor pollicis brevis muscle.

Eight types of hand movements are recognized (three-jaw chuck, lateral hand, hook grasp, power grasp, cylindrical grasp, centralized grip, flattened hand, and finger flexion).

slide5

Parameters used: variance, zero crossings, autoregressive model coefficients, and spectral estimation.

The control method combines pattern recognition with pulse-coding analysis.

An error backpropagation neural network and a k-nearest-neighbor rule are applied to discriminate among the feature sets.
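A minimal sketch of two of the features named above (variance and zero crossings) together with a 1-nearest-neighbour rule. The toy signal and training vectors are assumptions; the AR coefficients, spectral estimates, and backpropagation network are omitted for brevity.

```python
import numpy as np

def emg_features(x):
    """Variance and zero-crossing count of one EMG window (an illustrative
    subset of the features listed above)."""
    variance = np.var(x)
    zero_crossings = np.sum(np.signbit(x[:-1]) != np.signbit(x[1:]))
    return np.array([variance, zero_crossings])

def knn_1(train_feats, train_labels, feat):
    """1-nearest-neighbour rule over labelled feature vectors."""
    d = [np.linalg.norm(f - feat) for f in train_feats]
    return train_labels[int(np.argmin(d))]

x = np.array([0.1, -0.2, 0.3, -0.1, 0.2, -0.3])  # made-up EMG window
f = emg_features(x)
```

In the NTU system the k-NN rule and the neural network are applied to the full feature set; this sketch only shows the shape of the pipeline.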

Experiments: PC + digital signal processor (DSP)

3-D graphic interface program.

slide6

  • 17 degrees of freedom in the five fingers
  • Force sensors
  • 8051 microcontroller
  • The graphic shows the hand grasping an egg.

http://robotweb.me.ntu.edu.tw/English/E_Researches/robotman/Image/5Finger.htm

slide7

Rutgers University (NJ)

http://uc.rutgers.edu/news/science/arthand.html

http://opl.rutgers.edu/opl.html


Rutgers University, Department of Biomedical Engineering, Piscataway, N.J.

William Craelius, an associate professor of biomedical engineering, and PhD student Rochel Lieber Abboudi.

Multi-finger control and proportional control of force and velocity.

slide8

Three sensors inside the sleeve pick up natural motions of tendons and transmit them to a desktop computer, which then controls the fingers of the hand.


  • Tendon Activated Pneumatic (TAP) control.
  • Based on sensing the command signals in the forearm by pressure sensors that are located in the limb socket.
  • The system detects specific finger movement requests and sends them directly to small actuators that move the fingers.
  • Biomimetic approach – the natural motor control system is used to activate the fingers (minimal learning time).
  • The TAP system is not appropriate for everyone.
  • Tested on 12 people, 9 of whom were able to use it successfully.
  • Provides several operable fingers with controlled grasping force.
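The TAP idea above can be sketched as a simple threshold rule from socket pressure sensors to finger actuators. The finger list, pressure values, and threshold here are all made-up values, not from the Rutgers system.

```python
# Each socket pressure sensor maps to one finger; a tendon-motion pressure
# above threshold requests that finger to flex. (Illustrative values only.)
FINGERS = ["thumb", "index", "middle"]

def finger_commands(pressures, threshold=0.4):
    """Return the set of fingers whose tendon-pressure reading exceeds threshold."""
    return {f for f, p in zip(FINGERS, pressures) if p > threshold}

print(finger_commands([0.7, 0.2, 0.5]))  # thumb and middle requested
```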
slide10

"A Microprocessor-Based Multifunction Myoelectric Control System" by Kevin Englehart, Bernard Hudgins, Philip Parker and Robert N. Scott, Institute of Biomedical Engineering, University of New Brunswick.

23rd Canadian Medical and Biological Engineering Society Conference, Toronto, Canada, May 1997

slide11

"A Microprocessor-Based Multifunction Myoelectric Control System"

  • The control scheme uses the myoelectric signals produced in the first two hundred milliseconds following a contraction in the muscles. 
  • This information is used to train a pattern classifier to recognize the specific pattern unique to the amputee, and to determine the intent of the amputee. 
  • The pattern classifier matches the pattern to select the device that is controlled, such as the hand, elbow, or wrist. 
  • Once a pattern is recognized, the actuator is activated until the input myoelectric signal goes below a certain threshold.
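The activate-until-below-threshold rule in the last bullet can be sketched as follows; the amplitude values and threshold are illustrative, not from the paper.

```python
# Once a pattern is recognized the actuator stays on until the input MES
# amplitude drops below a threshold. (Made-up amplitudes and threshold.)
THRESHOLD = 0.2

def drive_actuator(mes_amplitudes, recognized=True):
    """Return the actuator on/off state for each incoming MES amplitude."""
    states, active = [], recognized
    for a in mes_amplitudes:
        if active and a < THRESHOLD:
            active = False      # deactivate once the signal falls below threshold
        states.append(active)
    return states

print(drive_actuator([0.8, 0.5, 0.3, 0.1, 0.4]))
```

Note the actuator stays off after deactivation even if the amplitude rises again; a new pattern must be recognized to reactivate it.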
slide12

The controller operates in two modes:

  • a training mode
  • a normal mode.

The training mode or PC interface mode involves the use of a host PC for offline processing.

  • Steps of the training mode:
  • Specify myoelectric control parameters.
  • Collect the MES for the user.
  • Extract certain features from the MES, and store them for training the artificial neural network.
  • Train the pattern classifier to recognize the input MES. This requires a host computer for off-line processing.
  • Train the artificial neural network to obtain the weights. These weights are used to determine a match.
  • Store these weights in the device in non-volatile RAM.
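The training-mode steps above can be sketched with a toy one-layer classifier standing in for the paper's neural network. All feature data are made up, and a perceptron-style update replaces backpropagation to keep the example short.

```python
import numpy as np

# Toy "training mode": learn weights on made-up MES feature vectors, then
# serialize them (the real system stores the weights in non-volatile RAM).
X = np.array([[0.1, 0.2], [0.9, 0.8], [0.2, 0.1], [0.8, 0.9]])  # MES features
y = np.array([0, 1, 0, 1])                                      # motion class

w, b = np.zeros(2), 0.0
for _ in range(100):                       # perceptron-style training loop
    for xi, yi in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0
        w += (yi - pred) * xi
        b += (yi - pred)

weights_blob = w.tobytes() + np.float64(b).tobytes()  # "store" the weights
```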

The PC interface mode allows the control of a prosthesis by controlling a three-dimensional "virtual arm" on the host PC.

slide13

The normal mode:

  • MES is collected both from the biceps and from the triceps.
  • Extraction of features from the signals
  • An artificial neural network, whose weights were previously determined in the offline mode, identifies the closest pattern match to drive the correct device.
  • The system has to respond within 300 milliseconds; a longer response would frustrate the user and lead them to try other motions.
  • This response time is limiting because the system requires approximately 250 milliseconds to capture enough MES data for accurate pattern recognition, leaving only 50 milliseconds for processing and activation of the prosthetic device.
  • Even with these limitations, the current system has an over 90% accuracy in determining four types of prosthetic device motions.

The current prototype has an approximate size of 1.5" x 2.5" x 0.5" and operates on a 6V NiCad battery.

slide14

Multifunctional prosthetic control using the myoelectric power spectral density spectrum

Dr. Philip Parker, Jillian Mallory

The University of New Brunswick, Canada.

The system inputs the myoelectric signal, x(n), at a sampling frequency F, then extracts a feature vector of length N.

slide15

Multifunctional prosthetic control using the myoelectric power spectral density spectrum


The myoelectric signal power spectral density spectrum changes with variations in muscle contraction patterns.

The design and the implementation of a control input to a myoelectric control system can be based on the classification of these power spectrum patterns.

A feature vector corresponding to the changes in spectrum is extracted by the segmentation of the spectrum.

This vector is classified using a pattern classifier and its output is used for the prosthesis control.
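A sketch of the spectrum-segmentation idea, assuming a simple periodogram and equal-width frequency bands; the band count and test signal are made-up, and the paper's actual segmentation and classifier are not reproduced here.

```python
import numpy as np

def psd_feature_vector(x, n_bands=4):
    """Segment the signal's power spectral density into n_bands equal-width
    bands and return the band powers as the feature vector."""
    spectrum = np.abs(np.fft.rfft(x)) ** 2 / len(x)   # periodogram estimate
    bands = np.array_split(spectrum, n_bands)          # segment the spectrum
    return np.array([b.sum() for b in bands])

fs = 1000
t = np.arange(0, 0.256, 1 / fs)          # 256-sample window
x = np.sin(2 * np.pi * 50 * t)           # energy concentrated near 50 Hz
fv = psd_feature_vector(x)               # most power lands in the lowest band
```

As the contraction pattern changes, power shifts between bands, and the resulting vector is what the pattern classifier discriminates.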

slide16

Prosthetic Control by an EEG-based Brain-Computer Interface

Prosthetic Control by an EEG-based Brain-Computer Interface (BCI), Christoph Guger, Werner Harkam, Carin Hertnaes, Gert Pfurtscheller, University of Technology Graz, AAATE99

http://www.fernuni-hagen.de/FTB/aaate99/paper/99_90/99_90.htm

  • It was shown recently that hand-movement imagery results in EEG changes close to the primary motor areas. An array of electrodes overlies the motor and somatosensory areas (electrode positions C3 and C4).
  • Oscillatory EEG components are used for BCI
  • On-line analysis of EEG signals is required.
slide17

Test

  • Fixation cross was shown in the center of a monitor.
  • After two seconds, a warning "beep" stimulus was presented.
  • From second 3 until 4.25 an arrow (cue stimulus), pointing to the left or right, was shown on the screen. The subject was instructed to imagine a left or right hand movement, depending on the direction of the arrow.
  • Between second 4.25 and 8 the EEG was classified on-line and the classification result was used to control the prosthesis.

If the person imagined a left movement, the prosthesis was closed a little more, and vice versa (assuming correct classification). One session consisted of 160 trials. Three sessions were conducted with subject i6.
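The on-line control rule can be sketched as a bounded opening angle nudged by each classification result; the step size and opening range are assumptions, not values from the study.

```python
# Each on-line classification ("left" or "right" imagery) nudges the
# prosthesis opening by a small step, clamped to [0, 1]. (Illustrative values.)
def update_opening(opening, classified_side, step=0.05):
    """Left imagery closes the prosthesis a little; right imagery opens it."""
    if classified_side == "left":
        return max(0.0, opening - step)
    return min(1.0, opening + step)

opening = 0.5
for side in ["left", "left", "right"]:   # three classified trials
    opening = update_opening(opening, side)
```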

slide18

The BCI system allows a hand prosthesis to be controlled by imagining left- and right-hand movements.

A practical EMG-based human-computer interface for users with motor disabilities

Armando B. Barreto, PhD; Scott D. Scargle, MSEE; Malek Adjouadi, PhD

Journal of Rehabilitation Research and Development, Vol. 37 No. 1, January/February 2000

  • Computer interaction (UP, DOWN, LEFT, RIGHT, left-click) is derived from monitoring the activity of several pericranial muscles.
  • Four electrodes are used
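A hypothetical mapping from the four electrode channels to the cursor actions named above (left-click omitted for brevity); the channel ordering and activation threshold are assumptions, not from the paper.

```python
import numpy as np

# Each EMG channel is assigned one cursor action; the most active channel
# wins, provided its amplitude clears a threshold. (Assumed layout.)
ACTIONS = ["UP", "DOWN", "LEFT", "RIGHT"]

def cursor_action(channel_amplitudes, threshold=0.3):
    """Return the action of the most active channel, or None below threshold."""
    ch = int(np.argmax(channel_amplitudes))
    return ACTIONS[ch] if channel_amplitudes[ch] >= threshold else None

print(cursor_action([0.1, 0.6, 0.2, 0.1]))  # "DOWN"
```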
slide19

Intelligent prosthetic controller

Daisuke Nishikawa, Wenwei Yu, Hiroshi Yokoi, Yukinori Kakazu

Lab. of Autonomous Systems Eng., Research Group of Complex Systems Eng.,

Graduate School of Eng., Hokkaido University.

The analysis unit is based on the wavelet transform, using a Gabor mother wavelet function.
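A sketch of a single-scale Gabor wavelet analysis, with illustrative parameter values; the group's actual implementation details are not given here.

```python
import numpy as np

def gabor_wavelet(t, f0=1.0, sigma=1.0):
    """Gabor mother wavelet: a complex sinusoid under a Gaussian envelope."""
    return np.exp(-t**2 / (2 * sigma**2)) * np.exp(1j * 2 * np.pi * f0 * t)

def gabor_coefficient(x, t, f0, sigma=1.0):
    """Wavelet coefficient of signal x at frequency f0 (single-scale sketch)."""
    return np.sum(x * np.conj(gabor_wavelet(t, f0, sigma)))

t = np.linspace(-2, 2, 401)
x = np.cos(2 * np.pi * 2.0 * t)                 # 2 Hz test signal
c_match = abs(gabor_coefficient(x, t, 2.0))     # wavelet tuned to 2 Hz
c_off   = abs(gabor_coefficient(x, t, 5.0))     # mismatched frequency
```

A coefficient is large when the wavelet's frequency matches the signal's; sweeping f0 (and sigma) over a grid yields the time-frequency features the analysis unit classifies.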

slide20

  • Pronation
  • Supination
  • Flexion
  • Extension
  • Grasp
  • Open
slide21

Development of Prosthetic Hand Using Adaptable Control Method for Human Characteristics, Sadao Fujii, Daisuke Nishikawa, Hiroshi Yokoi

Adaptation with Visual Feedback

slide25

Adaptation

Two Functions