
Touch and Multi-touch Interaction Technology

Wang Feng, Kunming University of Science and Technology. E-mail: wangfeng@acm.org Phone: 13700600260.

Outline: natural human-computer interaction (NUI); the history of touch technology; the focus and goals of touch research; our work.

HCI – UI: User interface (UI) technology is one of the hot research topics in human-computer interaction. The appearance of each new interaction device triggers a major change in user interface technology. In a human-machine system, the user interface handles the computer's input and output, produces the necessary feedback, and directly affects how end users experience the system.



What is HCI?

  • User: "Am I in error again?" Computer for people? Or people for computers?

Where are we going?

  • The evolution of user interfaces:

    • Command-line interface (CLI)

    • Graphical user interface (GUI)

    • Natural user interface (NUI)

  • [1] J. Canny, "The Future of Human-Computer Interaction," Queue, vol. 4, pp. 24-32, 2006.

    [2] B. Myers, "A brief history of human-computer interaction technology," Interactions, vol. 5, pp. 44-54, 1998.

  • Shneiderman

Why NUI?

  • GUI (Mac OS, Windows)


  • Alan Cooper, The Essentials of Interaction Design

    • Mark Weiser,

  • NUI:

  • N. Cross, "Natural intelligence in design," Design Studies, vol. 20, pp. 25-39, 1999.

  • C. Nass and B. Reeves, "Social and natural interfaces: theory and design," in CHI '97 extended abstracts on Human factors in computing systems: looking to the future, Atlanta, Georgia, 1997, pp. 192-193.

Input Device - Vision/Goals (1945-2015)


  • Combined speech recognition, character recognition

  • Pen editing

  • Heuristic programming

  • Ubiquitous computing

  • Time sharing

  • Electronic I/O

  • Interactive, real- time system

  • Large scale information storage and retrieval

  • Natural language understanding

  • Speech recognition of arbitrary users

  • Natural user interface

  • Internet of things (C2C, H2C, H2H, H2C2H)

Input Devices (overview)

Sensor Devices

1. Spatial Position/Orientation Sensors

2DOF (Mouse)

3DOF (Microscribe, FreeD Joystick)

6DOF (Polhemus Fastrack)

2. Directional Force Sensors

5 DOF (Spacemouse)

2 DOF (Joystick)

3. Gesture Recognition

Data Gloves

4. Eye Tracking

5. Speech Recognition Systems

The First Mouse (1964)

Knee control

Douglas Engelbart

Years before personal computers and desktop information processing became commonplace or even practicable, Douglas Engelbart had invented a number of interactive, user-friendly information access systems that we take for granted today: the computer mouse, windows, shared-screen teleconferencing, hypermedia, groupware, and more.

Input Devices (1)

Directional Force Sensors


Input Devices (2)

Gesture Recognition

Dextrous Hand Master, Exos


Cyberglove , 5th Dimension

Input Devices (3)

Spatial Position/Orientation Sensors

Polhemus InsideTrack

(Magnetic Tracking)


(Mechanical Tracking)

FreeD Joystick

(UltraSonic Tracking)

Input Devices (4)

Visual Haptic Workbench

The Visual Haptic Workbench consists of five hardware components.

The dominant hand of the user experiences haptic feedback from the PHANToM, and the subdominant hand navigates through a menu interface via Pinch glove contact gestures. Head tracking is done with a Polhemus Fastrak receiver mounted on a pair of Stereographics CrystalEyes LCD shutter glasses. The subdominant hand can also be tracked with a separate receiver to facilitate more complex interaction paradigms. The audio subsystem gives the user additional reinforcement cues.

[see also http://haptic.mech.nwu.edu/intro/gallery/]

Historical Overview (1945-1995)

[source: Brad A. Myers (1998). A brief history of human-computer interaction technology. Interactions, vol 5(2), pp. 44-54]

  • NUI

Touch / Multi-touch

    • Natural affordances

    • Offer a more compelling method to interact with a system than a mouse or other types of pointing devices

  • Nakatani(Soft Machine, 1983)

    • L. H. Nakatani and J. A. Rohrlich, "Soft machines: A philosophy of user-computer interface design," in Proceedings of the SIGCHI conference on Human Factors in Computing Systems, Boston, Massachusetts, United States, 1983, pp. 19-23.

History of Multi-touch

  • 1982 - The first multi-touch system, the Flexible Machine Interface, was developed at the University of Toronto.

  • 1983 - Bell Labs, Murray Hill, published the first paper discussing touch-screen based interfaces.

Video Place

  • 1983 - Video Place / Video Desk (Myron Krueger): a vision-based system that tracked the hands and enabled multiple fingers, hands, and people to interact using a rich set of gestures

  • Myron's work had a staggeringly rich repertoire of gestures: multi-finger, multi-hand and multi-person interaction.

History of Multi-touch

  • 1984 Multi-touch Screen (Bell Labs, Murray Hill NJ) - integrated with a CRT on an interactive graphics terminal; could manipulate graphical objects with fingers with excellent response time.

  • Microsoft began research in this area.

Multitouch Tablet 1985

  • Input Research Group, University of Toronto

  • Touch tablet capable of sensing an arbitrary number of simultaneous touch inputs, reporting both location and degree of touch for each

Sensor Frame (Carnegie Mellon University)

  • The device used optical sensors in the corners of the frame to detect fingers.

  • At the time this was done, miniature cameras were essentially unavailable. Hence, the device used DRAM ICs with glass (as opposed to opaque) covers for imaging.

  • It could sense up to three fingers at a time fairly reliably (but due to the optical technique used, there was potential for misreadings due to shadows).

1986 Bimanual Input

  • 1986 Bimanual Input (University of Toronto) - supported simultaneous position/scale and selection/navigation tasks

Apple Desktop Bus

  • 1986 Apple Desktop Bus (ADB) and the trackball scroller init (Apple Computer / University of Toronto)

  • E.g. the Macintosh II and Macintosh SE

Digital Desk 1991

  • (Pierre Wellner, Rank Xerox EuroPARC, Cambridge) - supported multi-finger and pinching motions (leading to modern products, e.g. the iPhone)

Flip Keyboard - 1992

  • Bill Buxton, Xerox PARC

  • A multi-touch pad integrated into the bottom of a keyboard. You flip the keyboard to gain access to the multi-touch pad for rich gestural control of applications.

  • Graphics on multi-touch surface defining controls for various virtual devices.

History of Multi-touch

  • 1992: Simon (IBM & BellSouth) - a single-touch device that relied on a touch-screen driven soft machine

  • 1992: Wacom (Japan) - tablet that had multi-device / multi-point sensing capability

Starfire - inconceivable [5]

  • 1992: Starfire (Bruce Tognazzini, SUN Microsystems) - Bruce Tognazzini produced a future-envisionment film, Starfire, that included a number of multi-hand, multi-finger interactions, including pinching, etc.

Starfire Video

Bimanual 1994-2002

  • Alias|Wavefront Toronto

  • Developed a number of innovative techniques for multi-point / multi-handed input for rich manipulation of graphics and other visually represented objects

Graspable/Tangible Interfaces 1995

  • Input Research Group, University of Toronto

  • Demonstrated concept and later implementation of sensing the identity, location and even rotation of multiple physical devices on a digital desk-top display and using them to control graphical objects.

Active Desk 1995/97

  • Input Research Group / Ontario Telepresence Project,University of Toronto

  • Simultaneous bimanual and multi-finger interaction on large interactive display surface

T3 Wavefront 1997

  • A bimanual tablet-based system

  • Utilized a number of techniques that work equally well on multi-touch devices

Haptic Lens 1997

  • By Mike Sinclair, Georgia Tech / Microsoft Research

  • A multi-touch sensor that had the feel of clay, in that it deformed the harder you pushed, and resumed its basic form when released. A novel and very interesting approach to this class of device.

Fingerworks 1998

  • Based in Newark, Delaware

  • Produced a line of multi-touch products including the iGesture Pad. They supported a fairly rich library of multi-point / multi-finger gestures.

Portfolio Wall 1999

  • Alias|Wavefront, Toronto ON, Canada

  • A digital cork-board on which images could be presented as a group or individually.

  • Its interface was entirely based on finger touch gestures that went well beyond typical touch screen interfaces.

Diamond Touch 2001 [7]

  • Mitsubishi Research Labs, Cambridge MA

  • Capable of distinguishing which person's fingers/hands are which, as well as location and pressure

SmartSkin Sony 2002

  • An architecture for making interactive surfaces that are sensitive to human hand and finger gestures.

  • This sensor recognizes multiple hand positions and their shapes by using capacitive sensing and a mesh-shaped antenna.

  • In contrast to camera-based gesture recognition systems, all sensing elements can be integrated within the surface, and this method does not suffer from lighting and occlusion problems.

Jeff Han

  • FTIR multi-touch.

  • Very elegant implementation of a number of techniques and applications on a table-format rear-projection surface.

Video of Han

2007-2010: iPhone 4

iPhone & MacBook Air

Microsoft Surface


Computer Vision Based





Surface Acoustic Wave

Strain gauge

Optical Imaging

Dispersive Signal Technology

Acoustic Pulse

Frustrated Total Internal Reflection (FTIR)

Diffused Illumination (DI)


Shadow Capture


Frustrated Total Internal Reflection

  • Light emitting diodes produce light waves that travel through an acrylic touch screen. Under normal conditions, the light waves stay within the acrylic pane; however, when an object presses on the acrylic the light is scattered downward.

Diffused Illumination

  • An LED light source is aimed at the screen. When objects touch the acrylic surface, the light reflects back and is picked up by multiple infrared cameras.

  • DI has the added capability for object detection.

  • Most often used in high-end systems.

Capacitive Multi-Touch

  • Many layers: protective, bonding, driving, sensing, substrate, and display surfaces

  • Two methods of picking up the signal: mutual capacitance or self-capacitance

  • The sensor's 'normal' capacitance field is altered by the capacitance field of a human finger; electronic circuits measure the resultant distortion

Shadow Capture

  • Under normal conditions, light continually travels through the acrylic pane and reaches the camera; however, when an object presses on the acrylic a shadow is cast downward.

Architectural Overview

  • A camera is used to pick up the light reflected when a contact is made with the acrylic screen. The camera is attached to a computer running software that reacts to each touch.

  • A rear projector is used to display the output from the computer. The screen serves two general purposes: to display the image from the projector, and also to absorb all of the light from the projector so that the user on the other side of the acrylic is not blinded.


  • Any multi-touch interface requires computer software to manipulate the detected touches received by the camera. Additional programs are then needed to respond to each input and produce a desired output.

  • Touchlib is an open source library for creating multi-touch interaction surfaces. It handles tracking blobs of infrared light and sends programs multi-touch events, such as 'finger down', 'finger moved', and 'finger released'. It interfaces with most major types of webcams and video capture devices.

  • Flash open sound control (FLOSC) is also used to connect Touchlib, which sends out OSC messages.

  • Programs created for multi-touch are generally written in a form of C or Flash.

  • http://www.nuigroup.com
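The Touchlib event flow described above ('finger down', 'finger moved', 'finger released') can be sketched in application code roughly as follows. This is an illustrative Python sketch, not Touchlib's real C++ API: the `TouchTracker` class and its method names are assumptions, only the three event names come from the text.

```python
# Hypothetical sketch of consuming Touchlib-style multi-touch events.
# Event names mirror those the text mentions; the TouchTracker class
# itself is illustrative, not part of Touchlib's actual interface.

class TouchTracker:
    def __init__(self):
        self.active = {}          # blob id -> (x, y)

    def on_event(self, kind, blob_id, x=None, y=None):
        if kind in ("finger down", "finger moved"):
            self.active[blob_id] = (x, y)
        elif kind == "finger released":
            self.active.pop(blob_id, None)
        return len(self.active)   # current number of simultaneous touches

tracker = TouchTracker()
tracker.on_event("finger down", 1, 100, 200)
tracker.on_event("finger down", 2, 300, 240)
tracker.on_event("finger moved", 1, 110, 205)
n = tracker.on_event("finger released", 2)
print(n)  # 1 touch remains
```

A real application would register such a handler with the tracking library (or receive the same events as OSC messages via FLOSC) and map them to gestures.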

  • NUI

    • Wu

    • M. Wu and R. Balakrishnan, "Multi-finger and whole hand gestural interaction techniques for multi-user tabletop displays," in Proceedings of the 16th annual ACM symposium on User interface software and technology, Vancouver, Canada, 2003, pp. 193-202.

  • Wobbrock

  • J. O. Wobbrock, M. R. Morris, and A. D. Wilson, "User-defined gestures for surface computing," in Proceedings of the 27th international conference on Human factors in computing systems Boston, MA, USA: ACM, 2009.

  • Epps

  • J. Epps, S. Lichman, and M. Wu, "A study of hand shape use in tabletop gesture interaction," in CHI '06 extended abstracts on Human factors in computing systems, Montréal, Québec, Canada: ACM, 2006, pp. 748-753.

    • The "Fat Finger" problem

  • Selection strategies: Potter et al. (1988) compared Land-On and First-Contact strategies for touch selection

  • Area cursors: Kabbash and Buxton's Area Cursor, and Worden et al.'s follow-up work, enlarge the effective activation area of the cursor

  • Take-off techniques: Potter's Take-Off offsets the selection point from the finger; Vogel and Baudisch's Shift shows a callout of the occluded region

  • Control-display ratio techniques:

    • Olwal and Feiner adjust the control-display (C-D) ratio for precise selection

    • Benko et al.'s Dual Finger Stretch and Dual Finger X-Menu use a second finger to zoom or to adjust the C-D ratio

  • Widget-based techniques: Albinsson and Zhai proposed on-screen widgets for pixel-level target selection


  • CHI '10 Astrophysics

  • (NUI)

Drawbacks of Current System

  • Mainly based on point information.

    • Primarily use the center coordinates of the human finger's contact region.

    • Most interactions mainly rely on touch positions or variations in touch movements.

    • Relatively few research demonstrations have used auxiliary information other than touch position, such as:

      • the shape of the contact region

      • the size of the contact region

Same as the pen: Pressure, Tilt, Azimuth and Twist can widen the input bandwidth.

  • Tilt: 22°~90°

  • Azimuth: 0°~360°

  • Twist: 0°~360°

The number of degrees of freedom (DOF) of the finger far exceeds the number of finger input properties typically used in current touch-sensing interaction techniques: the (x, y) coordinates.

Properties of human hand

Four aspects of finger properties

  • Position property

  • Motion property

  • Physical property

  • Event property

Finger properties in our research

  • Orientation

  • Contact area

  • Contact shape

Four tasks in our research

  • Vertical tapping

  • Oblique tapping

  • Finger rocking

  • Finger orientation rotation

Apparatus of experiment

  • FTIR-based multi-touch surface

  • Color-printed A4 sheet of white paper as the operational interface

  • Camera: Philips SPC900NC

  • Resolution: 640×480

  • Scale of the system in the x and y axes: 0.4 mm/pixel

  • 30 frames per second


  • Number of participants: 12 (8 male, 4 female)

  • Age: 26-37

  • All right-handed

  • All had a little experience of using touch devices

  • The average physical width (W) and length (L) of finger tips (unit: mm)

Number of trials

  • 12 participants

    × 5 fingers (thumb, index, middle, ring, little)

    × 4 tasks (vertical tapping, oblique tapping, finger rocking, finger orientation rotation)

    × 6 repetitions

    = 1440 trials


  • 1. Three states in a touch

  • 2. Tapping Precision

  • 3. Finger Touch Area Shape

  • 4. Finger Touch Area size

  • 5. Finger Touch Area Orientation

Three states in a touch

  • Three states in a touch

    • Land on state

    • Stable state

    • Lift up state

  • How to determine the stable state

    • The width of the contact area is greater than a predetermined threshold

    • The length is greater than the width
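The stable-state rule above can be sketched as a small classifier. The width threshold value below is an illustrative assumption (the slides do not give the number); only the two conditions come from the text.

```python
# Sketch of the stable-state rule: a touch frame counts as "stable" once
# the contact width exceeds a threshold AND the contact length exceeds
# the width; otherwise the finger is still landing (or lifting).

WIDTH_THRESHOLD = 8  # pixels; illustrative value, not from the study

def touch_state(width, length, released):
    if released:
        return "lift up"
    if width > WIDTH_THRESHOLD and length > width:
        return "stable"
    return "land on"

print(touch_state(4, 5, False))    # land on: contact still too small
print(touch_state(10, 14, False))  # stable: wide enough, elongated
print(touch_state(0, 0, True))     # lift up
```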

Implications for design: three states in a touch

  • The first touch coordinate cannot be treated as the final touch position

  • Use the coordinates derived from the Stable state rather than from the Land On or Lift Up states

Empirical Evaluation for Finger Input Properties In Multi-touch Interaction. Feng Wang & Xiangshi Ren

Tapping precision: Scatter diagrams and normal distributions diagrams

  • (a): Scatter diagrams for the index finger in the vertical touch

  • (b): Scatter diagrams for the index finger in the oblique touch

  • (c): The distance from stable position to the target for the index finger in the vertical touch

  • (d): The distance from stable position to the target for the index finger in the oblique touch





Tapping precision: the tapping deviation data for the five fingers

ULCI: upper level of 95% confidence interval (unit: pixels, scale = 0.4 mm/pixel)

Shape of Finger Touch Area

  • The shape of the contact area can be approximately represented by the equation of a rectangle or an ellipse

  • Three parameters: width (minor axis), length (major axis), slant angle
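One standard way to recover those three ellipse parameters from a binary contact blob is via second-order image moments. This is a sketch of that general technique, not necessarily the exact fitting method used in the study; the test blob is synthetic.

```python
import math

# Estimate the contact ellipse (length = major axis, width = minor axis,
# slant angle) from a set of blob pixels using second-order central
# moments of the pixel distribution.

def fit_ellipse(pixels):
    n = len(pixels)
    cx = sum(x for x, y in pixels) / n
    cy = sum(y for x, y in pixels) / n
    mxx = sum((x - cx) ** 2 for x, y in pixels) / n
    myy = sum((y - cy) ** 2 for x, y in pixels) / n
    mxy = sum((x - cx) * (y - cy) for x, y in pixels) / n
    # Eigenvalues of the covariance matrix give the axis lengths;
    # the dominant eigenvector gives the slant angle.
    common = math.sqrt((mxx - myy) ** 2 + 4 * mxy ** 2)
    l1 = (mxx + myy + common) / 2
    l2 = (mxx + myy - common) / 2
    angle = 0.5 * math.atan2(2 * mxy, mxx - myy)
    length = 4 * math.sqrt(l1)   # major axis
    width = 4 * math.sqrt(l2)    # minor axis
    return length, width, math.degrees(angle)

# Elongated synthetic blob along x: angle near 0, length > width.
blob = [(x, y) for x in range(20) for y in range(8)]
length, width, angle = fit_ellipse(blob)
print(round(length, 1), round(width, 1), round(angle, 1))
```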

(units: VA and OA in pixels², range of orientation in degrees)

The Size and Orientation of Finger Touch Area

  • The oblique-touch area (OA) is at least 5.5 times the vertical-touch area (VA)

  • The range of orientation is more than 100 degrees


  • One potentially accessible piece of additional information is finger orientation

    • Orientation is a natural cue, as it provides the direction a user is pointing in

  • Orientation vector

    • Direction

    • An angle from a point of reference

  • The sensed orientation is ambiguous (not resolved in MS Surface, DI)

Our Goal

  • Present a novel and robust algorithm that accurately and unambiguously detects the orientation vector by considering the dynamics of finger contact.

  • Based on contact information only.


  • Detects the directed orientation vector of the user's finger, based on the shape of the finger contact

  • Two types of finger touch on interactive surfaces

    • vertical touch

    • oblique touch

Fitting Contact Shape

  • Algorithm

    • Fitting Contact Shape

      • Elliptical Equation Fitting -> Length, Width

    • Identifying Oblique Touch

      • Area (≥120 mm²)

      • Aspect ratio (≥1.2)

    • Disambiguating Finger Direction
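The oblique-touch test in the middle step can be sketched directly from the two thresholds given above. The ellipse-area formula below is an assumption (the slides give only the thresholds); it takes the fitted length and width as full major/minor axes in mm.

```python
import math

# Sketch of the oblique-touch test: a contact is classified as oblique
# when its area and aspect ratio both exceed the stated thresholds
# (area >= 120 mm^2, aspect ratio >= 1.2).

def is_oblique(length_mm, width_mm):
    area = math.pi * (length_mm / 2) * (width_mm / 2)  # ellipse area
    aspect = length_mm / width_mm
    return area >= 120 and aspect >= 1.2

print(is_oblique(18, 11))  # large, elongated contact -> oblique
print(is_oblique(10, 9))   # small, round contact -> vertical
```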

Disambiguating Finger Direction

  • The human finger has soft and deformable tissues.

  • Distortion of the finger muscle is inevitable upon contact

  • It is apparent that the center point of the finger contact moves inward, towards the user's palm.

  • Variation of the contact center between two consecutive frames t-1 (blue) and t (red).

  • Frame t-1 is the last frame of non-oblique touch state and frame t is the first frame of oblique touch state.
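The disambiguation step described above can be sketched as follows: since the contact center shifts toward the palm when the touch turns oblique, the finger must point roughly opposite to that shift, which selects one of the two candidate directions. The function below is an illustrative sketch, not the paper's exact implementation.

```python
import math

# Of the two candidate directions (theta and theta + 180 deg) for an
# undirected contact axis, keep the one that opposes the displacement
# of the contact center between frames t-1 and t.

def disambiguate(theta_deg, center_prev, center_now):
    dx = center_now[0] - center_prev[0]
    dy = center_now[1] - center_prev[1]
    for cand in (theta_deg % 360, (theta_deg + 180) % 360):
        vx = math.cos(math.radians(cand))
        vy = math.sin(math.radians(cand))
        if vx * dx + vy * dy < 0:   # candidate opposes the displacement
            return cand
    return theta_deg % 360          # no displacement: keep as-is

# Undirected axis at 0/180 deg; the center moved in +x (toward the
# palm), so the finger must point in -x, i.e. 180 deg.
print(disambiguate(0, (100, 100), (104, 100)))  # 180
```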

Continual Orientation Tracking

  • In every subsequent frame t+1, the directed finger orientation in the previous frame (t) is used as the cue to disambiguate the current undirected finger orientation (t+1)
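The tracking rule above reduces to picking, of the two candidates for the current undirected orientation, the one closest to the previous frame's directed orientation. A minimal sketch:

```python
# Resolve an undirected orientation (known only modulo 180 deg) by
# choosing the candidate closest, on the circle, to the directed
# orientation from the previous frame.

def track(prev_dir_deg, undirected_deg):
    cands = (undirected_deg % 360, (undirected_deg + 180) % 360)

    def ang_dist(a, b):
        d = abs(a - b) % 360
        return min(d, 360 - d)

    return min(cands, key=lambda c: ang_dist(c, prev_dir_deg))

print(track(170, 10))   # 190: closer to 170 than 10 is
print(track(20, 10))    # 10
```

Because fingers rotate continuously, the inter-frame change at 30 fps stays well below the 90° that would make this rule ambiguous.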


  • Goal

    • Evaluation to assess the performance of our finger orientation detection algorithm: stability and precision in determining the orientation of static and moving fingers.

  • Task

    • Four tasks, each examining a different aspect of the algorithm.

    • No visual feedback was given concerning the orientation of the finger as detected by the algorithm.

    • As a result, participants had to rely completely on their subjective perception of the finger orientation.

  • Apparatus

    • Direct-touch tabletop surface based on FTIR.

    • A camera with a resolution of 640×480 pixels and a capture rate of 30 fps.

  • Participants

    • Eight volunteers

  • Task 1: Static Orientation Stability.

  • Task 2: Static Orientation Precision (165°, 150°, 135°, and 120°)

  • Task 3: Dynamic Orientation Precision.

  • Task 4 : Involuntary Position Variation in Rotation.


  • The algorithm is very stable.

  • The average variation range during each finger dwelling period is 0.59° (std. dev. = 0.15°); in practice we can ignore finger orientation changes of less than 1°.

  • Our algorithm can provide a static precision within approximately 5°. Across the complete 360° orientation range, this gives 36 usable orientation levels.

  • The average dynamic orientation error (absolute value) is 14.35° (std. dev. = 9.53°).

  • The average position variation during finger rotation is approximately 2.00 mm.

  • For all tasks, the disambiguation algorithm generated 13 errors in total.

  • This resulted in a success rate of 96.7% (384 out of 397 trials), indicating good performance of the algorithm.

Finger Orientation Design Impact

  • Interactions Using Finger Orientation

    • Enhancing Target Acquisition

    • Orientation-Sensitive Widgets

  • Inferences from Orientation Information

    • Estimating Occlusion Region

    • Inferring User Position

    • Relationship between Multiple Fingers

    • Enabling Orientation-Invariant Input

Interactions Using Finger Orientation

  • Enhancing Target Acquisition

    • Directed Bubble Cursor

      • applying different multiplying weights to the target distances based on each target's azimuth angle relative to the finger center and orientation.
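The weighting idea above can be sketched as follows. The linear weight function below is an assumption for illustration, not the paper's exact formula: targets roughly "in front of" the finger get a smaller effective distance and so are easier to acquire.

```python
import math

# Sketch of a directed-cursor selection rule: each target's distance is
# multiplied by a weight that grows with the angle between the finger
# orientation and the direction to the target. The weight function is
# hypothetical.

def pick_target(finger_xy, finger_dir_deg, targets):
    fx, fy = finger_xy
    best, best_score = None, float("inf")
    for tx, ty in targets:
        dist = math.hypot(tx - fx, ty - fy)
        azimuth = math.degrees(math.atan2(ty - fy, tx - fx))
        delta = abs((azimuth - finger_dir_deg + 180) % 360 - 180)
        weight = 1.0 + delta / 90.0      # hypothetical weighting
        score = dist * weight
        if score < best_score:
            best, best_score = (tx, ty), score
    return best

# Finger at the origin pointing along +x: the slightly farther target
# ahead of the finger beats the nearer one behind it.
print(pick_target((0, 0), 0, [(50, 0), (-40, 0)]))
```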

Interactions Using Finger Orientation

  • Enhancing Target Acquisition

    • Aim and grab

Interactions Using Finger Orientation

  • Orientation-Sensitive Widgets

Inferences From Finger Orientation

  • Estimating Occlusion Region

    • Occlusion region: a circular sector opposite to the finger orientation, with the vertex at the center of the fingertip (x, y)

    • The central angle is approximately 60°
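The sector model above yields a simple point-in-occlusion test. The sector radius below is an assumption added to make the region bounded; the vertex-at-fingertip and ~60° central angle come from the text.

```python
import math

# Test whether a screen point falls in the estimated occlusion region:
# a circular sector with its vertex at the fingertip, opening opposite
# to the finger orientation, with a ~60 deg central angle.

def is_occluded(point, fingertip, finger_dir_deg,
                angle_deg=60.0, radius=200.0):
    dx = point[0] - fingertip[0]
    dy = point[1] - fingertip[1]
    if math.hypot(dx, dy) > radius:     # radius is an assumed bound
        return False
    opposite = (finger_dir_deg + 180) % 360
    azimuth = math.degrees(math.atan2(dy, dx))
    delta = abs((azimuth - opposite + 180) % 360 - 180)
    return delta <= angle_deg / 2

# Finger points along +x, so the occluded sector opens toward -x.
print(is_occluded((-50, 5), (0, 0), 0))   # True: behind the fingertip
print(is_occluded((50, 0), (0, 0), 0))    # False: in front of it
```

A UI can use this test to avoid placing pop-up widgets inside the occluded sector.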

Inferences From Finger Orientation

  • Inferring User Position

    • The user sits along either one of the two long sides of the tabletop.

    • Knowing the orientation of the finger touch, we can infer that the operating user is sitting at the side opposite to the finger orientation.

Inferences From Finger Orientation

  • Relationship between Multiple Fingers

    • Exploit this information to infer the relationship between multiple fingers on the surface.

    • This location is usually on the opposite side of the directions pointed by all fingers, and within a reasonable distance from the positions of the fingertips

Inferences From Finger Orientation

  • we calculate the intersection point I of the two straight lines aligned with their positions and orientations.
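The computation above is a standard two-line intersection: each finger defines a line through its contact position along its orientation, and the intersection point I approximates where the hand is. A minimal sketch:

```python
import math

# Intersect the two lines defined by each finger's position and
# orientation; the intersection approximates the hand position.

def intersect(p1, dir1_deg, p2, dir2_deg):
    d1 = (math.cos(math.radians(dir1_deg)), math.sin(math.radians(dir1_deg)))
    d2 = (math.cos(math.radians(dir2_deg)), math.sin(math.radians(dir2_deg)))
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        return None  # parallel fingers: no single intersection
    t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Two fingers at (0, 0) and (10, 0), oriented at 45 and 135 degrees,
# meet at (5, 5).
print(intersect((0, 0), 45, (10, 0), 135))
```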

Inferences From Finger Orientation

  • Enabling Orientation-Invariant Input

    • The orientation of the input gesture can be normalized by a compensated rotation determined by the average finger orientation while performing the gesture.

  • Multi-finger mouse emulation

    • Matejka et al. presented SDMouse

    • By considering the orientation of the index finger, we can unambiguously associate fingers to buttons located in a reachable location regardless of the user's position
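The orientation-invariant normalization described above can be sketched as a compensating rotation of the recorded gesture points by the average finger orientation. This is an illustrative sketch under that reading of the text.

```python
import math

# Normalize a gesture trace by rotating it by the negative of the
# average finger orientation, so the same gesture performed from any
# side of the table yields the same canonical trace.

def normalize_gesture(points, avg_orientation_deg):
    a = math.radians(-avg_orientation_deg)  # compensating rotation
    c, s = math.cos(a), math.sin(a)
    return [(x * c - y * s, x * s + y * c) for x, y in points]

# A stroke performed by a finger oriented at 90 deg is rotated onto
# the gesture's own canonical axis.
pts = normalize_gesture([(0, 0), (0, 10)], 90)
print([(round(x, 2), round(y, 2)) for x, y in pts])
```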

Algorithm Limitations

  • Assumes an oblique touch

  • The orientation disambiguation step relies on the finger center displacement during the finger landing process and assumes that this displacement is caused solely by the deformation of the finger.

  • Works for all fingers except the thumb

Technology Compatibility

  • Variety of other sensing technologies

    • Capacitance-based sensing

    • Embedded optical sensor arrays

    • FTIR-based devices

  • Touch interaction moves beyond the WIMP paradigm

  • Key characteristics: seamless interaction, invisibility, direct manipulation

  • Thanks for your attention.


  • 13700600260

  • QQ: 1946899

  • Skype: cnwangfeng

  • E-mail: wangfeng@acm.org
