

Touch and Multi-touch Interaction Technology

Wang Feng, Kunming University of Science and Technology

E-mail: [email protected]

Phone: 13700600260


Outline

  • Natural human-computer interaction (NUI)

  • History of touch technology

  • Focus and goals of touch research

  • Our work


HCI – UI

  • User interface (UI) technology is one of the hottest topics in HCI research.

  • The arrival of each new interaction device triggers an important shift in UI technology.

  • In a human-machine system, the UI handles the computer's input and output, produces the necessary feedback, and directly affects how end users experience the system.



What is HCI?

  • User: "Am I the error again?" Are computers for people, or people for computers?



Where are we going?


  • The evolution of user interfaces:

    • Batch processing

    • Command-line interfaces (CLI)

    • Graphical user interfaces (GUI)

    • Natural user interfaces (NUI)



  • [1] J. Canny, "The Future of Human-Computer Interaction," Queue, vol. 4, pp. 24-32, 2006.

    [2] B. Myers, "A brief history of human-computer interaction technology," interactions, vol. 5, pp. 44-54, 1998.



  • Shneiderman: direct manipulation



Why NUI?

  • GUIs (MacOS, Windows) have become the mainstream user interface

  • Current GUIs follow the WIMP (windows, icons, menus, pointer) paradigm, and WIMP has inherent limitations


  • On UI design, see Alan Cooper, The Essentials of Interaction Design


    • Mark Weiser: the vision of ubiquitous computing


  • References on natural interfaces (NUI):

  • N. Cross, "Natural intelligence in design," Design Studies, vol. 20, pp. 25-39, 1999.

  • C. Nass and B. Reeves, "Social and natural interfaces: theory and design," in CHI '97 extended abstracts on Human factors in computing systems: looking to the future, Atlanta, Georgia, 1997, pp. 192-193.



Input Device - Vision/Goals (1945-2015)

Immediate / Intermediate / Long-term

  • Combined speech recognition, character recognition

  • Pen editing

  • Heuristic programming

  • Ubiquitous computing

  • Time sharing

  • Electronic I/O

  • Interactive, real- time system

  • Large scale information storage and retrieval

  • Natural language understanding

  • Speech recognition of arbitrary users

  • Natural user interface

  • Internet of things (C2C, H2C, H2H, H2C2H)



Input Devices (overview)

Sensor Devices

1. Spatial Position/Orientation Sensors

2DOF (Mouse)

3DOF (MicroScribe, FreeD Joystick)

6DOF (Polhemus Fastrak)

2. Directional Force Sensors

5 DOF (Spacemouse)

2 DOF (Joystick)

3. Gesture Recognition

Data Gloves

4. Eye Tracking

5. Speech Recognition Systems



The First Mouse (1964)

Knee control

Douglas Engelbart

Years before personal computers and desktop information processing became commonplace or even practicable, Douglas Engelbart had invented a number of interactive, user-friendly information access systems that we take for granted today: the computer mouse, windows, shared-screen teleconferencing, hypermedia, groupware, and more.



Input Devices (1)

Directional Force Sensors

SpaceMaster



Input Devices (2)

Gesture Recognition

Dextrous Hand Master, Exos

SUPERGLOVE, Nissho

CyberGlove, 5th Dimension



Input Devices (3)

Spatial Position/Orientation Sensors

Polhemus InsideTrack

(Magnetic Tracking)

MicroScribe

(Mechanical Tracking)

FreeD Joystick

(Ultrasonic Tracking)



Input Devices (4)

Visual Haptic Workbench

The Visual Haptic Workbench consists of five hardware components.

The dominant hand of the user experiences haptic feedback from the PHANToM, and the subdominant hand navigates through a menu interface via Pinch glove contact gestures. Head tracking is done with a Polhemus Fastrak receiver mounted on a pair of Stereographics CrystalEyes LCD shutter glasses. The subdominant hand can also be tracked with a separate receiver to facilitate more complex interaction paradigms. The audio subsystem gives the user additional reinforcement cues to clarify

[see also http://haptic.mech.nwu.edu/intro/gallery/]



Historical Overview (1945-1995)

[source: Brad A. Myers (1998). A brief history of human-computer interaction technology. Interactions, vol 5(2), pp. 44-54]


Outline

  • Natural human-computer interaction (NUI)

  • History of touch technology

  • Focus and goals of touch research

  • Our work



Touch / Multi-touch

    • Natural affordances

    • Offer a more compelling method to interact with a system than a mouse or other types of pointing devices


  • Nakatani and Rohrlich's "soft machines" (1983) laid a philosophical foundation for touch-driven interfaces:

    • L. H. Nakatani and J. A. Rohrlich, "Soft machines: A philosophy of user-computer interface design," in Proceedings of the SIGCHI conference on Human Factors in Computing Systems, Boston, Massachusetts, United States, 1983, pp. 19-23.



History of Multi-touch

  • 1982 - The first multi-touch system, called the Flexible Machine Interface, was developed at the University of Toronto.

  • 1983 - Bell Labs (Murray Hill, NJ) published the first paper discussing touch-screen based interfaces.



Video Place

  • 1983 - Video Place / Video Desk (Myron Krueger): a vision-based system that tracked hands and enabled multiple fingers, hands, and people to interact using a rich set of gestures

  • Krueger's work had a staggeringly rich repertoire of gestures: multi-finger, multi-hand, and multi-person interaction.



History of Multi-touch

  • 1984 - Multi-touch screen (Bell Labs, Murray Hill, NJ): integrated with a CRT on an interactive graphics terminal; could manipulate graphical objects with fingers, with excellent response time.

  • Microsoft began research in this area.



Multitouch Tablet 1985

  • Input Research Group, University of Toronto

  • Touch tablet capable of sensing an arbitrary number of simultaneous touch inputs, reporting both location and degree of touch for each



Sensor Frame (Carnegie Mellon University)

  • The device used optical sensors in the corners of the frame to detect fingers.

  • At the time this was done, miniature cameras were essentially unavailable, so the device used DRAM ICs with glass (as opposed to opaque) covers for imaging.

  • It could sense up to three fingers at a time fairly reliably (though, due to the optical technique used, shadows could cause misreadings).



1986: Bi-Manual Input

  • 1986: Bi-manual input (University of Toronto) - supported simultaneous position/scale and selection/navigation tasks



Apple Desktop Bus

  • 1986: Apple Desktop Bus (ADB) and the trackball scroller init (Apple Computer / University of Toronto)

  • e.g., the Macintosh II and Macintosh SE



Digital Desk 1991

  • Digital Desk (Pierre Wellner, Rank Xerox EuroPARC, Cambridge) - supported multi-finger and pinching motions (a lineage that leads to modern products such as the iPhone)



Flip Keyboard - 1992

  • Bill Buxton, Xerox PARC

  • A multi-touch pad integrated into the bottom of a keyboard. You flip the keyboard to gain access to the multi-touch pad for rich gestural control of applications.

  • Graphics on multi-touch surface defining controls for various virtual devices.



History of Multi-touch

  • 1992: Simon (IBM & BellSouth) - a single-touch device that relied on a touch-screen-driven soft machine

  • 1992: Wacom (Japan) - tablet that had multi-device/multi-point sensing capability



Starfire - inconceivable [5]

  • 1992: Starfire (Bruce Tognazzini, Sun Microsystems) - Tognazzini produced a future-envisionment film, Starfire, that included a number of multi-hand, multi-finger interactions, including pinching


Starfire Video



Bimanual 1994-2002

  • Alias|Wavefront, Toronto

  • Developed a number of innovative techniques for multi-point / multi-handed input for rich manipulation of graphics and other visually represented objects



Graspable/Tangible Interfaces 1995

  • Input Research Group, University of Toronto

  • Demonstrated concept and later implementation of sensing the identity, location and even rotation of multiple physical devices on a digital desk-top display and using them to control graphical objects.



Active Desk 1995/97

  • Input Research Group / Ontario Telepresence Project, University of Toronto

  • Simultaneous bimanual and multi-finger interaction on large interactive display surface



T3 Wavefront 1997

  • A bimanual tablet-based system

  • Utilized a number of techniques that work equally well on multi-touch devices



Haptic Lens 1997

  • By Mike Sinclair, Georgia Tech / Microsoft Research

  • A multi-touch sensor that had the feel of clay, in that it deformed the harder you pushed and resumed its basic form when released. A novel and very interesting approach to this class of device.



Fingerworks 1998

  • Based in Newark, Delaware

  • Produced a line of multi-touch products including the iGesture Pad. They supported a fairly rich library of multi-point / multi-finger gestures.



Portfolio Wall 1999

  • Alias|Wavefront, Toronto, ON, Canada

  • A digital cork-board on which images could be presented as a group or individually.

  • Its interface was entirely based on finger touch gestures that went well beyond typical touch screen interfaces.



Diamond Touch 2001 [7]

  • Mitsubishi Electric Research Labs (Cambridge, MA)

  • Capable of distinguishing which person's fingers/hands are which, as well as location and pressure



SmartSkin Sony 2002

  • An architecture for making interactive surfaces that are sensitive to human hand and finger gestures.

  • This sensor recognizes multiple hand positions and their shapes by using capacitive sensing and a mesh-shaped antenna.

  • In contrast to camera-based gesture recognition systems, all sensing elements can be integrated within the surface, and this method does not suffer from lighting and occlusion problems.



Jeff Han

  • FTIR multi-touch.

  • A very elegant implementation of a number of techniques and applications on a table-format rear-projection surface.



Video of Han


2007-2010: iPhone / iPhone 4


iPhone & MacBook Air



Microsoft Surface



Video



Computer Vision Based

FTIR

SURFACE




Outline

  • Natural human-computer interaction (NUI)

  • History of touch technology

  • Focus and goals of touch research

  • Our work


Implementations

Resistive

Surface Acoustic Wave

Strain gauge

Optical Imaging

Dispersive Signal Technology

Acoustic Pulse

Frustrated Total Internal Reflection (FTIR)

Diffused Illumination (DI)

Capacitive

Shadow Capture




Frustrated Total Internal Reflection

  • Light emitting diodes produce light waves that travel through an acrylic touch screen. Under normal conditions, the light waves stay within the acrylic pane; however, when an object presses on the acrylic the light is scattered downward.



Diffused Illumination

  • An LED light source is aimed at the screen. When objects touch the acrylic surface, the light reflects back and is picked up by multiple infrared cameras.

  • DI has the added capability for object detection.

  • Most often used in high-end systems.



Capacitive Multi-Touch

  • Many layers: protective, bonding, driving, sensing, substrate, and display surfaces

  • Two methods of picking up the signal: mutual capacitance or self-capacitance

  • The sensor's "normal" capacitance field is altered by the capacitance field of a human finger; electronic circuits measure the resulting distortion
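As a rough illustration of this measurement principle, the sketch below locates a touch by taking the signal-weighted centroid of the distorted sensor cells. The grid shape and threshold are hypothetical, not from the slides; Python/NumPy is used here purely for illustration.

    import numpy as np

    def touch_centroid(delta, threshold=5.0):
        """Estimate one touch position from a grid of capacitance changes.

        delta: 2-D array holding |baseline - measured| per sensor cell
        (hypothetical units); threshold is an assumed noise floor.
        Returns (row, col) in fractional cell units, or None if no touch.
        """
        mask = delta > threshold            # cells distorted by the finger
        if not mask.any():
            return None
        w = np.where(mask, delta, 0.0)      # weight cells by signal strength
        rows, cols = np.indices(delta.shape)
        return (rows * w).sum() / w.sum(), (cols * w).sum() / w.sum()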



Shadow Capture

  • Under normal conditions, light continually travels through the acrylic pane and reaches the camera; however, when an object presses on the acrylic a shadow is cast downward.



Architectural Overview

  • A camera is used to pick up the light reflected when a contact is made with the acrylic screen. The camera is attached to a computer running software that reacts to each touch.

  • A rear projector is used to display the output from the computer. The screen serves two general purposes: to display the image from the projector, and also to absorb all of the light from the projector so that the user on the other side of the acrylic is not blinded.
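A minimal sketch of the camera side of this pipeline, assuming an OpenCV-readable infrared camera; the threshold and minimum blob area are hypothetical values the slides do not specify.

    import cv2

    def detect_touches(frame_gray, min_area=30):
        """Find bright IR blobs (finger contacts) in one grayscale frame."""
        blur = cv2.GaussianBlur(frame_gray, (5, 5), 0)      # suppress sensor noise
        _, binary = cv2.threshold(blur, 60, 255, cv2.THRESH_BINARY)
        n, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)
        touches = []
        for i in range(1, n):                               # label 0 = background
            if stats[i, cv2.CC_STAT_AREA] >= min_area:      # ignore specks
                touches.append(tuple(centroids[i]))         # blob centre (x, y)
        return touches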



Software

  • Any multi-touch interface requires computer software to manipulate the detected touches received by the camera. Additional programs are then needed to respond to each input and produce a desired output.

  • Touchlib is an open source library for creating multi-touch interaction surfaces. It handles tracking blobs of infrared light and sends programs multi-touch events, such as 'finger down', 'finger moved', and 'finger released'. It interfaces with most major types of webcams and video capture devices.

  • Flash Open Sound Control (FLOSC) is used to bridge Touchlib, which sends out OSC messages, to Flash applications.

  • Programs created for multi-touch are generally written in a form of C or Flash.

  • http://www.nuigroup.com
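To make the event flow concrete, here is a minimal listener for OSC/TUIO messages of the kind such a pipeline emits. It uses the third-party python-osc package rather than Touchlib's own C++ API, and assumes the conventional TUIO port 3333 and the /tuio/2Dcur cursor profile.

    from pythonosc import dispatcher, osc_server

    def on_cursor(address, *args):
        # TUIO '2Dcur' bundles contain 'set' (update), 'alive', and 'fseq'.
        if args and args[0] == "set":
            session_id, x, y = args[1], args[2], args[3]
            print(f"finger {session_id} at ({x:.3f}, {y:.3f})")

    d = dispatcher.Dispatcher()
    d.map("/tuio/2Dcur", on_cursor)          # cursor (finger) profile
    server = osc_server.BlockingOSCUDPServer(("127.0.0.1", 3333), d)
    server.serve_forever()                   # prints events until interrupted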


Outline

  • Natural human-computer interaction (NUI)

  • History of touch technology

  • Focus and goals of touch research

  • Our work


  • Wu and Balakrishnan: multi-finger and whole-hand gestural interaction techniques for multi-user tabletop displays

    • M. Wu and R. Balakrishnan, "Multi-finger and whole hand gestural interaction techniques for multi-user tabletop displays," in Proceedings of the 16th annual ACM symposium on User interface software and technology, Vancouver, Canada, 2003, pp. 193-202.


  • Wobbrock et al.: user-defined gestures for surface computing

  • J. O. Wobbrock, M. R. Morris, and A. D. Wilson, "User-defined gestures for surface computing," in Proceedings of the 27th international conference on Human factors in computing systems Boston, MA, USA: ACM, 2009.


  • Epps et al.: a study of hand shape use in tabletop gesture interaction

  • J. Epps, S. Lichman, and M. Wu, "A study of hand shape use in tabletop gesture interaction," in CHI '06 extended abstracts on Human factors in computing systems, Montréal, Québec, Canada: ACM, 2006, pp. 748-753.


  • The "fat finger" problem: the fingertip occludes small targets, and the exact contact point is ambiguous


  • 1) Tapping strategies: Potter et al. (1988) compared the Land-On and First-Contact selection strategies

  • Reaching about 95% selection accuracy required targets of roughly 1.2 cm

  • Kabbash and Buxton proposed the Area Cursor, which Worden et al. later extended


  • 2) Take-Off: Potter et al.'s Take-Off strategy selects the target on finger lift-off; Vogel and Baudisch's Shift shows the occluded area in a callout


  • 3) Adjusting the control-display ratio:

    • Olwal and Feiner proposed techniques that adjust the control-display (C-D) ratio for precise pointing

    • Benko et al.'s Dual Finger Stretch and Dual Finger X-Menu use a second finger to adjust the C-D ratio


  • 4) Widget-based techniques: Albinsson and Zhai proposed precision-selection widgets capable of one-pixel targeting



Multi-touch

  • Example: a CHI 2010 multi-touch astrophysics demonstration


Outline

  • Natural human-computer interaction (NUI)

  • History of touch technology

  • Focus and goals of touch research

  • Our work



Drawbacks of Current System

  • Mainly based on point information.

    • They primarily use the center coordinates of the finger's contact region.

    • Most interactions mainly rely on touch positions or variations in touch movements.

    • Relatively few research demonstrations have used auxiliary information beyond touch position, such as:

      • the shape of the contact region

      • the size of the contact region


Same as Pen.

Pressure, tilt, azimuth, and twist widen the input bandwidth:

  • Pressure (0~1023)

  • Tilt (22°~90°)

  • Azimuth (0°~360°)

  • Twist (0°~360°)


Properties of human hand

  • Degrees of freedom (DOF) of the human hand: 23

  • Finger input properties typically used in current touch-sensing interaction techniques: only the (x, y) coordinates



Four aspects of finger properties

  • Position property

  • Motion property

  • Physical property

  • Event property



Finger properties in our research

  • Orientation

  • Contact area

  • Contact shape


Four tasks in our research

  • Vertical tapping

  • Oblique tapping

  • Finger rocking

  • Orientation rotation



Apparatus of experiment

  • FTIR-based multi-touch device

  • Color-printed A4 sheet of white paper as the operational interface

  • Camera: Philips SPC900NC

  • Resolution: 640×480

  • Scale of the system in the x and y axes: 0.4 mm/pixel

  • Capture rate: 30 frames per second



Participants

  • Number of participants: 12 (8 male, 4 female)

  • Age: 26-37

  • All right-handed

  • All had a little experience using touch devices

  • The average physical width (W) and length (L) of fingertips (unit: mm)


Number of trials

  12 participants

  × 5 fingers (thumb, index, middle, ring, little)

  × 4 tasks (vertical tapping, oblique tapping, finger rocking, finger orientation rotation)

  × 6 repetitions

  = 1440 trials



Results

  • 1. Three states in a touch

  • 2. Tapping Precision

  • 3. Finger Touch Area Shape

  • 4. Finger Touch Area Size

  • 5. Finger Touch Area Orientation



Three states in a touch

  • Three states in a touch

    • Land on state

    • Stable state

    • Lift up state

  • How to determine the stable state

    • The width of the contact area is greater than a predetermined threshold

    • The length is greater than the width
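A minimal sketch of these two rules as a per-frame classifier. The width threshold is a hypothetical value in pixels (the slides give no number), and lift-up is inferred from having already seen a stable frame.

    def classify_frames(widths, lengths, width_threshold=8.0):
        """Label each frame of one touch as 'land on', 'stable', or 'lift up'."""
        states, seen_stable = [], False
        for w, l in zip(widths, lengths):
            if w > width_threshold and l > w:   # the two rules from the slide
                seen_stable = True
                states.append("stable")
            else:
                # Non-stable frames before the stable phase are the landing;
                # after it, they belong to the finger lifting off.
                states.append("lift up" if seen_stable else "land on")
        return states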



Implications for design: three states in a touch

  • The first touch coordinate cannot be treated as the final touch position

  • Use the coordinates derived from the Stable state rather than from the Land On or Lift Up states

[source: Feng Wang & Xiangshi Ren, Empirical Evaluation for Finger Input Properties in Multi-touch Interaction]



Tapping precision: Scatter diagrams and normal distributions diagrams

  • (a): Scatter diagrams for the index finger in the vertical touch

  • (b): Scatter diagrams for the index finger in the oblique touch

  • (c): The distance from stable position to the target for the index finger in the vertical touch

  • (d): The distance from stable position to the target for the index finger in the oblique touch




Tapping precision: the tapping deviation data for the five fingers

ULCI: upper limit of the 95% confidence interval (unit: pixels; scale = 0.4 mm/pixel)



Shape of Finger Touch Area

  • The shape of the contact area can be approximated by the equation of a rectangle or an ellipse

  • Three parameters: width (minor axis), length (major axis), and slant angle
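One standard way to recover these three parameters is an image-moment ellipse fit over the binary contact region; the sketch below assumes NumPy, a 0/1 contact mask, and the common 2-sigma axis convention (the slides do not name a specific fitting method).

    import numpy as np

    def fit_contact_ellipse(mask):
        """Return (length, width, slant_deg) of an ellipse fitted to a
        binary contact region via second-order image moments."""
        ys, xs = np.nonzero(mask)
        x0, y0 = xs.mean(), ys.mean()
        mu20 = ((xs - x0) ** 2).mean()
        mu02 = ((ys - y0) ** 2).mean()
        mu11 = ((xs - x0) * (ys - y0)).mean()
        # Eigenvalues of the covariance matrix give the squared axis scales.
        common = np.sqrt(4 * mu11 ** 2 + (mu20 - mu02) ** 2)
        length = 4 * np.sqrt((mu20 + mu02 + common) / 2)   # major axis
        width = 4 * np.sqrt((mu20 + mu02 - common) / 2)    # minor axis
        slant = 0.5 * np.degrees(np.arctan2(2 * mu11, mu20 - mu02))
        return length, width, slant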


The Size and Orientation of Finger Touch Area

(unit: VA and OA = pixels²; range of orientation = degrees)

  • The oblique-touch area (OA) is at least 5.5 times the vertical-touch area (VA)

  • The range of orientation is more than 100 degrees



Orientation

  • One potentially accessible piece of information

    • Orientation is a natural cue: it indicates the direction in which the user is pointing

  • Orientation vector

    • A direction

    • An angle from a point of reference

  • Undirected orientation is ambiguous (not resolved in MS Surface or DI systems)



Our Goal

  • Present a novel and robust algorithm that accurately and unambiguously detects the orientation vector by considering the dynamics of finger contact.

  • Based on contact information only.



FINGER ORIENTATION DETECTION ALGORITHM

  • Detects the directed orientation vector of the user's finger, based on the shape of the finger contact

  • Two types of finger touch on interactive surfaces

    • vertical touch

    • oblique touch



Fitting Contact Shape

  • Algorithm

    • Fitting Contact Shape

      • Elliptical Equation Fitting -> Length, Width

    • Identifying Oblique Touch

      • Area (≥ 120 mm²)

      • Aspect ratio (≥ 1.2) (both tests sketched in code below)

    • Disambiguating Finger Direction
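A sketch of the oblique-touch test using the two thresholds above. The ellipse-area formula and the reuse of the 0.4 mm/pixel scale from the apparatus section are our assumptions about how the slide's numbers are applied.

    import math

    SCALE_MM_PER_PIXEL = 0.4   # from the apparatus section of this deck

    def is_oblique_touch(length_px, width_px):
        """Classify a fitted contact ellipse as an oblique touch."""
        length_mm = length_px * SCALE_MM_PER_PIXEL
        width_mm = width_px * SCALE_MM_PER_PIXEL
        area_mm2 = math.pi / 4 * length_mm * width_mm   # area of the ellipse
        aspect = length_px / width_px
        return area_mm2 >= 120.0 and aspect >= 1.2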



Disambiguating Finger Direction

  • The human finger has soft and deformable tissues.

  • Distortion of the finger muscle is inevitable upon contact

  • It is apparent that the center point of the finger contact moves inward, towards the user's palm.



  • Variation of the contact center between two consecutive frames t-1 (blue) and t (red).

  • Frame t-1 is the last frame of non-oblique touch state and frame t is the first frame of oblique touch state.
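One plausible reading of this cue in code: on the first oblique frame, keep whichever of the two candidate directions points against the observed centre displacement, since the centre slides toward the palm, away from where the finger points. A sketch, assuming angles in radians:

    import numpy as np

    def directed_orientation(theta, center_prev, center_now):
        """Resolve the 180-degree ambiguity of an ellipse orientation
        using the contact-centre displacement between frames t-1 and t."""
        d = np.asarray(center_now, float) - np.asarray(center_prev, float)
        v = np.array([np.cos(theta), np.sin(theta)])   # candidate direction
        # The centre moves toward the palm, i.e. opposite to pointing.
        return theta if np.dot(v, d) < 0 else theta + np.pi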



Continual Orientation Tracking

  • In every subsequent frame t+1, the directed finger orientation in the previous frame (t) is used as the cue to disambiguate the current undirected finger orientation (t+1)
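A sketch of that frame-to-frame rule: of the two candidates θ and θ+π, keep the one with the smaller angular gap to the previous directed orientation (radians assumed).

    import math

    def track_orientation(prev_directed, undirected):
        """Disambiguate the current undirected orientation using frame t."""
        def gap(a, b):   # smallest absolute angular difference
            return abs((a - b + math.pi) % (2 * math.pi) - math.pi)
        return min((undirected, undirected + math.pi),
                   key=lambda c: gap(c, prev_directed))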



PERFORMANCE EVALUATION

  • Goal

    • Assess the stability and precision of our finger orientation detection algorithm in determining the orientation of static and moving fingers.

  • Task

    • Four tasks, each examining a different aspect of the algorithm.

    • No visual feedback about the finger orientation detected by the algorithm was given.

    • As a result, participants had to rely completely on their subjective perception of finger orientation.



  • Apparatus

    • Direct-touch tabletop surface based on FTIR.

    • A camera with a resolution of 640×480 pixels at a capture rate of 30 fps.

  • Participants

    • Eight volunteers



  • Task 1: Static Orientation Stability.

  • Task 2: Static Orientation Precision (165°, 150°, 135°, and 120°)

  • Task 3: Dynamic Orientation Precision.

  • Task 4 : Involuntary Position Variation in Rotation.



Results

  • Algorithm is very stable.

  • The average variation range during each finger-dwelling period is 0.59° (std. dev. = 0.15°); in practice we can ignore finger orientation changes of less than 1°.

  • Our algorithm provides a static precision of approximately 5°. Across the complete 360° orientation range, this gives 36 usable orientation levels.

  • The average dynamic orientation error (absolute value) is 14.35° (std. dev. = 9.53°).

  • The average position variation during finger rotation is approximately 2.00 mm.



  • For all tasks, the disambiguation algorithm generated 13 errors in total.

  • This resulted in a success rate of 96.7% (384 out of 397 trials), indicating good performance of the algorithm.



Finger Orientation Design Impact

  • Interactions Using Finger Orientation

    • Enhancing Target Acquisition

    • Orientation-Sensitive Widgets

  • Inferences from Orientation Information

    • Estimating Occlusion Region

    • Inferring User Position

    • Relationship between Multiple Fingers

    • Enabling Orientation-Invariant Input



Interactions Using Finger Orientation

  • Enhancing Target Acquisition

    • Directed Bubble Cursor

      • applies different multiplying weights to target distances based on each target's azimuth relative to the finger center and orientation.
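A sketch of such a weighting; the slide only says the weights depend on relative azimuth, so the linear ramp and the 0.5/2.0 endpoint weights here are illustrative choices. A directed bubble cursor would then select the target with the smallest effective distance.

    import math

    def effective_distance(finger, theta, target, ahead_w=0.5, behind_w=2.0):
        """Shrink distances to targets the finger points toward, grow the rest."""
        dx, dy = target[0] - finger[0], target[1] - finger[1]
        dist = math.hypot(dx, dy)
        # Angular gap between the target's azimuth and the finger orientation.
        gap = abs((math.atan2(dy, dx) - theta + math.pi) % (2 * math.pi) - math.pi)
        w = ahead_w + (behind_w - ahead_w) * (gap / math.pi)   # illustrative ramp
        return dist * w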



Interactions Using Finger Orientation

  • Enhancing Target Acquisition

    • Aim and grab



Interactions Using Finger Orientation

  • Orientation-Sensitive Widgets



Inferences From Finger Orientation

  • Estimating Occlusion Region

    • Occlusion region: a circular sector opposite to the finger orientation, with its vertex at the center of the fingertip (x, y)

    • Central angle of approximately 60°
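A sketch of the corresponding hit test, treating the sector as unbounded in radius since the slides give only the vertex and the ~60° central angle.

    import math

    def is_occluded(point, fingertip, theta, central_angle_deg=60.0):
        """True if 'point' lies in the sector opposite the finger orientation."""
        axis = theta + math.pi        # sector axis points back toward the hand
        ang = math.atan2(point[1] - fingertip[1], point[0] - fingertip[0])
        gap = abs((ang - axis + math.pi) % (2 * math.pi) - math.pi)
        return gap <= math.radians(central_angle_deg) / 2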



Inferences From Finger Orientation

  • Inferring User Position

    • The user sits along either one of the two long sides of the tabletop.

    • Knowing the orientation of the finger touch, we can infer that the operating user is sitting at the side opposite to the finger orientation.



Inferences From Finger Orientation

  • Relationship between Multiple Fingers

    • Exploit this information to infer the relationship between multiple fingers on the surface.

    • This location is usually on the side opposite to the directions pointed by all fingers, and within a reasonable distance of the fingertip positions



Inferences From Finger Orientation

  • We calculate the intersection point I of the two straight lines aligned with the fingers' positions and orientations.
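This reduces to a 2×2 linear solve; a sketch with NumPy, taking positions as (x, y) pairs and orientations in radians:

    import numpy as np

    def finger_intersection(p1, theta1, p2, theta2):
        """Intersect the lines through each fingertip along its orientation.
        Returns the point I, or None for (nearly) parallel fingers."""
        d1 = np.array([np.cos(theta1), np.sin(theta1)])
        d2 = np.array([np.cos(theta2), np.sin(theta2)])
        A = np.column_stack((d1, -d2))        # solve p1 + t1*d1 = p2 + t2*d2
        if abs(np.linalg.det(A)) < 1e-9:
            return None
        t1, _ = np.linalg.solve(A, np.asarray(p2, float) - np.asarray(p1, float))
        return np.asarray(p1, float) + t1 * d1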



Inferences From Finger Orientation

  • Enabling Orientation-Invariant Input

    • The orientation of the input gesture can be normalized by a compensated rotation determined by the average finger orientation while performing the gesture.
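A sketch of that compensated rotation. Rotating about the gesture centroid and using a circular mean of the orientations are our assumptions; the slide specifies neither.

    import numpy as np

    def normalize_gesture(points, orientations):
        """Rotate a gesture by minus the average finger orientation so the
        stroke shape no longer depends on where the user stands."""
        pts = np.asarray(points, float)
        ori = np.asarray(orientations, float)
        mean_theta = np.arctan2(np.sin(ori).mean(), np.cos(ori).mean())
        c, s = np.cos(-mean_theta), np.sin(-mean_theta)
        R = np.array([[c, -s], [s, c]])
        centroid = pts.mean(axis=0)
        return (pts - centroid) @ R.T + centroid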



  • Multi-finger mouse emulation

    • Matejka et al. presented SDMouse

    • By considering the orientation of the index finger, we can unambiguously associate fingers with buttons located in a reachable location, regardless of the user's position



Algorithm Limitations

  • Assumes an oblique touch

  • The orientation disambiguation step relies on the finger center displacement during the finger landing process and assumes that this displacement is caused solely by the deformation of the finger.

  • Applies to all fingers except the thumb



Technology Compatibility

  • Variety of other sensing technologies

    • Capacitance-based sensing

    • Embedded optical sensor arrays

    • FTIR-based devices


  • 1) Touch and multi-touch interaction moves user interfaces beyond the WIMP paradigm




  • 3) Three NUI qualities: seamless interaction, invisibility, and direct manipulation



  • 4


Thanks for your attention!

  • Phone: 13700600260

  • QQ: 1946899

  • Skype: cnwangfeng

  • E-mail: [email protected]

