
User-Centric Design of a Vision System for Interactive Applications

User-Centric Design of a Vision System for Interactive Applications. Stanislaw Borkowski, Julien Letessier, François Bérard, and James L. Crowley. ICVS’06, New York, NY, USA, January 5, 2006. PRIMA group (GRAVIR lab, INRIA).



  1. User-Centric Design of a Vision System for Interactive Applications • Stanislaw Borkowski, Julien Letessier, François Bérard, and James L. Crowley • ICVS’06, New York, NY, USA, January 5, 2006

  2. Academic context • PRIMA group (GRAVIR lab, INRIA) • "Perception, Recognition and Integration for Interactive Environments" • IIHM group (CLIPS lab, Univ. Grenoble) • "Engineering in Human-Computer Interactions"

  3. Outline • Context: augmented surfaces • User-centric approach in vision systems • User-centric requirements • Implementation • SPODs, VEIL, Support services • Conclusions & Future Work

  4. Context: augmented surfaces • Interacting with projected images... • direct manipulation • user collaboration • mobility • ...is not realistic today • limited, controlled conditions • operator requirement • software integration issues (credit: F. Bérard, J. Letessier)

  5. Objectives • Propose a client-centric approach • design of perceptive input systems • two classes of clients: • end users realize an interaction task • developers create an interactive application • Application: design an input system • address simple augmented surfaces • feature vision-based, WIMP-like widgets (e.g. press-buttons) • achieve the usability of a physical input device

  6. Approach Overview • Top-down design • determine client requirements • consequences of HCI and SOA requirements • user-centric / developer-centric • functional / non-functional • Service-oriented • definition: a service adds value to information • an SOA is a collection of communicating services

  7. Developer requirements • Abstraction: be relevant • make computer vision invisible • generalize the input • Isolation: allow integration • permit service distribution • support remote access to services • offer code reuse • Contract: offer quality of service • specify usage conditions • determine service latency, precision, etc.

  8. End-user requirements • Typical of "real-time" interaction • Latency limits • strict bound: 50 ms for tightly coupled interaction • relaxed bound: 1 s for monitoring applications • Autonomy • ideally, no setup or maintenance • in practice, minimize task disruption • Reliability / predictability • either real-time or unusable • reproducible user experience

  9. Pragmatic approach • Black-box services • BIP (Basic Interconnection Protocol) • BIP implementation ≈ SOA middleware • service-to-service and service-to-application communication • goal 1: performance • connection-oriented (TCP-based) • low latency (UDP extensions) • goal 2: easy integration • service discovery (standards-based) • implementations provided (C++, Java, Tcl) • interoperability in ≤ 100 lines of code
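The connection-oriented, service-to-service communication style described above can be illustrated with plain TCP sockets. This is a hypothetical sketch, not the real BIP middleware or its API: the `run_echo_service` helper and the message format are invented for illustration.

```python
import socket
import threading

def run_echo_service(host="127.0.0.1", port=0):
    """Start a minimal TCP service that echoes each message back
    (a stand-in for a black-box service reachable over BIP)."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))  # port=0: let the OS pick a free port
    srv.listen(1)

    def serve():
        conn, _ = srv.accept()
        with conn:
            while data := conn.recv(1024):
                conn.sendall(data)  # echo the event back to the client
        srv.close()

    threading.Thread(target=serve, daemon=True).start()
    return srv.getsockname()  # (host, actual_port)

# Client side: a consuming service connects and exchanges one event.
host, port = run_echo_service()
cli = socket.create_connection((host, port))
cli.sendall(b"button.pressed")
reply = cli.recv(1024)
cli.close()
```

A real SOA middleware would add service discovery and typed messages on top of such a connection; the sketch only shows the connection-oriented transport the slide mentions.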

  10. Our approach • Abstraction, Isolation: use BIP • advice to service developers • Contract: nothing enforced • recommend evaluation of HCI-centric criteria • Common ground • allows creation of SOA-based prototypes

  11. Interactive widgets projected on a portable display surface

  12. Luminance-based button widget • S. Borkowski, J. Letessier, and J. L. Crowley. Spatial Control of Interactive Surfaces in an Augmented Environment. In Proceedings of EHCI’04. Springer, 2004.

  13. Touch detection • Locate the widget in the camera image • Calculate the mean luminance over the widget • Update the widget state
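The three steps above can be sketched in a few lines of NumPy. The bounding-box representation, the `baseline` value, and the threshold are assumptions for illustration; in the actual system the widget is located via the calibration described later.

```python
import numpy as np

def widget_mean_luminance(frame, bbox):
    """Mean luminance inside the widget's bounding box (x, y, w, h)."""
    x, y, w, h = bbox
    return frame[y:y + h, x:x + w].mean()

def update_button_state(frame, bbox, baseline, threshold=30.0):
    """A hand over the projected button darkens it: report 'pressed'
    when mean luminance drops well below the unoccluded baseline."""
    return bool(baseline - widget_mean_luminance(frame, bbox) > threshold)

# Synthetic 8-bit grayscale frame with a bright projected button.
frame = np.full((64, 64), 200, dtype=np.uint8)
bbox = (10, 10, 10, 10)
baseline = widget_mean_luminance(frame, bbox)   # unoccluded reference
frame[10:20, 10:20] = 60                        # a finger occludes the widget
pressed = update_button_state(frame, bbox, baseline)
```

A fixed threshold is the simplest possible rule; the following slides show why a single luminance average is not robust to clutter on the surface.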

  14. Robustness to clutter

  15. Robustness to clutter

  16. Assembling occlusion detectors

  17. Assembling occlusion detectors

  18. Striplet – the occlusion detector

  19. Striplet – the occlusion detector

  20. Striplet – the occlusion detector

  21. Striplet-based SPOD • SPOD – Simple-Pattern Occlusion Detector
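One plausible way a SPOD could assemble binary occlusion events from several striplets is sketched below. The "all inner striplets occluded, all guard striplets clear" rule is a hypothetical illustration of how combining detectors yields robustness to clutter, not the paper's exact logic.

```python
def spod_state(inner_occluded, guard_occluded):
    """Combine binary striplet occlusion events into one widget event.
    Hypothetical rule: trigger only when every inner striplet is
    occluded while the surrounding guard striplets stay clear, which
    rejects large objects (clutter) that cover the whole area."""
    return all(inner_occluded) and not any(guard_occluded)

# A fingertip press covers only the inner striplets -> triggers.
press = spod_state([True, True], [False, False, False])
# A sheet of paper covers everything -> rejected as clutter.
clutter = spod_state([True, True], [True, True, True])
```

The appeal of this pattern-based assembly is that each striplet stays a cheap, local luminance test, while the combination logic encodes the widget's spatial semantics.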

  22. Striplet-based button

  23. Striplet-based slider

  24. VEIL (figure: software architecture — client application, GUI rendering, calibration, camera, GUI, VEIL, SPOD software components, Striplets Engine)

  25. VEIL – Vision Events Interpretation Layer • Inputs: widget coordinates; scale and UI-to-camera mapping matrix; striplet occlusion events • Outputs: interaction events; striplet coordinates


  28. Striplets Engine Service • Inputs: striplet UI coordinates; UI-to-camera mapping matrix; images from the camera service • Outputs: occlusion events
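Applying the "UI to camera mapping matrix" can be sketched as a standard 3×3 homography transform from UI coordinates to camera-image coordinates. The function name and the translation-only example matrix are illustrative assumptions; the real matrix comes from the calibration service.

```python
import numpy as np

def ui_to_camera(points_ui, H):
    """Map 2-D UI coordinates into camera-image coordinates with a
    3x3 homography H (the UI-to-camera mapping matrix)."""
    pts = np.asarray(points_ui, dtype=float)
    ones = np.ones((len(pts), 1))
    homog = np.hstack([pts, ones]) @ H.T   # lift to homogeneous coordinates
    return homog[:, :2] / homog[:, 2:3]    # perspective divide

# Translation-only homography: shift the UI origin by (5, 7) pixels.
H = np.array([[1.0, 0.0, 5.0],
              [0.0, 1.0, 7.0],
              [0.0, 0.0, 1.0]])
mapped = ui_to_camera([[0, 0], [10, 20]], H)   # → [[5, 7], [15, 27]]
```

Because the engine only needs each striplet's pixel footprint in the camera image, mapping the striplet endpoints through this transform once per calibration is enough.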


  32. SPOD-based calculator Video available at: http://www-prima.inrialpes.fr

  33. Conclusions • We have presented • a service-oriented approach • an implementation • Future work • different detector types • a more intelligent VEIL • integration with GML

  34. Thank you for your attention

