
A Systems Architecture for Ubiquitous Video


Presentation Transcript


  1. A Systems Architecture for Ubiquitous Video Neil J. McCurdy and William G. Griswold Mobisys, 2005 Presented by Sangjae Lee

  2. One-line Comment • The authors argue that ubiquitous video systems will be essential in the future • How do we build such systems? • An abstraction of infinitely many cameras • The introduction of a virtual-space concept • How to adapt it to ubiquitous environments?

  3. Ubiquitous Video (I) • Ubiquitous video: “the walls have eyes”, i.e. wireless, networked video cameras • It is inevitable in the future • However, we do not have to wait for the future • Ubiquitous video streams can be built with today’s technology

  4. Ubiquitous Video (II) • Entering dangerous, restricted, or remote sites with head-mounted cameras • Commanders can navigate through the remote environment • Example scenarios • Police Special Weapons and Tactics (SWAT) teams • Hazardous materials (HazMat) response • Police monitoring

  5. Statement of the Problem • Design ubiquitous video systems that manage the incoming streams • It is challenging: conditions in the wild • Live, real-time access; uncalibrated cameras; varying lighting conditions; etc. • A naïve approach • Display the video on an array of monitors • Ideal solution • Infinite cameras in the field • Allow the user to move through them seamlessly

  6. Solution Approach • Practical solution • Create the illusion of the ideal system • Operate under the constraints imposed by the real environment • The RealityFlythrough abstraction • Stitch the multiple video streams together into a single scene • Non-trivial to construct • The limited number of cameras • Mobility (position, orientation)

  7. RealityFlythrough - Abstraction

  8. RealityFlythrough - The virtual cameras • Cameras project their images onto a virtual wall (see the projection sketch below)
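
Purely as an illustration of the virtual-wall idea (none of this code is from the paper; the pose conventions, the placement distance, and the use of NumPy are assumptions), the following sketch computes where a camera's current frame would be placed as a textured quad in the virtual space:

```python
# Hypothetical sketch: place a camera's image on a "virtual wall", a textured
# quad positioned a fixed distance in front of the camera's sensed pose.
import math
import numpy as np

def wall_corners(position, yaw_deg, pitch_deg, hfov_deg, aspect, distance=5.0):
    """Return the four world-space corners of the quad that the camera's
    current frame is texture-mapped onto.

    position   : (x, y, z) of the camera (e.g. from GPS plus altitude)
    yaw, pitch : orientation from the compass/tilt sensors, in degrees
    hfov_deg   : horizontal field of view of the camera
    aspect     : image width / height
    distance   : how far in front of the camera the wall is placed (assumed)
    """
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    # Forward, right, and up vectors of the camera in world coordinates.
    forward = np.array([math.cos(pitch) * math.sin(yaw),
                        math.cos(pitch) * math.cos(yaw),
                        math.sin(pitch)])
    right = np.array([math.cos(yaw), -math.sin(yaw), 0.0])
    up = np.cross(right, forward)
    # Half extents of the wall, derived from the field of view.
    half_w = distance * math.tan(math.radians(hfov_deg) / 2.0)
    half_h = half_w / aspect
    center = np.asarray(position, dtype=float) + distance * forward
    return [center - half_w * right - half_h * up,
            center + half_w * right - half_h * up,
            center + half_w * right + half_h * up,
            center - half_w * right + half_h * up]
```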

  9. RealityFlythrough

  10. System overview • How might such a system be built? • We need • Cameras → an image capture component • Location sensors → a sensor capture component • A stream combiner • Sensor data needs to be combined with the appropriate video frame (see the pairing sketch below) • A Multipoint Control Unit • The RealityFlythrough Engine
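
A minimal sketch of the stream-combining step, assuming timestamped frames and sensor readings; the SensorReading fields and the 500 ms tolerance are illustrative assumptions, not the paper's format:

```python
# Hypothetical stream combiner: tag each video frame with the sensor reading
# whose timestamp is closest, dropping frames that have no nearby reading.
import bisect
from dataclasses import dataclass

@dataclass
class SensorReading:
    t: float         # capture time in seconds
    lat: float
    lon: float
    heading: float   # compass heading in degrees

def combine(frames, readings):
    """frames: time-ordered list of (timestamp, frame) pairs.
    readings: time-ordered list of SensorReading.
    Yields (frame, reading) pairs for frames with a close-enough reading."""
    times = [r.t for r in readings]
    for t, frame in frames:
        i = bisect.bisect_left(times, t)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(readings)]
        if not candidates:
            continue
        best = min(candidates, key=lambda j: abs(readings[j].t - t))
        if abs(readings[best].t - t) < 0.5:   # assumed tolerance: 500 ms
            yield frame, readings[best]
```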

  11. RFT Engine (I) • Deciding which images to display at any point in time

  12. RFT Engine (II) • Still Image Generator • Produces and manages the still images generated from the live camera feeds • Transition Planner • Determines the path that will be taken to the desired destination • Chooses the images that will be displayed along that path • Transition Executer • Actually moves the user along the chosen path • Camera Repository • The store for all known cameras (a structural sketch follows below)
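
As a structural sketch only: the class names below follow the slide, but the method names, the freeze() helper, and the trivial plan are assumptions, not the paper's implementation.

```python
# Hypothetical skeleton of how the four RFT engine components could fit together.
class Camera:
    """A camera pose plus its most recent frame; live feeds can be frozen."""
    def __init__(self, pose, frame, live=True):
        self.pose, self.frame, self.live = pose, frame, live

    def freeze(self):
        # Snapshot the current frame as a still-image camera (assumed helper).
        return Camera(self.pose, self.frame, live=False)

class CameraRepository:
    """The store for all known cameras, live and still."""
    def __init__(self):
        self.cameras = []

    def add(self, cam):
        self.cameras.append(cam)

class StillImageGenerator:
    """Produces still-image cameras from the live feeds and registers them."""
    def __init__(self, repo):
        self.repo = repo

    def snapshot(self, live_cam):
        self.repo.add(live_cam.freeze())

class TransitionPlanner:
    """Chooses the path to the destination and the images shown along it."""
    def __init__(self, repo):
        self.repo = repo

    def plan(self, current_pose, destination):
        # Simplest possible plan: jump straight to the destination's image.
        # The real planner selects intermediate images along the way.
        return [destination]

class TransitionExecuter:
    """Actually moves the user along the chosen path."""
    def execute(self, path, render):
        for cam in path:
            render(cam)
```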

  13. Design pattern (environment state) • Environment state model (virtual cameras) • Open arrow: inheritance • Open diamond: a reference • Filled-in diamond: ownership (illustrated in code below)
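
The legend describes a UML class diagram. As an illustration only, here is how the three relationship kinds could look in code; the concrete class names are assumptions, not the paper's:

```python
# Illustration of the diagram legend: inheritance, reference, and ownership.
class Camera:                        # base class
    def __init__(self, pose):
        self.pose = pose

class VirtualCamera(Camera):         # open arrow: VirtualCamera inherits from Camera
    pass

class EnvironmentState:
    def __init__(self, user_view: VirtualCamera):
        self.cameras = []            # filled-in diamond: ownership; the state
                                     # creates, holds, and destroys these objects
        self.user_view = user_view   # open diamond: a reference to an object
                                     # owned elsewhere (the user's viewpoint)
```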

  14. Design pattern (view) • A virtual camera may need to be rendered from multiple physical cameras • Alpha blending combines their projected images (see the sketch below)
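
A minimal sketch of the alpha blend, assuming the two source images have already been projected into the virtual camera's view; the ramping of alpha during a transition is an assumption for illustration:

```python
# Blend two projected images; alpha is the weight of img_a in [0, 1].
import numpy as np

def alpha_blend(img_a, img_b, alpha):
    """img_a, img_b: HxWx3 uint8 images. During a transition, alpha would
    ramp from 1.0 (all img_a) down to 0.0 (all img_b)."""
    a = img_a.astype(np.float32)
    b = img_b.astype(np.float32)
    out = alpha * a + (1.0 - alpha) * b
    return out.clip(0, 255).astype(np.uint8)
```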

  15. The bird's-eye view

  16. Ubiquitous environments • Consider a typical case • The user wishes to move to a live camera • A naïve approach • Determine the location and orientation of the live camera • Compute the optimal trajectory to the target • Determine the images to be shown along the path • It does not work! • In a ubiquitous video environment • The destination camera may have changed its position/orientation by the time the plan has been computed and executed • Wrong destination • The path may no longer be optimal

  17. Ubiquitous environments - A dynamic path • The destination is now a moving target • The transition planner looks ahead some interval • It determines the best image to display at that time (a look-ahead sketch follows below)
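
A hedged sketch of the look-ahead idea: extrapolate where the destination camera will be a short interval from now and pick the stored image closest to that predicted position. The Camera fields, the linear-motion prediction, and the one-second interval are assumptions, not the paper's algorithm:

```python
# One planning tick of a dynamic path toward a moving destination camera.
import math
from dataclasses import dataclass

@dataclass
class Camera:
    x: float
    y: float
    heading: float          # degrees
    vx: float = 0.0         # estimated velocity (live cameras move)
    vy: float = 0.0

    def predicted_position(self, dt):
        # Assumed linear-motion model for the look-ahead.
        return (self.x + self.vx * dt, self.y + self.vy * dt)

def best_image(cameras, target_xy):
    """Choose the known camera (live or still) closest to the predicted target."""
    return min(cameras,
               key=lambda c: math.hypot(c.x - target_xy[0], c.y - target_xy[1]))

def plan_step(destination, cameras, lookahead=1.0):
    """Look ahead `lookahead` seconds and pick the image to display next;
    called repeatedly because the destination keeps moving."""
    target = destination.predicted_position(lookahead)
    return best_image(cameras, target)
```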

  18. Still Image Generation • Key to the success of the infinite-camera abstraction is the presence of sufficient cameras • To handle this problem • Take snapshots of the live video • Generate additional cameras from these snapshots • They represent still images, a static image source • The use of still imagery • Helps achieve the abstraction of infinite camera coverage • But it is imprecise (the images are stale) • Options • Never show older images • Make older images look different (sepia tone, sketched below)
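
A minimal sketch of the "older images look different" option, assuming a sepia tone whose strength grows with the age of the snapshot; the age-to-strength mapping is an illustrative assumption:

```python
# Shift a still-image snapshot toward sepia as it ages, so stale imagery is
# visibly distinguishable from live video.
import numpy as np

SEPIA = np.array([[0.393, 0.769, 0.189],
                  [0.349, 0.686, 0.168],
                  [0.272, 0.534, 0.131]])

def age_image(rgb, age_seconds, full_sepia_after=60.0):
    """rgb: HxWx3 uint8 snapshot. Returns a copy toned toward sepia with age."""
    strength = min(age_seconds / full_sepia_after, 1.0)
    toned = rgb.astype(np.float32) @ SEPIA.T
    out = (1.0 - strength) * rgb.astype(np.float32) + strength * toned
    return out.clip(0, 255).astype(np.uint8)
```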

  19. Evaluation (Effectiveness of the Abstraction) • An experiment at the campus food court • Too many images were being presented, causing disorientation • GPS accuracy was very low • After identifying these problems, adjustments were made • Reduce image overload and excessive movement; filter on location accuracy (sketched below) • Positive comments from users • “Let’s try one” • “That was pretty nice” • “It’s pretty accurate” • “That was kind of cool”
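
A small sketch of what location-accuracy filtering could look like; the threshold and the field names are assumptions, not the system's actual parameters:

```python
# Ignore GPS fixes whose reported error radius is too large, so poorly
# localized cameras are not placed (and shown) in the wrong spot.
def filter_fixes(fixes, max_error_m=10.0):
    """fixes: iterable of dicts like {"lat": ..., "lon": ..., "error_m": ...}.
    Keep only fixes whose estimated error is within max_error_m metres."""
    return [f for f in fixes if f["error_m"] <= max_error_m]
```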

  20. Evaluation (System Performance) • Bottleneck on the server

  21. Future Modifications • A better high-level abstraction • Sound • Scale to multiple views with multiple servers

  22. Conclusion • A system to harness ubiquitous video is designed • With a few live cameras, it provides the abstraction of infinite camera coverage • The virtual camera is introduced • Still images are captured automatically • A dynamic path is used

  23. Critique (I) • Strong points: • Their application, ubiquitous video, is very fresh • The authors design a whole system • The system consists of several components • They address the relationships between these components • They also define several problems themselves and propose solutions for them • Problems arising from the ubiquitous environment • They implemented the system and experimented in the real world • It is very difficult to run a system in the real world.

  24. Critique (II) • Weak points • The experimental measurement is poor • They need to gather data about the user comments and analyze it statistically • No consideration of the capabilities of the mobile devices; the case is too idealized • Camera resolution • Computational power • The bandwidth of the wireless network • RFT’s outputs are somewhat messy • Available/unavailable images • Overlapping images • The server bottleneck is a serious obstacle to industrial adoption

  25. Critique (III) • New ideas • A few fixed cameras would perform better • If some cameras are fixed or move only slightly, the abstraction will be better • If the number of cameras increases, the performance increases • Scalability problems • Distributed servers • One of the problems is the bottleneck on the server • Make the server distributed: a core server gathers outputs from distributed servers • Separate the renderer from the transition planner • Then we should seriously consider the bandwidth of the wireless network • Video transmission in a ubiquitous environment • Low batteries, low computational power.
