
Client-Server Methods for Future Cinema: Compositing Multiple 4K Cinema Streams



Presentation Transcript


  1. Client-Server Methods for Future Cinema: Compositing Multiple 4K Cinema Streams

  2. Scalable City: Massive-Scale Virtual Worlds • Massively multiplayer continuous world • Hundreds of thousands of interactive objects • Large aggregate bandwidth requirements • Challenges/Issues • Originally developed as a multicore-accelerated, single-user immersive system, extended into a complex virtual world • Creates a compute and conceptual ecosystem
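
A rough sense of what "large aggregate bandwidth requirements" means at this scale can be sketched with a back-of-the-envelope estimate; the object count, update size, and update rate below are illustrative assumptions rather than Scalable City measurements.

```python
# Back-of-the-envelope estimate of aggregate state-update bandwidth.
# All parameter values are illustrative assumptions, not Scalable City figures.

def aggregate_bandwidth_gbps(num_objects: int,
                             bytes_per_update: int,
                             updates_per_second: float) -> float:
    """Aggregate bandwidth in gigabits per second for periodic object updates."""
    return num_objects * bytes_per_update * updates_per_second * 8 / 1e9

if __name__ == "__main__":
    # Assumed: 300,000 objects, 64-byte updates, 10 updates per second each.
    print(f"{aggregate_bandwidth_gbps(300_000, 64, 10):.2f} Gb/s")  # ~1.54 Gb/s
```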

  3. The environment is extensible, with endless variations computed at run time.

  4. Development Server Framework, 2/2011 • 8 HS22 blades – 16 Xeons, 96 cores • 4 QS20 blades – 8 Cell CPUs • 2 QS22 blades – 4 Cell CPUs • 4-way Xeon server – 32 cores • NVIDIA Tesla accelerator – 4 GPUs on a Linux host • 3 x 10 Gb interfaces to the compute accelerators • Many clients • IBM z10 mainframe at the San Diego Supercomputer Center – 2 IFLs with 128 MB RAM, z/VM virtual OS manager with Linux guests, 6 TB storage, fast local storage (15K disks), 4 SR and 2 LR 10 Gb Ethernet interfaces, 10 Gb interface to the Internet

  5. Future Cinema – Virtual World Director • Sheldon Brown, UCSD Site Director, CHMPR • Kristen Kho, Programmer, UCSD Experimental Game Lab

  6. Virtual World Director • Allows the virtual world to serve as a platform for creating cinema • Enables very flexible methods for creating cinema • Allows cinematic production to take place in a broadly distributed, many-user environment • Allows cinematic data to be derived from the experiences of many users

  7. Interface – Machinima Menu

  8. Interface - Editing

  9. 4K Cinema Generation • Automate re-rendering of data into 3D 4K frames • Sony ZEGO BCU-100 Computing Unit • Cell/B.E. processor and RSX technology • Prototype system with 4 HD-SDI inputs/outputs • Used to process real-time 4K video • Procedural 4K animations (à la the Galileo films) • Initial experiments used the BCU-100 as a 4K video player: • Not fast enough to load 4K frames from disk and maintain a reasonable frame rate • Must process input and immediately send it to output • Create procedural 4K movies from data sources, e.g. bitmap compositing, algorithms, etc.
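
As a minimal sketch of the bitmap-compositing route to procedural 4K frames, the example below alpha-composites one generated RGBA frame over another with NumPy. The 8-bit RGBA layout and the gradient content are assumptions for illustration, not the formats the BCU-100 pipeline actually uses.

```python
# "Over" compositing of two 4K RGBA frames with NumPy (illustrative only).
import numpy as np

W, H = 3840, 2160  # 4K resolution as quoted on slide 10

def composite_over(fg: np.ndarray, bg: np.ndarray) -> np.ndarray:
    """Alpha-composite foreground over an opaque background (both HxWx4 uint8)."""
    alpha = fg[..., 3:4].astype(np.float32) / 255.0
    out = fg[..., :3].astype(np.float32) * alpha \
        + bg[..., :3].astype(np.float32) * (1.0 - alpha)
    return out.astype(np.uint8)

# Procedural example: a 50%-transparent red gradient over a dark background.
bg = np.full((H, W, 4), 32, dtype=np.uint8)
fg = np.zeros((H, W, 4), dtype=np.uint8)
fg[..., 0] = np.linspace(0, 255, W, dtype=np.uint8)  # horizontal red ramp
fg[..., 3] = 128                                     # 50% alpha
frame = composite_over(fg, bg)                       # one composited 4K frame
```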

  10. Ultra-high-resolution projection technologies • 4K (3840 x 2160 pixel) displays today, 8K and beyond tomorrow • Digital Cinema Auditorium: ultra-high-resolution cinema, 3D sound, telepresence conferencing, stereo imaging • CineGrid: optical ultra-high-bandwidth digital cinema • Galileo Movies: 24 short 4K movies – Sheldon Brown, Kim Stanley Robinson

  11. Leveraging global lambda networks to improve performance for extremely high-bandwidth, time-dependent applications such as cinema, teleconferencing, and virtual reality.

  12. Stream Ultra-High-Fidelity Cinema on the Global Lambda Grid • Collaboration with the University of São Paulo, Mackenzie University, Keio University, Sony, Zaxel, IBM, and Intel

  13. Stream real-time uncompressed 4K to our compositing and annotation server • Extending the TritonSort* system • Hybrid disk/flash-memory storage array • 100s of terabytes • Multiple 10 Gb interfaces • *George Porter, Center for Networked Systems, UCSD
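
For a sense of scale, the sketch below estimates the data rate of a single uncompressed 4K stream; the resolution is taken from slide 10, while the 10-bit 4:2:2 sampling (20 bits/pixel) and the frame rates are assumed values.

```python
# Rough data-rate estimate for one uncompressed 4K stream (assumed parameters).

def stream_gbps(width: int, height: int, bits_per_pixel: int, fps: int) -> float:
    """Raw stream rate in gigabits per second."""
    return width * height * bits_per_pixel * fps / 1e9

# Assumed 10-bit 4:2:2 sampling -> 20 bits per pixel.
for fps in (24, 60):
    print(f"{fps} fps: {stream_gbps(3840, 2160, 20, fps):.2f} Gb/s")
# ~4 Gb/s at 24 fps and ~10 Gb/s at 60 fps, which is why multiple 10 Gb
# interfaces are needed once several streams are composited together.
```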

  14. Data-intensive Cinema • Data-intensive Cinema (DI-Cinema) Project • Encompasses video, scientific visualization, motion picture production, medical imagery, … • Higher resolution formats + 3D = big data challenge • 100s of TB to multiple PBs of data per project • Not just streaming • Subjected to increasing amount of computation • Media is the result of staged, pipeline-oriented computing

  15. Digital Cinema as a Computing Problem • Batch-oriented video computing: • Color adjustment • Digital watermarks • Compositing (join operation) • Format, resolution, and frame-rate adjustments • Rendering • Captioning • Transcoding • Requires high resolution, and thus large data sizes, at each of these stages as well as in the final product • [Diagram: a pipeline alternating I/O and compute stages, ending in streaming output]
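
The batch-oriented structure above, with compute stages separated by I/O, can be sketched as a simple staged pipeline; the stage bodies below are placeholders named after the slide's operations, not the production toolchain.

```python
# Sketch of staged, pipeline-oriented video computing: read -> compute stages -> write.
from typing import Callable, Iterable, Iterator

Frame = bytes  # stand-in for a decoded frame buffer

def color_adjust(frame: Frame) -> Frame:
    return frame  # placeholder compute stage

def composite(frame: Frame) -> Frame:
    return frame  # placeholder join with a second stream

def transcode(frame: Frame) -> Frame:
    return frame  # placeholder format/bit-rate conversion

def run_pipeline(frames: Iterable[Frame],
                 stages: list[Callable[[Frame], Frame]]) -> Iterator[Frame]:
    """Stream each frame through the compute stages in order."""
    for frame in frames:        # I/O: read
        for stage in stages:    # compute, compute, ...
            frame = stage(frame)
        yield frame             # I/O: write or stream out

# Usage sketch: output = run_pipeline(frame_source, [color_adjust, composite, transcode])
```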

  16. Data-intensive video is a problem today… • Netflix transcodes each of its movies 50+ times, once for each type of end-user device and resolution, relying on the cloud for scalability (Adrian Cockcroft, CTO, Netflix) • 4+ million “assets,” or pieces of content, must be orchestrated together to form a single project (“What’s up Prof? Current Issues in the Visual Effects & Post-Production Industry”, Leonardo, vol. 43, no. 1, Feb 2010) • On digital intermediate: “…the files are so massive they are difficult to display, edit or color correct”; “Real time playback…is extremely difficult on even the fastest RAIDs”; “This is a lot of data to push around…requiring proprietary software and hardware solutions” (Filmlook, Inc., 2008)

  17. …and a bigger problem tomorrow • [Image omitted; used with permission under the GNU Free Documentation License]

  18. 4K/stereo high-resolution cinema development with the same asset pipeline as the game environment

  19. The Calit2 and NSF OptIPuter projects allow for high-speed networks and ultra-high-fidelity environments • Calit2, UCSD • WAAG, Amsterdam

  20. Potential DI-Cinema system architecture • Focus is on supporting efficient computation • Supporting low-latency streaming • Supporting efficient integration with networked storage • [Architecture diagram: multi-stage, iterative computation (Themis), streaming-support software, software and hardware remote displays, NAS/SAN long-term storage]

  21. Storage for batch and low-latency access • Storage networking challenges • Giant gather/scatter problem • Read from ~100 disks and aggregate in-network at 60 fps (100+ MB/frame) • Disk seek latency is 15 ms, yet new frames must be loaded within ~16 ms
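
The arithmetic behind this challenge, using only the figures quoted on the slide (100 MB/frame, 60 fps, roughly 100 disks):

```python
# Gather/scatter arithmetic from the slide's figures.
frame_bytes = 100e6   # 100+ MB per frame
fps = 60
num_disks = 100

aggregate_gbps = frame_bytes * fps * 8 / 1e9         # total read bandwidth
per_disk_mb_s = frame_bytes / num_disks * fps / 1e6  # sustained rate per disk
deadline_ms = 1000 / fps                             # time budget per frame

print(f"aggregate: {aggregate_gbps:.0f} Gb/s")   # ~48 Gb/s
print(f"per disk:  {per_disk_mb_s:.0f} MB/s")    # ~60 MB/s sustained
print(f"deadline:  {deadline_ms:.1f} ms/frame")  # ~16.7 ms vs. a 15 ms seek
```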

  22. Potential New Industrial Connections • Technicolor: • Major provider of motion picture toolchain • Acquiring content, video storage, computing, viewing • Disney • Major source of content • Large computing infrastructure; large pipeline-oriented computing platform • Cisco • Datacenter networking • HP • Provider of computation and storage to motion picture industry • Carries out data-intensive computing research • Google • World’s largest video processing system • YouTube • Just announced support for 4K video

  23. Next Steps • Resolve server/compute system infrastructure • Develop a client interface mixing virtual-world 4K and camera-based 4K streams • Integrate the streaming storage system with the server/compute infrastructure
