
An Integrated System of 3D Motion Tracker and Spatialized Sound Synthesizer



  1. An Integrated System of 3D Motion Tracker and Spatialized Sound Synthesizer
  John Thompson (Music), Mary Li (ECE), Michael Quinn (ECE)
  University of California, Santa Barbara

  2. Goal
  • Develop an interactive music synthesis system while exploring tracking and surveillance technologies, spatial music composition strategies, and sound synthesis techniques

  3. Hardware & Software
  • Unibrain Fire-i cameras
  • PC running Windows XP
  • Apple PowerBook and G5
  • Intel's OpenCV libraries
  • Max/MSP/Jitter
  • SuperCollider

  4. Project Summary
  • 2D tracking
  • Camera calibration
  • 3D position calculation
  • Composition and sound synthesis

  5. 2D Tracking – Temporal Difference
  • Subtract the previous frame from the current frame to see what has changed.
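  A minimal sketch of this temporal difference step, assuming OpenCV's Python bindings (cv2) rather than the Intel OpenCV C libraries the project actually used:

```python
import cv2

cap = cv2.VideoCapture(0)              # any camera; the project used Unibrain Fire-i cameras
ok, prev = cap.read()
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Temporal difference: only pixels that changed between
    # consecutive frames survive in the difference image
    diff = cv2.absdiff(gray, prev)
    prev = gray
    cv2.imshow("temporal difference", diff)
    if cv2.waitKey(1) == 27:           # Esc to quit
        break
```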

  6. 2D Tracking – Background Subtraction
  • Develop a background model
  • Subtract the background from the current frame
  • Objects not in the model will show up in the difference
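  A sketch of one possible background model, an exponential running average; cv2 and NumPy are assumed here, and the project's actual model may have differed:

```python
import cv2
import numpy as np

def update_background(bg, frame, alpha=0.02):
    """Exponential running average: slow drift lets the model
    absorb gradual lighting changes while keeping moving objects out."""
    return (1 - alpha) * bg + alpha * frame

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
bg = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
    # Objects absent from the background model show up in the difference
    diff = np.abs(gray - bg)
    bg = update_background(bg, gray)
    cv2.imshow("foreground", diff.astype(np.uint8))
    if cv2.waitKey(1) == 27:
        break
```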

  7. 2D Tracking – Thresholding
  • Threshold values are chosen based on the variance of the background model.
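  A sketch of variance-based thresholding; the constant k is a hypothetical tuning parameter, not a value from the project:

```python
import numpy as np

def foreground_mask(diff, bg_variance, k=2.5):
    """Mark a pixel as foreground when its difference from the
    background exceeds k standard deviations of the model."""
    sigma = np.sqrt(bg_variance)
    return diff > k * sigma
```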

  8. 2D Tracking – Center of Mass
  • For now, we assume that only one object is being tracked, so the image center of mass approximates the object's center of mass.
  • The center of mass is then sent to the 3D section.
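  The centroid can be computed from image moments; a sketch assuming a binary foreground mask and cv2:

```python
import cv2

def center_of_mass(mask):
    """Centroid of the binary foreground mask via image moments.
    With a single tracked object, this approximates its center of mass."""
    m = cv2.moments(mask, binaryImage=True)
    if m["m00"] == 0:
        return None                    # nothing detected in this frame
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])
```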

  9. Camera Calibration
  • Purpose
    – A preparation for 3D estimation from 2D images
  • Methods
    – Matlab Camera Calibration Toolbox
    – Intel OpenCV calibration functions

  10. Camera Calibration – Intrinsic Parameters
  • Focal lengths: fx, fy
  • Principal point: px, py
  • Distortions: radial and tangential distortion coefficients
  • DirectShow filter runs under MS Windows

  11. Camera Calibration – Intrinsic Parameters
  • Defines pixel coordinate points with respect to the camera coordinate system: X_image = M_intr · X_camera
  • Matlab Camera Calibration Toolbox

  12. Camera Calibration – Extrinsic Parameters
  [Figure: left, center, and right camera views]
  • Defines camera coordinate points with respect to the world coordinate system: X_camera = M_extr · X_world
  • OpenCV calibration routine (based on the intrinsic parameters)
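  The two calibration matrices chain into a single world-to-pixel projection. A NumPy sketch with illustrative values (not the project's actual calibration results):

```python
import numpy as np

# Intrinsics: focal lengths fx, fy and principal point px, py (illustrative values)
fx, fy, px, py = 800.0, 800.0, 320.0, 240.0
M_intr = np.array([[fx, 0.0, px],
                   [0.0, fy, py],
                   [0.0, 0.0, 1.0]])

# Extrinsics: rotation R and translation t place world points in camera coordinates
R = np.eye(3)
t = np.array([[0.0], [0.0], [2.0]])
M_extr = np.hstack([R, t])             # 3x4: X_camera = M_extr @ X_world

X_world = np.array([[0.5], [0.2], [1.0], [1.0]])  # homogeneous world point
x = M_intr @ M_extr @ X_world                     # X_image = M_intr @ X_camera
u, v = (x[:2] / x[2]).ravel()                     # pixel coordinates
```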

  13. 3D Tracking – Methods
  • Obtain 2D motion centroid information
  • Epipolar geometry
  • Least squares
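  Given the calibrated projection matrices, the 3D position can be recovered from the per-camera centroids by linear least squares. A NumPy sketch in the standard direct-linear-transform style, not necessarily the exact routine used in the project:

```python
import numpy as np

def triangulate(projections, points2d):
    """Least-squares 3D point from 2D centroids in several calibrated views.
    projections: list of 3x4 camera matrices P = M_intr @ M_extr
    points2d:    list of (u, v) centroids, one per camera
    """
    rows = []
    for P, (u, v) in zip(projections, points2d):
        # Each view contributes two linear constraints on the world point
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.array(rows)
    # Homogeneous least-squares solution: right singular vector
    # associated with the smallest singular value
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]
```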

  14. 3D Tracking – Results
  [Figure: X_world, Y_world, Z_world tracking plots; floor plan of the visible space; 18-point tracking example]

  15. Tracking System Performance
  • Real-time average of 2.09 frames per second
  • System performance can be improved by:
    – Distributed computing: one PC per camera
    – More cameras
    – Improved background segmentation

  16. TransMedia Systems
  Trans-media systems exist as independent engines behind artistic manifestations in diverse media.
  • Input: In our Motion Tracking System project, a motion-tracking algorithm that follows objects within a sensor space serves as the principal component powering the trans-media system.
  • Transformation: In the middle layer, the data from the motion tracking system is interpreted and labeled. This data is then used to determine the activity and state of the sensor space.
  • Output: In the final stage of the trans-media system, specific media, such as sound, use the middle-layer data to inform their processes. The sound is projected into the sensor space. Interactivity is enhanced when the participants in the sensor space become aware of their relationship with the system.
  • Graphic notations and trans-media systems: John Cage, "Fontana Mix"

  17. Spatial Composition Strategies – Sonic Nodes
  • A system of nodes is laid out in the virtual space, comprising Generative Nodes and Transformative Nodes.
  • Each node has an activation space surrounding it. Tracked objects activate nodes at various levels depending on their measured distance from a node's center.
  • The nodes are in flux and adjust their positions over time to reflect the history of the space.
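  A sketch of how a node's activation level might scale with a tracked object's distance from its center; the linear falloff and all names here are assumptions, not the project's specification:

```python
from dataclasses import dataclass
import math

@dataclass
class Node:
    x: float
    y: float
    z: float
    radius: float        # extent of the activation space

    def activation(self, ox, oy, oz):
        """1.0 at the node's center, fading linearly to 0 at the
        edge of the activation space (hypothetical falloff curve)."""
        d = math.dist((self.x, self.y, self.z), (ox, oy, oz))
        return max(0.0, 1.0 - d / self.radius)
```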

  18. When a tracked object moves within the activation space of a particular node, the node executes its action.
  Figure 1: Nodes with various musical functions are represented by colored circles. Different paths create unique realizations of phrase-level material in the mobile form.

  19. Pitch Sets in the Chord Nodes
  [0 1 3 4] [0 1 5 7] [0 2 3 7] [0 2 4 6] [0 2 4 7] [0 2 4 8] [0 2 5 6] [0 2 5 7] [0 2 5 8] [0 2 6 7] [0 2 6 8] [0 3 5 6] [0 3 5 7] [0 4 5 8]
  Thirty-four chordNodes are scattered in the virtual space. Seventeen of the chordNodes contain a unique four-note pitch set, and fourteen of the seventeen sets are unique in their normal order. Although the pitch sets are diverse, they are closely knit in their makeup, which lends a unified quality to the pitched verticalities of the sonic space. As multiple users move throughout the space, the sonic material subtly shifts, melding the space into a cohesive flow. As tracked objects leave histories of their paths in the space, the pitch sets transform in response: the space adapts its pitched contents to the actions of its users.
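  For reference, normal order can be computed by rotating a pitch-class set to its most compact form. A sketch under one common tie-breaking convention, not taken from the project:

```python
def normal_order(pcs):
    """Most compact rotation of a pitch-class set (mod 12), the form
    used to compare the chordNodes' four-note sets. Ties are broken
    by packing intervals toward the bottom (one common convention)."""
    pcs = sorted(set(p % 12 for p in pcs))
    n = len(pcs)
    rotations = [pcs[i:] + [p + 12 for p in pcs[:i]] for i in range(n)]
    # Smallest outer interval first, then smaller intervals above the
    # first note as tie-breaks
    best = min(rotations,
               key=lambda r: [r[-1] - r[0]] + [p - r[0] for p in r[1:]])
    return [p % 12 for p in best]

# Example: the dominant-seventh set {0, 4, 7, 10} normalizes to [4, 7, 10, 0]
print(normal_order([0, 4, 7, 10]))
```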

  20. Sound Spatialization
  • The system outputs quadraphonic audio distributed to speakers surrounding the sensor space. The position of a sound within the sensor space is determined by the position of the tracked object (Figure 1).
  • Distance is simulated through the mix of direct and reverberant sound, dictated by the following formulas:
    direct sound amplitude = 1 / |zPosition|
    reverberant sound amplitude = 1 / sqrt(|zPosition|)
  [Figure 1: speakers surrounding the sensor space, with the sound positioned at the tracked object's location]
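  Those two formulas translate directly; a sketch where z_position stands for the tracked object's depth coordinate (the zero-guard is an added assumption):

```python
import math

def distance_mix(z_position):
    """Direct and reverberant amplitudes from the slide's formulas:
    direct = 1/|z|, reverb = 1/sqrt(|z|). The reverb level falls off
    more slowly, so distant sounds are proportionally more reverberant."""
    z = max(abs(z_position), 1e-6)     # guard against division by zero
    direct = 1.0 / z
    reverb = 1.0 / math.sqrt(z)
    return direct, reverb
```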

  21. Future Work
  • Improve the system by enabling the tracking of multiple objects as well as incorporating features such as shape, size, and color.
  • Improve integration with the musical synthesis system.

  22. Special Thanks
  • Professor B.S. Manjunath
  • Professor G. Legrady
  • Professor J. Kuchera-Morin
  • NSF IGERT Program
  • Fellow IGERTers

  23. Questions?
