Motion in Sound: Designing Sound for Interactive Dance Performance

Dr. Dan Hosken

Associate Professor of Music

California State University, Northridge

Presented at:

ATMI 2006

San Antonio, TX

September 16, 2006

Purpose
  • Present a somewhat simplified but useful approach to creating sound for the interactive dance medium
  • Facilitate collaboration between students of dance and students of music
Objectives:
  • Give an overview of the hardware and software components of a camera-based interactive dance/music system
  • Present a loose taxonomy of motion parameters and mapping types
  • Suggest some useful mappings between motion parameters and sound element parameters
  • Illustrate these mappings using examples of my recent work with the Palindrome IMPG
General System Overview
  • Camera trained on dancer(s) is connected to computer
  • Video Analysis Software abstracts motion data in realtime
  • Motion Data are passed to Sound Software
  • Sound Software maps incoming motion data to sound element parameters in realtime (a rough sketch of this data flow follows below)
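As a rough, hypothetical illustration of this chain (not part of the original slides), the Python sketch below stands in for the audio computer's job: take a frame of incoming motion data, rescale it, and hand it to a synthesis layer. The function names, parameter names, and ranges are all invented placeholders.

```python
# Minimal sketch of the analysis -> mapping -> sound pipeline.
# All names and ranges here are hypothetical placeholders.

def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly rescale a motion value into a sound-parameter range, with clamping."""
    t = (value - in_lo) / (in_hi - in_lo)
    t = max(0.0, min(1.0, t))
    return out_lo + t * (out_hi - out_lo)

def send_to_synth(parameter, value):
    """Placeholder for the realtime link to Max/MSP, PD, SuperCollider, etc."""
    print(f"{parameter} -> {value:.2f}")

def process_frame(motion):
    """motion: dict of motion parameters (0..1) arriving from the video analysis."""
    # Example mapping: overall 'dynamic' (amount of movement) drives loudness.
    send_to_synth("amplitude", scale(motion["dynamic"], 0.0, 1.0, 0.0, 0.8))

# Simulated frames standing in for realtime data from the video computer.
for frame in [{"dynamic": 0.1}, {"dynamic": 0.6}, {"dynamic": 0.9}]:
    process_frame(frame)
```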
Overview w/ bad clipart

[Diagram: video computer connected via ethernet to audio computer]

Sound Generation Software
  • Max/MSP (Cycling ‘74)
  • PD (Miller Puckette)—free!
  • SuperCollider (J. McCartney)—free!
  • Reaktor (Native Instruments)
  • …and any software that can receive data and produce sound in realtime
Video Analysis Software
  • EyeCon (Frieder Weiss)
  • EyesWeb (eyesweb.org)—free!
  • Jitter (Cycling ‘74)
    • SoftVNS (David Rokeby)
    • Cyclops (Eric Singer/Cycling ‘74)
    • TapTools (Electrotap)
    • cv.jit (Jean-Marc Pelletier)
    • Eyes (Rob Lovel)—free!
Objectives (redux):
  • Give an overview of the hardware and software components of a camera-based interactive dance/music system
  • Present a loose taxonomy of motion parameters and mapping types
  • Suggest some useful mappings between motion parameters and sound element parameters
  • Illustrate these mappings using examples of my recent work with the Palindrome IMPG
Definitions (1)
  • Motion Parameter: specified data abstracted from part or all of the video frame, e.g.,
    • Height
    • Width
    • Dynamic
  • Sound Element: a distinct, coherent sonic behavior created by one or more synthesis or processing techniques, e.g.,
    • A Low Drone created by FM Synthesis
    • Time-stretched text created by Granulation
    • Percussive patterns created by Sample Playback
Definitions (2)
  • Sound Element Parameter: a parameter of a synthesis/processing technique, e.g.,
    • Modulation Frequency of a simple FM pair
    • Grain Size of a granulated sound file
    • Ir/regularity of tempo in a rhythmic pattern
  • Mapping: the connection between a motion parameter and a sound element parameter (sketched in code below), e.g.,
    • Height → modulation frequency of FM
    • Width → grain size of granulated sound file
    • Dynamic → irregularity of tempo
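To make the mapping idea concrete, here is a small sketch (my own illustration, not from the presentation) that treats each of the example mappings above as a range-to-range transfer; the output ranges are invented for illustration only.

```python
# Each mapping rescales a normalized motion parameter (0..1) into a
# sound-element-parameter range. Ranges are illustrative guesses.

MAPPINGS = {
    # motion parameter: (sound element parameter, (low, high) output range)
    "height":  ("fm_mod_frequency_hz", (50.0, 800.0)),
    "width":   ("grain_size_ms",       (10.0, 250.0)),
    "dynamic": ("tempo_irregularity",  (0.0, 1.0)),
}

def apply_mappings(motion):
    """motion: dict of motion parameters, each already normalized to 0..1."""
    out = {}
    for name, value in motion.items():
        sound_param, (lo, hi) = MAPPINGS[name]
        v = max(0.0, min(1.0, value))          # clamp to the expected range
        out[sound_param] = lo + v * (hi - lo)  # simple linear transfer
    return out

print(apply_mappings({"height": 0.5, "width": 0.2, "dynamic": 0.9}))
```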
Definitions (3)
  • Scene: a group of motion parameters, sound elements, and mappings between them (one possible data structure is sketched below)
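One hypothetical way to represent such a grouping in code, reusing the linear-transfer idea from the previous sketch (the class and field names are mine, not from the slides):

```python
from dataclasses import dataclass, field

@dataclass
class Mapping:
    motion_param: str      # e.g. "height", "width", "dynamic"
    sound_param: str       # e.g. "fm_mod_frequency_hz"
    out_range: tuple       # (low, high) target range
    primary: bool = False  # marks the mapping that controls the dominant feature

@dataclass
class Scene:
    name: str
    mappings: list = field(default_factory=list)

    def evaluate(self, motion):
        """Turn one frame of normalized motion data into sound parameter values."""
        result = {}
        for m in self.mappings:
            v = max(0.0, min(1.0, motion.get(m.motion_param, 0.0)))
            lo, hi = m.out_range
            result[m.sound_param] = lo + v * (hi - lo)
        return result
```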
EyeCon Interface (1)

Field: can measure height or width or dynamic or…

Touchlines: detect crossing and position on line

EyeCon Interface (2)

Fields and lines are mapped to MIDI data (or OSC)

Sequencer steps through “scenes”
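For the OSC route, a receiver on the audio computer might look roughly like the sketch below. This assumes the third-party python-osc package; the address "/eyecon/field/1" and port 9000 are made-up stand-ins, not EyeCon's actual output format.

```python
# Hedged sketch of an OSC listener for incoming motion data.
# Assumes the third-party "python-osc" package; address and port are invented.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_field(address, value):
    # value: one field measurement (e.g. height, width, or dynamic)
    print(f"{address}: {value:.3f}")

dispatcher = Dispatcher()
dispatcher.map("/eyecon/field/1", on_field)

server = BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher)
server.serve_forever()  # blocks; each message would now drive the sound software
```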

Taxonomy of Motion Parameters
  • Body Parameters: position independent, “attached” to body
    • Height
    • Width
    • Dynamic
  • Stage Parameters: position dependent, “attached” to stage
    • Left-right position
    • Touchlines
    • Extremely Narrow Fields
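One way to see the distinction: body parameters can be computed from the dancer's silhouette alone, while stage parameters also depend on where that silhouette sits in the frame. The sketch below is my own illustration, assuming the video analysis supplies a simple bounding box; it is not how EyeCon actually reports its data.

```python
# Hypothetical derivation of body vs. stage parameters from a bounding
# box (x, y, w, h) of the dancer's silhouette in a frame of known width.

def body_parameters(box, prev_box=None):
    """Position-independent measures 'attached' to the body."""
    x, y, w, h = box
    params = {"height": h, "width": w}
    if prev_box is not None:
        # crude 'dynamic': how much the silhouette changed since the last frame
        px, py, pw, ph = prev_box
        params["dynamic"] = abs(x - px) + abs(y - py) + abs(w - pw) + abs(h - ph)
    return params

def stage_parameters(box, frame_width):
    """Position-dependent measures 'attached' to the stage."""
    x, y, w, h = box
    center_x = x + w / 2.0
    return {"left_right": center_x / frame_width}  # 0.0 = one edge, 1.0 = the other

box_now, box_before = (310, 80, 60, 170), (300, 82, 58, 168)
print(body_parameters(box_now, box_before))
print(stage_parameters(box_now, frame_width=640))
```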
Parameter Type Examples
  • Stage Parameter (position): Scene 3 from Brother-Sister Solo
    • Julia Eisele, dancer/choreographer
    • Stuttgart, June 2005
  • Body Parameter (Dynamic): Conversation
    • Robert Wechsler, dancer/choreographer
    • Julia Eisele, dancer
    • Stuttgart, June 2005
Primary/Secondary Mappings
  • Primary Mapping: controls the dominant sonic feature
  • Secondary Mapping: controls a subordinate sonic feature
  • Example: Scene 3 from Brother-Sister Solo (restated as a configuration sketch below)
    • Primary mapping: position → position in sound “landscape”
    • Secondary mapping: dynamic → disturbance of drone
    • Secondary mapping: width → loop size/speed of segment within sound file
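Using the hypothetical Scene/Mapping structure sketched earlier, the Scene 3 example might be written down like this; the sound parameter names and ranges are invented stand-ins, not values from the actual patch.

```python
# Hypothetical restatement of the Scene 3 mappings, reusing the Scene and
# Mapping classes from the earlier sketch. Names and ranges are placeholders.
scene3 = Scene(
    name="Brother-Sister Solo, Scene 3",
    mappings=[
        Mapping("position", "landscape_position", (0.0, 1.0), primary=True),
        Mapping("dynamic",  "drone_disturbance",  (0.0, 1.0)),
        Mapping("width",    "loop_size_s",        (0.1, 2.0)),
    ],
)
print(scene3.evaluate({"position": 0.7, "dynamic": 0.2, "width": 0.5}))
```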
Sound Element Mappings (1)
  • A Human Conversation (in progress)
    • Scene 7-8:
      • Dynamic → Granulated Text (playback rate)
    • Scene 9:
      • Dynamic (left) → Granulated Text (playback rate)
      • Dynamic (right) → Granulated Text (playback rate)
A Human Conversation
  • Robert Wechsler (Palindrome), choreographer/dancer
  • J’aime Morrison (CSUN), choreographer/dancer
  • Dan Hosken, composer and sound programmer
  • Work session, CSUN, June 23, 2006
Sound Element Mappings (2)
  • Perceivable Bodies (Emily Fernandez)
    • Scene 3a:
      • Position → Granulated Text (position in file) [Primary]
      • Width → Granulated Text (grain duration)
      • Dynamic → Low FM Drone (mod frequency)
    • Scene 3b:
      • Position → Phase Voc File (position in file) [Primary]
      • Width → Phase Voc File (loop length/rate)
      • Dynamic → Low FM Drone (mod frequency)
    • Scene 4:
      • Dynamic → Granulated Noise (density) [Primary]
      • Dynamic → Granulated Noise (position in file)
Perceivable Bodies
  • Emily Fernandez, choreographer/dancer
  • Frieder Weiss, projections and interactive programming
  • Dan Hosken, composer and sound programmer
  • World Premiere at Connecticut College, April 1, 2006
[email protected]

Examples shown can be found:

http://www.csun.edu/~dwh50750/Papers-Presentations/

Full Pieces can be found:

http://www.csun.edu/~dwh50750/Music/

Other Examples of Palindrome’s work:

http://www.palindrome.de/
