
SENSORS & MOBILE MUSIC Lalya Gaye


Presentation Transcript


  1. SENSORS & MOBILE MUSIC Lalya Gaye

  2. Music controllers Interfaces * Body-based: the human body as the starting point for design; the expressive qualities of human movement. The Hands, Waisvisz, STEIM, 1984

  3. Music controllers Interaction * User movement - choreographed body movement - traditional instrumental gesture - novel gestures Dark around the Edges, Winkler, 1997; Hypercello, Machover & Yo-Yo Ma, 1991; The Hands, Waisvisz, STEIM, 1984

  4. Music controllers Interaction * User movement - full-handed gesture - empty-handed gesture Unfoldings, Interactive Inst., 2003; Stranglophone, Sharon, ITP/NYU, 2003; Lady's Glove, Bongers & Sonami, 1991

  5. Music controllers Interfaces * Environment-based - interactive environments - reactive floors - digital realm: networked audio Take advantage of the features of the space. Interactive environments allow many people to play together and give control over interaction parameters… Global String, Tanaka & Toeplitz, 1998; Magic Carpet, MIT Medialab, 1996

  6. Music controllers Interfaces * Wearables Musical jeans jacket (MIT Medialab, 1992); Tgarden (FoAM & sponge, ~2001); Expressive Footwear (MIT, 1997-2000); ensemble (Kristina Andersen, ~2003) Intimate interfaces; body movement and posture; theatrical vs. everyday-life dimensions

  7. Music controllers Interfaces * Object-based Starting with existing instruments: - augmented (hyperinstruments…) - digitised (e.g. piano synth), with the interface used as a controller (e.g. MIDI keyboard) Use the metaphor of the object. Taku Lippit, ITP/NYU, 2002-03; Hypercello, Machover & Ma, MIT, 1991

  8. Music controllers Interfaces * Object-based Repurposed everyday objects and materials: water, fabric, chemicals, vegetables… Daniel Skoglund, 8Tunnel2; Particles, Horio Kanta, 2003; MIDI Scrapyard Challenge, Brucker-Cohen & Moriwaki, 2003-04

  9. Music controllers Interfaces * Object-based Take advantage of the material properties of objects (e.g. bendable, electrically conductive, etc.). Take into consideration the human activities surrounding the objects: build upon them and/or break from them.

  10. Music controllers Output * Mechanical Guitarbot (Eric Singer et al., LEMUR, 2003-) * Electroacoustic Spherical speakers (Curtis Bahn) * Tactile output (haptics) Cutaneous Grooves (E. Gunther, MIT Medialab, 2001)

  11. Sensors in Ubicomp technology * Computing where it is needed, not the other way around: invisible in use, woven into the fabric of everyday life, embodied interaction, connected to the place and moment of use. * Sensors: - in everyday environments (e.g. context-awareness) - on people (e.g. wearables) - on artefacts (Media Cup, TecO) * Sensor fusion: combining different data and placements to gather context - sensor networks (a minimal sketch follows below)
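
      A minimal Python sketch of the sensor-fusion idea above. The sensor names, readings and thresholds are illustrative assumptions, not part of the lecture: it simply combines one environmental reading (ambient light) with one body-worn reading (acceleration) into a coarse context label.

      def fuse_context(light_lux: float, accel_magnitude_g: float) -> str:
          """Combine two placements (environment + wearable) into a context label.

          Thresholds are illustrative assumptions, not calibrated values.
          """
          moving = accel_magnitude_g > 1.2    # sustained movement, roughly above 1 g
          indoors = light_lux < 500.0         # dim ambient light suggests indoors
          if moving and not indoors:
              return "walking outdoors"
          if moving and indoors:
              return "moving indoors"
          return "stationary"

      print(fuse_context(light_lux=12000.0, accel_magnitude_g=1.5))   # walking outdoors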

  12. Sensors in mobile music & locative audio * Combining NIME- and Ubicomp-style uses of sensors * Urban, everyday settings: a rich environment that is familiar, unpredictable, dynamic and heterogeneous * Sensors on environments, users and objects * Interaction between: - user and objects - user and environment - user and user(s), plus combinations and networks. What are the possible uses, interactions, issues and implications of implementations?

  13. Mobile music and locative audio Locative audio in public space * Space annotation: sensing proximity / location Hear&There (Rozier, MIT Medialab, 1999) Tejp / Audio tags (PLAY & FAL, 2003-04)

  14. Mobile music and locative audio Locative audio in public space * Radio pirates: sensing environmental factors Bit Radio (Bureau of Inverse Technology)

  15. Mobile music and locative audio Mobile music * Mobile music sharing: sensing others SoundPryer (Mattias Östergren, Interactive Institute, 2001); TunA (Arianna Bassoli et al., Medialab Europe, 2002); Push!Music (Håkansson et al., Viktoria Institute, 2005)

  16. Mobile music and locative audio Mobile music * Mobile music making Music making away from the computer screen or the performance setting: in the everyday. Sensor technology + GPS -> situated music making; ad hoc & distributed networks throughout the city -> collaborative music making, etc. (see the illustrative sketch below)
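
      As a hedged illustration of "situated music making" (not taken from any of the projects cited), the Python sketch below maps a hypothetical GPS fix and light reading onto two musical parameters; the mapping itself is arbitrary and purely illustrative.

      def map_to_music(lat: float, lon: float, light_lux: float) -> dict:
          """Map location and an environmental reading to musical parameters."""
          # Moving through the city slowly shifts the tempo (illustrative mapping).
          tempo_bpm = 60 + int((abs(lat) + abs(lon)) * 1000.0) % 80
          # Brighter surroundings raise the pitch, clamped to a MIDI note range.
          midi_pitch = min(108, 36 + int(light_lux / 200.0))
          return {"tempo_bpm": tempo_bpm, "midi_pitch": midi_pitch}

      print(map_to_music(lat=57.7089, lon=11.9746, light_lux=8000.0))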

  17. Mobile music and locative audio Mobile music * Mobile music making: sensing user-environment interaction Sonic City (Gaye et al., FAL & PLAY, 2002-04) Sound Lens (Toshio Iwai, 200?)

  18. Mobile music and locative audio Mobile music * Mobile music making: device as interface between user and space Sound Mapping (Iain Mott et al., Reverberant, 1998)

  19. Mobile music and locative audio Mobile music * Mobile music making: sensing user-user + user-device interaction CosTune (Nishimoto, ATR, 2001); Sound Lens (Toshio Iwai, 200?); Malleable Mobile Music (Atau Tanaka, Sony CSL, 2004)

  20. Mobile music and locative audio Sound walks: mapping the audio world to physical paths * Sound-art installations Electrical Walks (Christina Kubisch); Drift (Teri Rueb) * Walking through digital space Seven Mile Boots (Beloff et al., 2003-04)

  21. Mobile and locative sound Wearable audio Personal instrument (Krzysztof Wodiczko, 1969)

  22. Mobile and locative sound Output Headphones vs. boombox vs. everyday objects used as speakers Soundbug™ speakers & piezos; Flower Speakers (LET'S Corporation, Japan, 2004)

  23. Mobile and locative sound Output Wearables Nomadic Radio (Nitin Sawhney, MIT Medialab, 1998); Sonic Fabric (Alyce Santoro, 2002)

  24. Demo DIY music controller * System set-up Tracking & other sensors Micro-controllers MIDI protocol Interactive software

  25. DIY music controller * Components - sensors: potentiometer + switch / light + proximity sensors - micro-controller: BasicX-24 - protocol: MIDI - software: Pd

  26. Tracking & other sensors * Contact-based tracking • Isometric: pressure, switches, etc. • Movement sensing - rotation: pots, goniometers, joysticks - linear movement: sliders, tension sensors, pads, tablets - bending Ref: “Human Movement Tracking Technology”, Mulder, A. Technical Report, NSERC Hand Centered Studies of Human Movement project. Burnaby, B.C., Canada: Simon Fraser University.

  27. Tracking & other sensors * Contact-based tracking • Inside-in - emitter + receiver on the subject -> body-centred - workspace in principle unlimited - ex: flex sensors, biometric sensors… • Inside-out - sensor on the subject + external emitter - workspace limited if the source is artificial, unlimited if natural - ex: accelerometers, gyroscope, compass… Ref: “Human Movement Tracking Technology”, Mulder, A. Technical Report, NSERC Hand Centered Studies of Human Movement project. Burnaby, B.C., Canada: Simon Fraser University.

  28. Tracking & other sensors * Contactless tracking • Outside-in - external sensor + emitter on the subject - least obtrusive - workspace limited - ex: video tracking + markers • Indirect acquisition - deduction from the audio output - latency Ref: “Human Movement Tracking Technology”, Mulder, A. Technical Report, NSERC Hand Centered Studies of Human Movement project. Burnaby, B.C., Canada: Simon Fraser University.

  29. Tracking & other sensors * Other sensors • Objects - more or less the same as human-tracking sensors • Environment - light, sound, temperature, humidity, electricity, magnetism… • Digital information - ex: activity on the internet Ref: “Human Movement Tracking Technology”, Mulder, A. Technical Report, NSERC Hand Centered Studies of Human Movement project. Burnaby, B.C., Canada: Simon Fraser University.

  30. * Micro-controllers • Collect sensor data and send it to a processor (e.g. a PC) as serial data (e.g. a MIDI signal) • Can also be used to trigger actuators (e.g. an LED) Common micro-controllers • BasicX-24 • Basic Stamp II • PIC

  31. * MIDI protocol • MIDI = Musical Instrument Digital Interface • Standardised serial communications protocol between synthesizers and other digital music devices • Controllers / receivers • A MIDI command = a status byte + 2 data bytes: - action (note on, note off, pitch bend, control change) - pitch - velocity (how loud) (see the sketch below)
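
      A minimal Python sketch of the message layout described above: a note-on message is the status byte 144 (0x90, channel 1, the same status byte used in the BX-24 example later) followed by two data bytes, pitch and velocity, each in the range 0-127. The helper name is ours, not part of the slides.

      def note_on(pitch: int, velocity: int, channel: int = 0) -> bytes:
          """Build a 3-byte MIDI note-on message (channel 0-15, data bytes 0-127)."""
          status = 0x90 | (channel & 0x0F)        # 0x90 = note on
          return bytes([status, pitch & 0x7F, velocity & 0x7F])

      msg = note_on(pitch=60, velocity=100)       # middle C, moderately loud
      print([hex(b) for b in msg])                # ['0x90', '0x3c', '0x64']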

  32. * Interactive music software Common software • Max/MSP • Pd (Pure Data)… • Use the MIDI signals as control data…

  33. * Reading sensor values with the BX-24 • connect the sensors to the ADC pins • power them from the BX-24's 5 V DC output (note: the BX itself is powered at 9 V) • add the ”SerialPort” module for communicating with the serial port • write a routine that reads the voltage on the pins • download the program to the EEPROM

  34. Option Explicit

      ' Sensor readings: GetADC returns 0-1023, so an Integer is needed
      Dim voltIn As Integer
      Dim switch As Byte

      Public Sub Main()
        voltIn = 1
        switch = 1
        Do
          ' potentiometer on ADC pin 16
          voltIn = GetADC(16)
          ' switch on digital pin 17
          switch = GetPin(17)
          Debug.Print "voltIn: "; CStr(voltIn)
          Debug.Print "switch: "; CStr(switch)
          Call Sleep(0.05)
        Loop
      End Sub

  35. * Sending values as MIDI signal - convert the data to the MIDI range (0-127) - create serial buffers - set the baud rate to MIDI speed (31,250 baud) - write a subroutine loop that sends the MIDI bytes - MIDI command 144 (note on) + 128 (note off), or note-on only with ”velocity” used as an ID and ”pitch” used as the sensor value - download to the EEPROM - send the serial data out through a MIDI adapter circuit and a MIDI-USB adapter Ref: Physical Computing, Tom Igoe. http://www.tigoe.net/pcomp/code/archives/bx-24/000249.shtml

  36. Option Explicit

      ' Serial I/O queues for the BX-24 com port
      Dim InputBuffer(1 To 12) As Byte
      Dim OutputBuffer(1 To 10) As Byte
      ' MIDI status byte and a "velocity" byte used here as a sensor ID
      Dim midiCmd As Byte
      Dim vel As Byte
      ' Stack space for the background task that sends the MIDI bytes
      Dim midiTaskVar(1 To 50) As Byte
      ' Sensor readings (kept within 0-127 so they fit in MIDI data bytes)
      Dim voltIn As Byte
      Dim switch As Byte

      Ref: Physical Computing, Tom Igoe. http://www.tigoe.net/pcomp/code/archives/bx-24/000249.shtml

  37. Public Sub Main()
        voltIn = 1
        switch = 1
        ' open the serial queues and com port 1
        Call OpenQueue(InputBuffer, 12)
        Call OpenQueue(OutputBuffer, 10)
        Call OpenCom(1, 9600, InputBuffer, OutputBuffer)
        ' override the UART divisor to get MIDI's 31,250 baud rate
        Register.UBRR = 14
        midiCmd = 144   ' note on, channel 1
        ' start the background task that streams the MIDI messages
        CallTask "midiTask", midiTaskVar
        Do
          ' potentiometer: scale the 0-1023 ADC reading down to MIDI's 0-127
          voltIn = CByte(CSng(GetADC(16)) * 127.0 / 1023.0)
          ' switch: 0 or 1
          switch = GetPin(17)
          Call Sleep(0.05)
        Loop
      End Sub

      Ref: Physical Computing, Tom Igoe. http://www.tigoe.net/pcomp/code/archives/bx-24/000249.shtml

  38. Public Sub midiTask()
        ' background task: queue two 3-byte MIDI messages over and over,
        ' with "velocity" used as a sensor ID and "pitch" as the sensor value
        Do
          ' message 1: potentiometer (ID 1)
          vel = 1
          Call PutQueue(OutputBuffer, midiCmd, 1)
          Call PutQueue(OutputBuffer, voltIn, 1)
          Call PutQueue(OutputBuffer, vel, 1)
          ' message 2: switch (ID 2)
          vel = 2
          Call PutQueue(OutputBuffer, midiCmd, 1)
          Call PutQueue(OutputBuffer, switch, 1)
          Call PutQueue(OutputBuffer, vel, 1)
          Call Sleep(0.05)
        Loop
      End Sub

      Ref: Physical Computing, Tom Igoe. http://www.tigoe.net/pcomp/code/archives/bx-24/000249.shtml

  39. * Sending values as MIDI signal Ref: Physical Computing, Tom Igoe. http://www.tigoe.net/pcomp/midi.shtml

  40. * Receiving MIDI data in Pd • go to C:…/pd/bin • start Pd with: pd -midiindev 1 • route the incoming data according to the ID (”vel”) • use ”pitch” as the control value (the sketch below shows the same routing logic)
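
      The Pd patch itself is not in the transcript; as a stand-in, here is the same routing logic as a Python sketch, assuming the note-on messages produced by the BX-24 code above (velocity carries a sensor ID, pitch carries the sensor value).

      def route(status: int, pitch: int, velocity: int, controls: dict) -> None:
          """Route one incoming note-on message into a named control value."""
          if (status & 0xF0) != 0x90:     # ignore anything but note-on messages
              return
          if velocity == 1:               # ID 1: potentiometer
              controls["pot"] = pitch     # 0-127, usable directly as a control value
          elif velocity == 2:             # ID 2: switch
              controls["switch"] = pitch  # 0 or 1

      controls = {}
      route(0x90, 75, 1, controls)        # potentiometer reading
      route(0x90, 1, 2, controls)         # switch pressed
      print(controls)                     # {'pot': 75, 'switch': 1}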

  41. Discussion * Mobile music applications using sensors: possible uses, interactions, issues and implications of implementations? * Props: sensor platform, Soundbug, tell me * Focus: - sensor positioning - physical interaction and the relation between sound, body and place - combining data

  42. Links • DIY links • BX-24: http://www.basicx.com • Pd: http://www.crca.ucsd.edu/~msp/software.html • More micro-controllers etc.: ITP Physical Computing, http://tigoe.net/pcomp/index.shtml • Book: Physical Computing by Dan O'Sullivan & Tom Igoe • On the iPaq: Linux + PDa (by Günter Geiger), http://gige.xdv.org/pda/

  43. Links Sensors & mobile music • New Interfaces for Musical Expression: http://www.nime.org • Mobile Music & Locative Audio: http://www.netzwissenschaft.de/mob.htm and http://www.viktoria.se/~lalya/tamabi05/ • Ubiquitous Computing: http://www.ubicomp.org/
