The MIT Media Lab's Responsive Environments Group develops novel sensor technologies that are smaller, cheaper, or denser than those currently available. Applications include inertial measurement systems, low-cost RF tagging for real-time tracking, and smart roadway sensor networks. Compact devices such as expressive footwear with embedded multi-modal sensors link physical motion directly to interaction. The research emphasizes intuitive interfaces and smart, self-contained sensing, so that environments can respond directly to human movement and give feedback on it.
Wireless Transduction
TTT Lab Discussion
Responsive Environments Group, MIT Media Lab
Responsive Environments Group
• Work on novel sensors that are:
  • smaller,
  • cheaper,
  • or denser than currently available.
• Examples:
  • Scanning range finder for large-scale interfaces
  • Low-cost real-time RF tagging systems
Overview
• Ari Benbasat
  • Inertial measurement based systems
• Ara Knaian
  • Sensor networks for smart roadways
• Ari Adler
  • Telemedicine for developing countries
• Zoe Teegarden
  • Low power RF chipsets
Expressive Footwear
• 16 sensors
  • Inertial: gyro, accelerometer, compass, crash sensor
  • Header for pressure, bend, etc. sensors
  • Sonar receiver
  • Electric field sensor
• Basestation
  • Sonar system
  • Data receive and interpretation (see the frame sketch below)
  • PAN
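As a rough illustration of what the basestation receives and interprets, here is a minimal sketch of one possible packed data frame carrying the shoe's multi-modal readings. The field names, sizes, and layout are assumptions for illustration, not the shoe's actual protocol.

```c
/* Hypothetical layout of one multi-modal sensor frame sent from the
 * footwear to the basestation each update. */
#include <stdint.h>
#include <string.h>

typedef struct {
    uint8_t  node_id;      /* which shoe */
    uint16_t seq;          /* frame counter, lets the receiver spot drops */
    int16_t  accel[3];     /* raw accelerometer axes */
    int16_t  gyro[3];      /* raw gyroscope axes */
    int16_t  compass;      /* heading sensor */
    uint16_t pressure[4];  /* header-attached pressure/bend sensors */
    uint16_t sonar_time;   /* sonar time-of-flight for position */
    uint16_t efield;       /* electric field sensor reading */
} ShoeFrame;

/* Serialize a frame into a byte buffer for the radio link.
 * Assumes sender and receiver share endianness and struct packing. */
size_t shoe_frame_pack(const ShoeFrame *f, uint8_t *buf, size_t len)
{
    if (len < sizeof(ShoeFrame))
        return 0;
    memcpy(buf, f, sizeof(ShoeFrame));
    return sizeof(ShoeFrame);
}
```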
Suggested Uses
• Allow a dancer's motion to create music rather than conform to it
• Presentations at wearables shows, the Tokyo Toy Fair, and Sens*bles
• Sports medicine
• Embed in other projects
  • e.g., chicken, voodoo doll
3X Compact Inertial Measurement Unit
• Builds on the shoe work
• Goal: a full sensing set for wireless 3D motion sensing in a compact package (see the fusion sketch below)
• Implementation
  • 3 (+1) accelerometers
  • 3 gyroscopes
  • ADC/microcontroller
  • PAN electrode for ID and load detection
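To make the gyroscope/accelerometer pairing concrete, here is a minimal sketch of one common way to fuse such readings into a tilt estimate: a simple complementary filter. The sample rate, filter constant, and sensor values are assumptions, and this is not the unit's actual firmware.

```c
/* Complementary-filter sketch: integrate the gyro for short-term accuracy,
 * lean on the accelerometer's gravity reading to correct long-term drift. */
#include <math.h>
#include <stdio.h>

#define DT     0.01f   /* update period in seconds (100 Hz assumed) */
#define ALPHA  0.98f   /* weight on the integrated gyro estimate    */

/* Fuse one gyro rate (rad/s) and one accelerometer-derived tilt (rad)
 * into the running pitch estimate. */
float fuse_pitch(float pitch, float gyro_rate, float accel_pitch)
{
    float gyro_pitch = pitch + gyro_rate * DT;            /* integrate rate */
    return ALPHA * gyro_pitch + (1.0f - ALPHA) * accel_pitch;
}

int main(void)
{
    float pitch = 0.0f;
    /* In the real unit these would come from the gyros and accelerometers
     * through the ADC/microcontroller; fixed example values here. */
    float gyro_rate   = 0.1f;                  /* rad/s            */
    float ax = 0.0f, az = 1.0f;                /* gravity components, g */
    float accel_pitch = atan2f(ax, az);        /* tilt from gravity */

    for (int i = 0; i < 100; i++)
        pitch = fuse_pitch(pitch, gyro_rate, accel_pitch);

    printf("estimated pitch: %f rad\n", pitch);
    return 0;
}
```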
System Block Diagram
• For the system used at SIGGRAPH '99
• Used in the Forks & Buns interface:
  • Transparent
  • Intuitive
Current Work
• Create an analysis and interpretation framework for such devices:
  • Avoid ad hoc methods
  • Possible script-based IMU application creation (see the rule-table sketch below)
• Research into feedback modes:
  • Want on-board feedback for the closest coupling of action and response
  • Limited to LEDs and simple sounds (?)
• Goal: juggling balls that teach juggling
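One way to read "script-based IMU application creation" is as table- or data-driven interpretation: gestures described as data rules mapped to feedback actions, rather than hard-coded logic for each application. The sketch below illustrates that idea only; the rule fields, thresholds, and feedback actions are hypothetical.

```c
/* Table-driven interpretation sketch: the "script" is a set of threshold
 * rules loaded as data, each bound to an on-board feedback action. */
#include <stdio.h>

typedef struct {
    const char *name;        /* gesture label                      */
    float accel_min;         /* minimum acceleration magnitude (g) */
    float gyro_min;          /* minimum rotation rate (rad/s)      */
    void (*feedback)(void);  /* on-board response when rule fires  */
} GestureRule;

static void blink_led(void)  { printf("LED blink\n"); }
static void play_click(void) { printf("click sound\n"); }

/* Example "script": two rules expressed as data, not compiled-in logic. */
static const GestureRule rules[] = {
    { "sharp throw", 2.5f, 0.0f, blink_led  },
    { "fast spin",   0.0f, 6.0f, play_click },
};

void interpret(float accel_mag, float gyro_mag)
{
    for (size_t i = 0; i < sizeof(rules) / sizeof(rules[0]); i++)
        if (accel_mag >= rules[i].accel_min && gyro_mag >= rules[i].gyro_min)
            rules[i].feedback();
}

int main(void)
{
    interpret(3.0f, 1.0f);   /* fires the "sharp throw" rule */
    return 0;
}
```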
Smart Sensors
• Not wireless but unwired
  • All processing and feedback on-board
  • Allow devices to know and understand their own movement, to some extent, in any situation (see the sketch below)
• Intuitive interfaces
  • Completely transparent
  • But must respond as the user, not the designer, expects
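As a minimal illustration of the "unwired" idea, the sketch below keeps everything on-board: the device itself decides whether it is moving or at rest using a small hysteresis loop, with no host in the loop. The thresholds and the sensor-read stub are hypothetical.

```c
/* On-board motion awareness sketch: compare acceleration magnitude against
 * the 1 g rest level, with hysteresis so the state does not chatter. */
#include <math.h>
#include <stdbool.h>
#include <stdio.h>

#define MOVE_THRESH  0.15f   /* deviation from 1 g that counts as motion */
#define REST_THRESH  0.05f   /* deviation below which the device is at rest */

/* Stub for reading the accelerometer magnitude in g (hypothetical values). */
static float read_accel_mag(int t) { return 1.0f + ((t / 10) % 2) * 0.3f; }

int main(void)
{
    bool moving = false;
    for (int t = 0; t < 40; t++) {
        float dev = fabsf(read_accel_mag(t) - 1.0f);
        if (!moving && dev > MOVE_THRESH)
            moving = true;           /* device decides it started moving */
        else if (moving && dev < REST_THRESH)
            moving = false;          /* and that it has come to rest     */
        printf("t=%d %s\n", t, moving ? "moving" : "at rest");
    }
    return 0;
}
```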