
Last time we saw:

This lecture provides an overview of DC motors, gears, pulse width modulation, servo motors, and various types of sensors. It also discusses the complexity of sensors, signal processing, and levels of processing in perception. The importance of good design in perception and lessons from biological perception are explored.

Presentation Transcript


  1. Last time we saw: • DC motors • inefficiencies, operating voltage and current, stall voltage, current, and torque • current and work of a motor • Gearing • gear ratios • gearing up and down • combining gears • Pulse width modulation • Servo motors
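
As a quick numeric refresher on that material, here is a minimal sketch of the gear-ratio and PWM arithmetic; the motor figures are invented for illustration, not taken from the lecture:

```python
# Quick refresher on last lecture's arithmetic (illustrative values only).

def geared_output(motor_rpm, motor_torque, gear_ratio):
    """Gearing down by `gear_ratio` divides speed and multiplies torque
    (ignoring friction losses)."""
    return motor_rpm / gear_ratio, motor_torque * gear_ratio

def pwm_average_voltage(supply_voltage, duty_cycle):
    """Pulse width modulation: the motor sees an average voltage proportional
    to the fraction of each period the signal is held high."""
    return supply_voltage * duty_cycle

speed, torque = geared_output(motor_rpm=6000, motor_torque=0.01, gear_ratio=50)
print(speed, torque)                    # 120.0 rpm, 0.5 N*m at the output shaft
print(pwm_average_voltage(6.0, 0.75))   # 4.5 V effective at 75% duty cycle
```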

  2. Lecture Outline • What are sensors? • Types of sensors (many examples) • Sensor complexity • Signals -> symbols • Levels of processing • Poor and good design of perception • Biological perception and lessons • Sensor fusion

  3. Why is Robotics hard? • Sensors are limited and crude • Effectors are limited and crude • State (internal and external, but mostly external) is partially-observable • Environment is dynamic (changing over time) • Environment is full of potentially-useful information

  4. What are sensors? • Sensors constitute the perceptual system of a robot • Sensors do not provide state • Sensors are physical devices that measure physical quantities • Examples: • Physical property -> sensor: • contact -> switch • distance -> ultrasound, radar, infrared

  5. Examples of sensors • More examples: • Physical property -> sensor: • light level -> photocells, cameras • sound level -> microphones • strain -> strain gauges • rotation -> encoders • magnetism -> compasses • smell -> chemical • temperature -> thermal, infrared

  6. More examples of sensors • Even more examples: • Physical property -> sensor: • inclination -> inclinometers • rate of change of inclination -> gyroscopes • pressure -> pressure gauges • altitude -> altimeters • and many more… • Note: the same property can be measured with different sensors

  7. Types of Sensors • Sensors range from simple to complex in the amount of information they provide • simple: an on/off switch (1 bit of input) • complex: the human retina (> 100 million photosensitive elements!) • A sensor provides “raw” information, which usually needs to be processed

  8. Sensor Complexity • The output of a simple sensor can be used directly, without processing (e.g., if switch closed, stop, else go) • The output of a complex sensor must be processed • We can ask: “Given the sensory reading I am getting, what was the world like to make the sensor give me this reading?” => reconstruction
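
A minimal sketch of that contrast; `read_switch` and `read_camera` below are hypothetical stand-ins for real sensor drivers, faked here with constant values:

```python
# Illustrative only: real drivers depend on the hardware.

def read_switch():
    """Simple sensor: a bump switch returns a single bit."""
    return True                     # pretend the switch is closed (contact)

def read_camera():
    """Complex sensor: returns a tiny grayscale image as a 2D list."""
    return [[10, 200, 12],
            [11, 220, 13],
            [ 9, 210, 10]]

# Simple sensor: the raw bit can drive behavior directly.
print("stop" if read_switch() else "go")

# Complex sensor: the raw pixels mean nothing until they are processed.
image = read_camera()
bright = sum(px > 128 for row in image for px in row)
print("obstacle" if bright >= 3 else "clear")
```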

  9. Signals -> Symbols (State) • Sensors do not provide state/symbols, just signals • A great deal of computation may be required to convert the signal from a sensor into useful state for the robot • This process bridges the areas of electronics, signal processing, and computation
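
A small sketch of the signal-to-symbol step, with made-up range readings and thresholds; a real robot would pick symbols that match its task:

```python
# Turning a raw signal into a symbol the robot can act on (values invented).

def range_to_symbol(distance_cm):
    """Map a raw range reading (signal) onto a perceptual state (symbol)."""
    if distance_cm < 20:
        return "blocked"
    elif distance_cm < 60:
        return "approaching"
    return "clear"

raw_readings = [152.0, 80.3, 41.7, 12.5]       # e.g., successive sonar returns
print([range_to_symbol(r) for r in raw_readings])
# ['clear', 'clear', 'approaching', 'blocked']
```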

  10. Levels of Processing • to find out if a switch is open or closed, we need to measure voltage going through the circuit => electronics • using a microphone to separate voice from noise and recognize it => signal processing • using a surveillance camera, find people in the image and recognize criminals, perhaps by comparing them to a large database => computation
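
One way to picture the middle (signal processing) level is a simple smoothing filter applied to the raw stream before any higher-level computation looks at it; the sample values below are invented:

```python
# Sketch of the signal-processing level: smooth a noisy sensor stream
# with a moving average.

def moving_average(samples, window=3):
    """Return the running mean over the last `window` samples."""
    out = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

noisy_level = [0.1, 0.9, 0.2, 0.8, 0.15, 0.85]   # e.g., raw microphone energy
print(moving_average(noisy_level))
```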

  11. Requirements • The more processing that needs to be done, the more computation is required • Thus perception requires: • sensors (power and electronics) • computation (more power and electronics) • connectors (to connect it all)

  12. Poor Designs of Perception • It is not a good idea to separate • what the robot senses • how it senses it • how it processes it • and how it uses it • If these are separated, the resulting robot design is typically large, bulky, and ineffective.

  13. History of Poor Designs • Historically, perception has been treated poorly: • perception in isolation • perception as “king” • perception as reconstruction • Traditionally these approaches came from computer vision, which provides the most complex data

  14. Good Designs of Perception • Instead, it is best to think about these as a single complete design: • the task the robot has to perform • the best sensors for that task • the best mechanical design that will allow the robot to get the necessary sensory information to perform that task (e.g., the body shape of the robot, the placement of the sensors, etc.)

  15. A New & Better Way • Perception in the context of action and the task • Action-oriented perception • Expectation-based perception: use knowledge about the world as constraints on sensor interpretation • Focus-of-attention methods provide constraints on where to look • Perceptual classes partition the world into useful categories
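
A toy illustration of the focus-of-attention idea: instead of scanning the whole image, restrict the search to the region where task knowledge says the interesting feature should be. The image and region below are fabricated for the example:

```python
# Focus-of-attention sketch: only the columns inside `col_range` are examined.

def brightest_column(image, col_range):
    """Search for the brightest column, but only within the attended region."""
    lo, hi = col_range
    sums = {c: sum(row[c] for row in image) for c in range(lo, hi)}
    return max(sums, key=sums.get)

image = [
    [5, 5, 200, 5, 5, 5],
    [5, 5, 210, 5, 5, 5],
    [5, 5, 190, 5, 5, 5],
]
print(brightest_column(image, col_range=(1, 4)))   # only columns 1-3 are checked
```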

  16. Biological Perception • Nature solves this problem cleverly: it evolves special sensors with special geometric and mechanical properties. • Consider the faceted eyes of flies, polarized light sensors on birds, horizon/line sensors on bugs, the shape of the human ear, etc. • Biological sensors use clever mechanical designs that maximize the sensor's properties, i.e., its range and correctness.

  17. Proprioception • The origin of the received sensory information divides perception into: • Proprioception: sensing internal state (e.g., muscle tension, limb position) • Exteroception: sensing external state (e.g., vision, audition, smell, etc.) • Examples of proprioception • path integration (dead-reckoning) • balancing • all movement...
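
Path integration amounts to accumulating small motion increments into a pose estimate; a minimal dead-reckoning sketch (with invented odometry steps) looks like this:

```python
import math

def integrate_odometry(steps, x=0.0, y=0.0, theta=0.0):
    """Each step is (distance_travelled, change_in_heading_in_radians)."""
    for distance, dtheta in steps:
        theta += dtheta                  # update heading first, then move
        x += distance * math.cos(theta)
        y += distance * math.sin(theta)
    return x, y, theta

steps = [(1.0, 0.0), (1.0, math.pi / 2), (1.0, 0.0)]   # forward, turn left, forward
print(integrate_odometry(steps))                       # ~(1.0, 2.0, 1.57)
```

Note that dead-reckoning error accumulates without bound, which is one reason exteroception is needed to correct the estimate.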

  18. Affordances • Affordances are “potentialities for action inherent in an object or scene” (Gibson 1979, psychology) • The focus is the interaction between the robot and its environment • Perception is biased by what needs to be done (the task) • E.g.: a chair can be something to sit in, avoid, throw, etc.

  19. Lessons from Biology • As a robot designer, you may not get the chance to make up new sensors • But you will always have the chance (and the need) to design interesting ways of using the available sensors • Utilize the interaction with the world and always keep in mind the task • Food for thought: how would you detect people in an environment?

  20. Example: detecting people • temperature: pyroelectric sensors detect specific temperature ranges • movement: if everything else is static or slower/faster • color: if people wear uniquely colored clothing in your environment • shape: now you need to do complex vision processing
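
The movement cue can be approximated with simple frame differencing, assuming everything else in the scene really is static; the tiny frames below are fabricated:

```python
# Frame differencing: flag pixels whose brightness changed between snapshots.

def moving_pixels(prev_frame, curr_frame, threshold=30):
    """Return (row, col) of pixels that changed by more than `threshold`."""
    changed = []
    for r, (prev_row, curr_row) in enumerate(zip(prev_frame, curr_frame)):
        for c, (p, q) in enumerate(zip(prev_row, curr_row)):
            if abs(p - q) > threshold:
                changed.append((r, c))
    return changed

prev_frame = [[10, 10, 10], [10, 10, 10]]
curr_frame = [[10, 90, 10], [10, 95, 10]]      # something moved through column 1
print(moving_pixels(prev_frame, curr_frame))   # [(0, 1), (1, 1)]
```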

  21. Example: measuring distance • ultrasound sensors (sonar) give you distance directly (time of flight) • infrared provides return signal intensity • two cameras (i.e., stereo) can give you distance/depth • use perspective projection with 1 camera • use a laser and a camera, triangulate • use structured light: overlaying grid patterns on the world • ...
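
The time-of-flight and stereo options reduce to short formulas; the numbers below are illustrative, not measured:

```python
SPEED_OF_SOUND = 343.0          # m/s in air at roughly room temperature

def sonar_distance(echo_time_s):
    """Time of flight: the pulse travels out and back, so divide by two."""
    return SPEED_OF_SOUND * echo_time_s / 2.0

def stereo_depth(focal_length_px, baseline_m, disparity_px):
    """Two cameras: depth = f * b / d for a feature matched in both images."""
    return focal_length_px * baseline_m / disparity_px

print(sonar_distance(0.01))                 # 10 ms echo -> about 1.7 m
print(stereo_depth(700.0, 0.12, 35.0))      # -> 2.4 m
```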

  22. Sensor Fusion • A powerful strategy is to combine different sensors => Sensor Fusion • Sensor fusion is complex because sensors have: • different characteristics • different accuracy • different complexity • Computation is necessary to combine them effectively (in real-time)
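
One common (though by no means the only) fusion rule is an inverse-variance weighted average, which trusts the less noisy sensor more; the readings and variances below are invented:

```python
# Fuse two independent distance estimates of the same quantity.

def fuse(estimate_a, var_a, estimate_b, var_b):
    """Inverse-variance weighted average: lower variance -> larger weight."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * estimate_a + w_b * estimate_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

sonar_m, sonar_var = 2.10, 0.04      # sonar: longer range, noisier
ir_m, ir_var = 1.95, 0.01            # infrared: shorter range, finer
print(fuse(sonar_m, sonar_var, ir_m, ir_var))   # (1.98, 0.008)
```

A Kalman filter applies this same weighting idea to readings that arrive over time.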

  23. Biological Sensor Fusion • The brain processes information from many sensors (vision, touch, smell, hearing, etc.) • The processing areas are distinct in the brain (and for vision, they are further subdivided into the “what” and “where” pathways) • Much complex processing is involved in combining the information
