Capstone Spring 2009 Preliminary Design Review
HAMSter: Mobile Servo-Powered Cart
Cole Bendixen, Electrical and Computer Engineering
Erich Hanke, Electrical Engineering
Erik Larson, Electrical Engineering
Quang Than, Electrical Engineering
HAMSter Project Overview
To create an autonomous platform that serves as a mounting point for various sensors and monitors, allowing remote locations to be surveyed without human control.
To advance hardware stereo-vision algorithms.
Sensor Interrupt Levels
IR Proximity Sensor
Frame Ready To Process
RF Beacon Detector
Green = Hardware Implementation
Blue = Software Implementation
Yellow = Hardware/Software interface
HDL IMAGE CO-PROCESSOR
PPC HARDCORE PROCESSOR
Interrupt Enabled Inputs
Object Array stored in Block RAM
GPS via COM0
Servo Control Board via COM1
Initial direction of destination is determined using the GPS/transceiver data.
Objects are then realized in software.
Movement vector is determined based on direction and relative position of objects.
IR sensors are then used to update path model due to “invisible objects.”
Given our current position (in the grid coordinate system) from the GPS, we calculate our north/south and east/west displacement from the destination position and use that to calculate a destination direction vector.
We then calculate the deviation of our current direction from the destination direction and rotate the robot the required amount to correct this difference.
The transceiver will determine the direction and distance to the destination transceiver.
We then calculate the deviation of our current direction from the destination direction and rotate the robot the required amount to correct this difference, just as with GPS.
The transceiver is necessary because GPS is only accurate to ~3 m.
To realize objects we will create a matrix representing the relative visible area in the direction of the destination.
Object input in the form of a distance and deflection will then be used to populate this matrix with objects.
First we determine acceptable directions to travel based on object positions.
We then evaluate a vector in each valid direction space.
The length of movement vectors will place the robot just past the nearest object.
We then compare the valid vectors and choose the one that places the robot closest to the destination.
Instructions are then given to the servos to correct direction and move forward the determined distance.
IR interrupts cause immediate motion halt.
The object is added to the matrix using the direction and distance reported by the IR sensor.
Path is then re-evaluated using updated object matrix.
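The interrupt flow above, as a hedged sketch: the servo halt happens in the ISR, while the matrix update and re-plan are deferred to the main loop via a flag. All the helper names here (`halt_servos`, `ir_read_*`, `replan`) are stubbed stand-ins for the real board interfaces, not actual project code:

```c
/* Stub stand-ins so the control flow can be followed in isolation. */
static int servos_running = 1;
static int replans = 0;
static volatile int replan_needed = 0;

static void halt_servos(void)         { servos_running = 0; }
static double ir_read_distance(void)  { return 0.3; }  /* assumed reading, m */
static double ir_read_deflection(void){ return 0.0; }
static void update_matrix(double d, double a) { (void)d; (void)a; }
static void replan(void)              { replans++; }

/* ISR: stop immediately, defer the heavy work to the main loop. */
void ir_proximity_isr(void)
{
    halt_servos();        /* immediate motion halt */
    replan_needed = 1;
}

void main_loop_step(void)
{
    if (replan_needed) {
        replan_needed = 0;
        update_matrix(ir_read_distance(), ir_read_deflection());
        replan();         /* re-evaluate path with the updated object matrix */
    }
}
```

Keeping the ISR to a halt-and-flag keeps interrupt latency low for the other interrupt-enabled inputs.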
Robot that will move to a specified GPS location.
Object recognition hardware completed.
Robot that will move to a specified GPS location avoiding 'visible' objects.
Robot that will move to a destination object, via GPS and transceiver, and avoid objects using video imaging and IR sensors.
ADC interface to the FPGA – Loading images into memory
Wheels operating at same speeds
Inaccurate calculation of cart speed to judge distance traveled
Interfacing to the RF transceiver (signal magnitude, not direction)
Cart frame attenuating or blocking GPS and RF signals