


Using Lego Mindstorms NXT in the Classroom

Gabriel J. Ferrer

Hendrix College

ferrer@hendrix.edu

http://ozark.hendrix.edu/~ferrer/



Outline

  • NXT capabilities

  • Software development options

  • Introductory programming projects

  • Advanced programming projects



Purchasing NXT Kits

  • Two options (same price; $250/kit)

    • Standard commercial kit

    • Lego Education kit

      • http://www.lego.com/eng/education/mindstorms/

  • Advantages of education kit

    • Includes rechargeable battery ($50 value)

    • Plastic box superior to cardboard

    • Extra touch sensor (2 total)

  • Standard commercial kit

    • Includes NXT-G visual language



NXT Brick Features

  • 64K RAM, 256K Flash

  • 32-bit ARM7 microcontroller

  • 100 x 64 pixel LCD graphical display

  • Sound channel with 8-bit resolution

  • Bluetooth radio

  • Stores multiple programs

    • Programs selectable using buttons



Sensors and Motors

  • Four sensor ports

    • Sonar

    • Sound

    • Light

    • Touch

  • Three motor ports

    • Each motor includes rotation counter



Touch Sensors

  • Education kit includes two sensors

  • Much more robust than old RCX touch sensors



Light Sensor

  • Reports light intensity as percentage

  • Two modes

    • Active

    • Passive

  • Practical uses

    • Identify intensity on paper

    • Identify lit objects in dark room

    • Detect shadows
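
The uses above map directly onto leJOS's LightSensor class. A minimal sketch (the port choice, threshold, and beep feedback are illustrative assumptions, not part of the original slides):

  import lejos.nxt.*;

  // Minimal sketch: watch for a shadow passing over a passively read light sensor.
  public class LightDemo {
    public static void main(String[] args) throws InterruptedException {
      LightSensor light = new LightSensor(SensorPort.S3); // port choice is an assumption
      light.setFloodlight(false);                         // passive mode; true turns the LED on (active)
      int baseline = light.readValue();                   // ambient intensity as a percentage
      while (true) {
        if (light.readValue() < baseline - 10) {          // threshold chosen arbitrarily
          Sound.beep();                                   // shadow detected
        }
        Thread.sleep(100);
      }
    }
  }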



Sound Sensor

  • Analogous to light sensor

    • Reports intensity

    • Reputed to identify tones

      • I haven’t experimented with this

  • Practical uses

    • “Clap” to signal robot



Ultrasonic (Sonar) Sensor

  • Reports distances

    • Range: about 5 cm to 250 cm

    • In practice:

      • Longer distances result in more missed “pings”

  • Mostly reliable

    • Occasionally gets “stuck”

    • Moving to a new location helps in receiving a sonar “ping”
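
One way to cope with missed pings is to retry a few times before giving up. A hedged leJOS sketch (assuming, as leJOS documents, that getDistance() reports 255 when no echo comes back):

  import lejos.nxt.*;

  // Sketch: a tolerant sonar read that retries when a ping goes unanswered.
  public class SonarRead {
    static int readDistance(UltrasonicSensor sonar) throws InterruptedException {
      for (int attempt = 0; attempt < 5; attempt++) {
        int d = sonar.getDistance();
        if (d < 255) return d;          // got a real echo
        Thread.sleep(20);               // wait briefly, then ping again
      }
      return 255;                       // persistent misses: treat as "nothing in range"
    }
  }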



Motors

  • Configured in terms of percentage of available power

  • Built-in rotation sensors

    • 360 counts/rotation
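
A small leJOS sketch of the rotation counter (the method names are leJOS's; the 720-degree target is just an example):

  import lejos.nxt.*;

  // Sketch: drive motor A two full rotations and read back the built-in counter.
  public class TachoDemo {
    public static void main(String[] args) throws InterruptedException {
      Motor.A.resetTachoCount();
      Motor.A.rotate(720);                         // blocks until the rotation finishes
      LCD.drawInt(Motor.A.getTachoCount(), 0, 0);  // should read roughly 720 (360 counts/rotation)
      Thread.sleep(3000);                          // keep the value on the display briefly
    }
  }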



Software development options

  • Onboard programs

    • RobotC

    • leJOS

    • NXC/NBC

  • Remote control

    • iCommand

    • NXT_Python



RobotC

  • Commercially supported

    • http://www.robotc.net/

  • Not entirely free of bugs

  • Poor static type checking

  • Nice IDE

  • Custom firmware

  • Costly

    • $50 single license

    • $250 for a 12-computer classroom license



Example RobotC Program

void forward() {
  motor[motorA] = 100;
  motor[motorB] = 100;
}

void spin() {
  motor[motorA] = 100;
  motor[motorB] = -100;
}



Example RobotC Program

task main() {
  SensorType[S4] = sensorSONAR;
  forward();
  while (true) {
    if (SensorValue[S4] < 25) spin();
    else forward();
  }
}



leJOS

  • Implementation of JVM for NXT

  • Reasonably functional

    • Threads

    • Some data structures

    • Garbage collection added (January 2008)

    • Eclipse plug-in just released (March 2008)

  • Custom firmware

  • Freely available

    • http://lejos.sourceforge.net/



Example leJOS Program

UltrasonicSensor sonar = new UltrasonicSensor(SensorPort.S4);
Motor.A.forward();
Motor.B.forward();
while (true) {
  if (sonar.getDistance() < 25) {
    Motor.A.forward();
    Motor.B.backward();
  } else {
    Motor.A.forward();
    Motor.B.forward();
  }
}



Event-driven Control in leJOS

  • The Behavior interface

    • boolean takeControl()

    • void action()

    • void suppress()

  • Arbitrator class

    • Constructor gets an array of Behavior objects

      • takeControl() checked for highest index first

    • start() method begins event loop



Event-driven example

class Go implements Behavior {
  private UltrasonicSensor sonar =
    new UltrasonicSensor(SensorPort.S4);

  public boolean takeControl() {
    return sonar.getDistance() > 25;
  }



Event-driven example

  public void action() {
    Motor.A.forward();
    Motor.B.forward();
  }

  public void suppress() {
    Motor.A.stop();
    Motor.B.stop();
  }
}



Event-driven example

class Spin implements Behavior {
  private UltrasonicSensor sonar =
    new UltrasonicSensor(SensorPort.S4);

  public boolean takeControl() {
    return sonar.getDistance() <= 25;
  }



Event-driven example

  public void action() {
    Motor.A.forward();
    Motor.B.backward();
  }

  public void suppress() {
    Motor.A.stop();
    Motor.B.stop();
  }
}



Event-driven example

public class FindFreespace {
  public static void main(String[] a) {
    Behavior[] b = new Behavior[]
      {new Go(), new Spin()};
    Arbitrator arb = new Arbitrator(b);
    arb.start();
  }
}



NXC/NBC

  • NBC (NXT Byte Codes)

    • Assembly-like language with libraries

    • http://bricxcc.sourceforge.net/nbc/

  • NXC (Not eXactly C)

    • Built upon NBC

    • Successor to NQC project for RCX

  • Compatible with standard firmware

    • http://mindstorms.lego.com/Support/Updates/



iCommand

  • Java program runs on host computer

  • Controls NXT via Bluetooth

  • Same API as leJOS

    • Originally developed as an interim project while leJOS NXT was under development

    • http://lejos.sourceforge.net/

  • Big problems with latency

    • Each Bluetooth transmission: 30 ms

    • Sonar alone requires three transmissions

    • Decent program: 1-2 Hz
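
As a rough worked example using the figure above: polling the sonar alone costs three transmissions, or about 90 ms; once motor commands and any other sensor reads are added (and replies are waited for), a full sense-act cycle easily takes several hundred milliseconds, consistent with the 1-2 Hz estimate.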



NXT_Python

  • Remote control via Python

    • http://home.comcast.net/~dplau/nxt_python/

  • Similar pros and cons to iCommand



Developing a Remote Control API

  • Bluetooth library for Java

    • http://code.google.com/p/bluecove/

  • Opening a Bluetooth connection

    • Typical address: 00:16:53:02:e5:75

  • Bluetooth URL

    • btspp://00165302e575:1;authenticate=false;encrypt=false



Opening the Connection

import javax.microedition.io.*;
import java.io.*;

StreamConnection con = (StreamConnection) Connector.open("btspp:…");
InputStream is = con.openInputStream();
OutputStream os = con.openOutputStream();



NXT Protocol

  • Key files to read from iCommand:

    • NXTCommand.java

    • NXTProtocol.java
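
To make the protocol concrete, here is a hedged sketch of a single direct command written to the stream opened above. The framing (two-byte little-endian length prefix, telegram type 0x80 for a direct command with no reply, opcode 0x03 for PLAYTONE) follows LEGO's published direct-command specification; verify the constants against NXTProtocol.java before relying on them.

  import java.io.*;

  // Sketch: play a tone on the brick by writing one direct-command telegram.
  class NXTDirectCommands {
    static void playTone(OutputStream os, int freqHz, int durationMs) throws IOException {
      byte[] cmd = {
        (byte) 0x80,                              // direct command, no response requested
        (byte) 0x03,                              // PLAYTONE opcode
        (byte) (freqHz & 0xFF), (byte) ((freqHz >> 8) & 0xFF),          // frequency in Hz, little-endian
        (byte) (durationMs & 0xFF), (byte) ((durationMs >> 8) & 0xFF)   // duration in ms, little-endian
      };
      os.write(cmd.length & 0xFF);                // Bluetooth framing: 2-byte little-endian length
      os.write((cmd.length >> 8) & 0xFF);
      os.write(cmd);
      os.flush();
    }
  }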



An Interesting Possibility

  • Programmable cell phones with cameras are available

  • Camera-equipped cell phone could provide computer vision for the NXT



Introductory programming projects

  • Developed for a zero-prerequisite course

  • Most students are not CS majors

  • 4 hours per week

    • 2 meeting times

    • 2 hours each

  • Not much work outside of class

    • Lab reports

    • Essays



First Project (1)

  • Introduce motors

    • Drive with both motors forward for a fixed time

    • Drive with one motor to turn

    • Drive with opposing motors to spin

  • Introduce subroutines

    • Low-level motor commands get tiresome

  • Simple tasks

    • Program a path (using time delays) to drive through the doorway
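
The whole first project fits in a few lines. A leJOS sketch of the timed-path exercise (the delays are placeholders students tune by trial and error):

  import lejos.nxt.*;

  // Sketch: wrap low-level motor commands in subroutines and drive a timed path.
  public class TimedPath {
    static void forward() { Motor.A.forward(); Motor.B.forward(); }
    static void spin()    { Motor.A.forward(); Motor.B.backward(); }
    static void halt()    { Motor.A.stop();    Motor.B.stop(); }

    public static void main(String[] args) throws InterruptedException {
      forward();
      Thread.sleep(2000);   // straight for two seconds
      spin();
      Thread.sleep(800);    // roughly a quarter turn; tune by experiment
      forward();
      Thread.sleep(2000);
      halt();
    }
  }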



First Project (2)

  • Introduce the touch sensor

    • if statements

      • Must touch the sensor at exactly the right time

    • while loops

      • Sensor is constantly monitored

  • Interesting problem

    • Students try to put code in the loop body

      • e.g. set the motor power on each iteration

    • Causes confusion rather than harm
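
The while-loop version, as a leJOS sketch (the port choice is an assumption). The empty loop body is the point of the exercise: the motors keep running on their own, so nothing needs to be re-sent on each iteration.

  import lejos.nxt.*;

  // Sketch: keep driving until the touch sensor is pressed, then back up.
  public class TouchDemo {
    public static void main(String[] args) {
      TouchSensor bumper = new TouchSensor(SensorPort.S1);
      Motor.A.forward();
      Motor.B.forward();
      while (!bumper.isPressed()) {
        // Intentionally empty: the sensor is polled until the bumper is hit.
      }
      Motor.A.backward();
      Motor.B.backward();
    }
  }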



First Project (3)

  • Combine infinite loops with conditionals

  • Enables programming of alternating behaviors

    • Front touch sensor hit => go backward

    • Back touch sensor hit => go forward



Second Project (1)

  • Physics of rotational motion

  • Introduction of the rotation sensors

    • Built into the motors

  • Balance wheel power

    • If left counts < right counts

      • Increase left wheel power

  • Race through obstacle course
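
The RobotC skeleton on the next slide gives students the display scaffolding; here is the same balancing idea as a leJOS sketch (the speeds and polling interval are placeholders):

  import lejos.nxt.*;

  // Sketch: nudge wheel speeds so the two rotation counts stay in step.
  public class BalanceWheels {
    public static void main(String[] args) throws InterruptedException {
      Motor.A.resetTachoCount();                  // left wheel
      Motor.B.resetTachoCount();                  // right wheel
      Motor.A.forward();
      Motor.B.forward();
      while (true) {
        int disparity = Motor.A.getTachoCount() - Motor.B.getTachoCount();
        if (disparity < 0) {                      // left wheel is behind: speed it up
          Motor.A.setSpeed(400);
          Motor.B.setSpeed(360);
        } else if (disparity > 0) {               // right wheel is behind
          Motor.A.setSpeed(360);
          Motor.B.setSpeed(400);
        } else {
          Motor.A.setSpeed(400);
          Motor.B.setSpeed(400);
        }
        Thread.sleep(50);
      }
    }
  }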



Second Project (2)

if (/* Write a condition to put here */) {
  nxtDisplayTextLine(2, "Drifting left");
} else if (/* Write a condition to put here */) {
  nxtDisplayTextLine(2, "Drifting right");
} else {
  nxtDisplayTextLine(2, "Not drifting");
}



Third Project

  • Pen-drawer

    • First project with an effector

    • Builds upon lessons from previous projects

  • Limitations of rotation sensors

    • Slippage problematic

    • Most helpful with a limit switch

  • Shapes (Square, Circle)

  • Word (“LEGO”)

    • Arguably excessive


Pen-Drawer Robot



Fourth Project (1)

  • Finding objects

  • Light sensor

    • Find a line

  • Sonar sensor

    • Find an object

    • Find freespace



Fourth Project (2)

  • Begin with following a line edge

    • Robot follows a circular track

    • Always turns right when track lost

    • Traversal is one-way

  • Alternative strategy

    • Robot scans both directions when track lost

    • Each pair of scans increases in size
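
A leJOS sketch of the scanning strategy (the threshold, scan size, and the decision to watch the sensor while the wheels turn are all assumptions to adapt):

  import lejos.nxt.*;

  // Sketch: when the line is lost, sweep left and right, doubling the sweep each pair.
  public class ScanFollower {
    static LightSensor light = new LightSensor(SensorPort.S3);
    static final int THRESHOLD = 45;               // readings below this count as "on the line"

    static boolean onLine() { return light.readValue() < THRESHOLD; }

    // Pivot by the given number of wheel degrees, watching for the line on the way.
    static boolean sweep(int degrees) {
      Motor.A.rotate(degrees, true);               // true = return immediately
      Motor.B.rotate(-degrees, true);
      while (Motor.A.isMoving() || Motor.B.isMoving()) {
        if (onLine()) { Motor.A.stop(); Motor.B.stop(); return true; }
      }
      return false;
    }

    static void scanForLine() {
      int size = 10;                               // degrees per scan; each pair grows
      while (true) {
        if (sweep(size)) return;                   // scan one way
        if (sweep(-2 * size)) return;              // scan the other way, past center
        sweep(size);                               // re-center before widening
        size *= 2;
      }
    }
  }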



Fourth Project (3)

  • Once scanning works, replace light sensor reading with sonar reading

  • Scan when distance is short

    • Finds freespace

  • Scan when distance is long

    • Follow a moving object



Light Sensor/Sonar Robot



Other Projects

  • “Theseus”

    • Store path (from line following) in an array

    • Backtrack when array fills

  • Robotic forklift

    • Finds, retrieves, delivers an object

  • Perimeter security robot

    • Implemented using RCX

    • 2 light sensors, 2 touch sensors

  • Wall-following robot

    • Build a rotating mount for the sonar



Robot Forklift



Gearing the motors



Advanced programming projects

  • From a 300-level AI course

  • Fuzzy logic

  • Reinforcement learning



Fuzzy Logic

  • Implement a fuzzy expert system for the robot to perform a task

  • Students given code for using fuzzy logic to balance wheel encoder counts

  • Students write fuzzy experts that:

    • Avoid an obstacle while wandering

    • Maintain a fixed distance from an object



Fuzzy Rules for Balancing Rotation Counts

  • Inference rules:

    • biasRight => leftSlow

    • biasLeft => rightSlow

    • biasNone => leftFast

    • biasNone => rightFast

  • Inference is trivial for this case

    • Fuzzy membership/defuzzification is more interesting



Fuzzy Membership Functions

  • Disparity = leftCount - rightCount

  • biasLeft is

    • 1.0 up to -100

    • Decreases linearly down to 0.0 at 0

  • biasRight is the reverse

  • biasNone is

    • 0.0 up to -50

    • 1.0 at 0

    • falls to 0.0 at 50



Defuzzification

  • Use representative values:

    • Slow = 0

    • Fast = 100

  • Left wheel:

    • (leftSlow * repSlow + leftFast * repFast) / (leftSlow + leftFast)

  • Right wheel is symmetric

  • Defuzzified values are motor power levels
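
Putting the membership functions, rules, and defuzzification together, a plain-Java sketch (the divide-by-zero guard is an addition not on the slides; everything else follows the numbers above):

  // Sketch of the fuzzy wheel balancer: disparity in, motor power levels out.
  public class FuzzyBalance {
    static double clamp01(double x) { return Math.max(0.0, Math.min(1.0, x)); }

    // disparity = leftCount - rightCount
    static double biasLeft(int d)  { return clamp01(-d / 100.0); }                // 1.0 at -100, 0.0 at 0
    static double biasRight(int d) { return clamp01(d / 100.0); }                 // the reverse
    static double biasNone(int d)  { return clamp01(1.0 - Math.abs(d) / 50.0); }  // peak at 0, gone at +/-50

    static final double REP_SLOW = 0.0, REP_FAST = 100.0;

    static double defuzzify(double slow, double fast) {
      double total = slow + fast;
      if (total == 0.0) return REP_FAST;           // guard: no rule fired for this wheel
      return (slow * REP_SLOW + fast * REP_FAST) / total;
    }

    // Rules: biasRight => leftSlow, biasLeft => rightSlow, biasNone => leftFast and rightFast.
    static double leftPower(int d)  { return defuzzify(biasRight(d), biasNone(d)); }
    static double rightPower(int d) { return defuzzify(biasLeft(d),  biasNone(d)); }
  }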



Q-Learning

  • Discrete sets of states and actions

    • States form an N-dimensional array

      • Unfolded into one dimension in practice

    • Individual actions selected on each time step

  • Q-values

    • 2D array (indexed by state and action)

    • Expected rewards for performing actions



Q-Learning Main Loop

  • Select action

  • Change motor speeds

  • Inspect sensor values

    • Calculate updated state

    • Calculate reward

  • Update Q values

  • Set “old state” to be the updated state



Calculating the State (Motors)

  • Each motor runs at one of three power levels:

    • 100% power

    • 93.75% power

    • 87.5% power

  • Six per-motor states in total; combined, the two motors give 3 × 3 = 9 motor configurations



Calculating the State (Sensors)

  • No disparity: STRAIGHT

  • Left/Right disparity

    • 1-5: LEFT_1, RIGHT_1

    • 6-12: LEFT_2, RIGHT_2

    • 13+: LEFT_3, RIGHT_3

  • Seven total sensor states

  • 63 states overall (9 motor configurations × 7 sensor states); see the sketch below
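
A sketch of the state encoding (the ordering of LEFT/RIGHT within each band is an arbitrary choice here):

  // Sketch: map the wheel-count disparity to one of seven sensor states and fold
  // the (motor configuration, sensor state) pair into a single array index.
  public class QState {
    static final int NUM_SENSOR_STATES = 7;

    static int sensorState(int disparity) {        // disparity = leftCount - rightCount
      int magnitude = Math.abs(disparity);
      int band;                                    // 0 = straight, 1 = 1-5, 2 = 6-12, 3 = 13+
      if (magnitude == 0)       band = 0;
      else if (magnitude <= 5)  band = 1;
      else if (magnitude <= 12) band = 2;
      else                      band = 3;
      if (band == 0) return 0;                     // STRAIGHT
      return disparity > 0 ? 2 * band - 1 : 2 * band;  // RIGHT_n / LEFT_n
    }

    // "Unfolded into one dimension": one index per (motor configuration, sensor state) pair.
    static int stateIndex(int motorConfig, int sensorState) {
      return motorConfig * NUM_SENSOR_STATES + sensorState;
    }
  }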



Action Set for Balancing Rotation Counts

  • MAINTAIN

    • Both motors unchanged

  • UP_LEFT, UP_RIGHT

    • Accelerate motor by one motor state

  • DOWN_LEFT, DOWN_RIGHT

    • Decelerate motor by one motor state

  • Five total actions



Action Selection

  • Determine whether action is random

    • Determined with probability epsilon

  • If random:

    • Select uniformly from action set

  • If not:

    • Visit each array entry for the current state

    • Select action with maximum Q-value from current state
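
The selection step above is standard epsilon-greedy choice over the Q table; a sketch:

  import java.util.Random;

  // Sketch: epsilon-greedy action selection. q[state][action] holds expected rewards.
  public class ActionSelection {
    static final Random rng = new Random();

    static int selectAction(double[][] q, int state, double epsilon) {
      int numActions = q[state].length;
      if (rng.nextDouble() < epsilon) {
        return rng.nextInt(numActions);            // explore: uniform random action
      }
      int best = 0;                                // exploit: maximum Q-value for this state
      for (int a = 1; a < numActions; a++) {
        if (q[state][a] > q[state][best]) best = a;
      }
      return best;
    }
  }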



Q-Learning Main Loop

  • Select action

  • Change motor speeds

  • Inspect sensor values

    • Calculate updated state

    • Calculate reward

  • Update Q values

  • Set “old state” to be the updated state



Calculating Reward

  • No disparity => highest value

  • Reward decreases with increasing disparity
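
The slides do not give the exact function; one reward with this shape, as a sketch:

  // Sketch: maximal reward when the counts agree, falling off as the disparity grows.
  static double reward(int disparity) {
    return 1.0 / (1.0 + Math.abs(disparity));
  }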



Updating Q-values

Q[oldState][action] = Q[oldState][action] +
  learningRate * (reward + discount * maxQ(currentState) - Q[oldState][action])



Student Exercises

  • Assess performance of wheel-balancer

  • Experiment with different constants

    • Learning rate

    • Discount

    • Epsilon

  • Alternative reward function

    • Based on change in disparity



Learning to Avoid Obstacles

  • Robot equipped with sonar and touch sensor

  • Hitting the touch sensor is penalized

  • Most successful formulation:

    • Reward increases with speed

    • Big penalty for touch sensor



Other classroom possibilities

  • Operating systems

    • Inspect, document, and modify firmware

  • Programming languages

    • Develop interpreters/compilers

    • NBC an excellent target language

  • Supplementary labs for CS1/CS2



Thanks for attending!

  • Slides available on-line:

    • http://ozark.hendrix.edu/~ferrer/presentations/

  • Currently writing lab textbook

    • Introductory and advanced exercises

  • ferrer@hendrix.edu