Explore the development of behaviors for robots with the URBI framework, covering obstacle detection, walking, postures, and voice and face recognition. Basic behaviors, transitions, and reactions are built from low-level access combined with high-level functionality. The deck also shows the value of parallel processing, the integration with other algorithms, and the voice and face recognition work in progress, closing with a toy-search scenario and its CPU-load implications.
Creation of Behaviors using URBI
Diego Pardo
GREC – Grup de Recerca Enginyeria Coneixement (Knowledge Engineering Research Group)
Technical University of Catalonia
Aibo Simple Reactive Behaviors
• Obstacle Detection
• Walk
• Posture
• Voice Recognition
• Face Recognition
The Basic Behaviors – Walk
• Upright walk, 12 DOF
• Gait parameters: forward / backwards / turn
• Initialization process
• Compatible with the other behaviors
The Basic Behaviors – Postures
• Sit / Stand / Lie
• Transitions
• Posture detector
The Basic Behaviors – Obstacle Detection
• Cliff detection
• Object detection
• Sensor characterization
• Reaction routine
Programming Tools
• OPEN-R
• Remote Framework
• R-Code
• Tekkotsu
• URBI
Urbi-Mind Scenario
• Remote Framework: reads the low-level data
• URBI: low-level access (read/write), math functions, floating point, etc.
• AiboMind: functions and behaviors, with the basic behaviors as building blocks
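The building blocks in AiboMind wrap exactly this kind of low-level access. A minimal sketch of such a block, assuming the same devices and modifiers used in the walk and posture routines later in the deck (robot.lookBy is a hypothetical helper, not part of the presented code):

def robot.lookBy(delta) {
  current = headPan;                            // low-level read: current pan angle
  { headPan = current + delta smooth:1s } &     // low-level write, smoothed over 1 s
  { ledF = 1 | wait(500) | ledF = 0 }           // blink the face LED in parallel (&)
};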
Walk – Remote Framework (RMF): joint J1, front left and right
Walk – URBI: joint J1, front left and right
Walk – URBI: joint J1, front left and right / surface change
def NormalWalk(Wsteps) {
  // put the head in the walking posture (non-blocking, runs in the background)
  neck = -20 smooth:1s,
  headPan = 0 smooth:1s,
  headTilt = 0 smooth:1s,
  // the sign of Wsteps selects the walking direction
  if (Wsteps < 0) direction = 0 else direction = 1;
  Wsteps = abs(Wsteps);
  // steps run one after the other (for |); the six JF sub-movements of a step run in parallel (for &)
  for | (j = 0; j < Wsteps; j++)
    for & (m = 0; m < 6; m++)
      JF(m, direction)
};
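With the convention above the argument's sign picks the direction (assuming direction 1 maps to forward), so a call sequence might look like this; the step counts are arbitrary:

NormalWalk(4);    // four steps forward
NormalWalk(-2);   // then two steps backwards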
Postures – state model for postures: Unknown, Lie, Stand, Sit, with transitions between them
Postures – joint transition points
Postures
def robot.StandUp() {
  wait(300);
  a = robot.myguessPosition();                       // ask the posture detector where we are
  if (a == "stand")
    { ledF = 1 | wait(500) | ledF = 0; return 0 };   // already standing: blink the face LED and exit
  if (a == "unknown")
    { robot.initial() | robot.lie2stand() }          // go to a known pose first, then stand up
  else {
    if (a == "sit") robot.sit2stand()
    else { if (a == "lie") robot.lie2stand() }
  };
};
Joint transition points
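The posture detector used above is not shown on the slide. A minimal sketch, assuming it simply thresholds leg-joint readings against the transition points of the plot; legLF1 / legLH1 (front-left and hind-left J1 joints) and the threshold angles are illustrative choices, not the characterized values:

def robot.myguessPosition() {
  // hypothetical thresholds (degrees) on the J1 joints
  if (legLF1 > 60 && legLH1 > 60) return "lie";
  if (legLF1 < 10 && legLH1 > 60) return "sit";
  if (legLF1 < 10 && legLH1 < 10) return "stand";
  return "unknown"
};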
Obstacle Detection – chest sensor output (plotted with the Urbi Matlab lib)
Obstacle Detection – cliff check, polled in the background:
cdetect: every(200ms) cliffalert(),
Obstacle Detection – obstacle check, polled in the background:
cdetect: every(200ms) obstalert(),
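Neither alert routine is shown on the slides. A minimal sketch of both, assuming the chest IR sensor from the previous slide is read as distanceChest (the usual URBI device name on the ERS-7); walktag is a hypothetical tag on the running walk command and the thresholds are illustrative, not the characterized values:

def cliffalert() {
  // ground reading suddenly far away: cliff ahead, stop walking and bark
  if (distanceChest > 0.9) { stop walktag; speaker.play("bark.wav") }
};

def obstalert() {
  // something close in front of the chest: stop walking and back off two steps
  if (distanceChest < 0.1) { stop walktag; NormalWalk(-2) }
};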
Emerged Behavior – LetsFindToy();
• Goal: search for the pink ball
• Random search behavior: walk, turn, sit, lie (sketched below)
• Tail and ear movements
• Final sequence
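A minimal sketch of what the search loop behind LetsFindToy() might look like. It assumes URBI's built-in pink-ball detector (ball.visible) and keeps only the walk-and-look part; the actual routine also mixes in turning, sitting, lying and the tail/ear movements at random:

def LetsFindToy() {
  global.found = 0;
  // background watcher: fires as soon as the pink ball enters the image
  finder: at (ball.visible) global.found = 1,
  while (global.found == 0)
  {
    NormalWalk(2);                 // a couple of steps forward
    headPan =  45 smooth:1s |      // then sweep the head to look around
    headPan = -45 smooth:2s |
    headPan =   0 smooth:1s
  };
  stop finder;
  // the final sequence (celebration once the toy is found) would go here
};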
Interface
// Start: three taps on the head sensor within about one second
simpledemo1: at (headSensor) global.hcounter++;
simpledemo2: at (global.hcounter) { wait(1000); global.hcounter = 0 },
simpledemo3: at (global.hcounter == 3) {
  global.hcounter = 0; cleanall();
  speaker.play("bark.wav"); motor on;
  wait(1s); load("foundtoy.u")
},
// Stop: three taps on the rear back sensor
simpledemo4: at (backSensorR) { global.bk++ },
simpledemo5: at (global.bk) { wait(1000); global.bk = 0 },
simpledemo6: at (global.bk == 3) { cleanall(); global.bk = 0 },
Emerged Behavior – LetsFindToy();
How Complex? – CPU-load level throughout the behavior
Conclusions and Future Work
• An Urbi-Mind has been created.
• The routines can be used in simple end-user applications.
• The routines can be integrated into higher-level algorithms.
• Parallel-processing features are crucial in behavior-based robot control.
• Voice and face recognition are under construction using open-source libraries.
Have Fun
// lip-sync: open the mouth (boca) when the remaining play time of the sound
// approximately matches (=~=) a few hand-picked marks in the sample
at &(speaker.remain =~= 29025 || speaker.remain =~= 28775 || speaker.remain =~= 28525)
  { boca(), },