VISION-AUGMENTED INERTIAL NAVIGATION BY SENSOR FUSION FOR AN AUTONOMOUS ROTORCRAFT VEHICLE

C.L. Bottasso, D. Leonello
Politecnico di Milano

AHS International Specialists' Meeting on Unmanned Rotorcraft
Phoenix, AZ, January 20-22, 2009

Outline
  • Introduction and motivation
  • Inertial navigation by measurement fusion
  • Vision-augmented inertial navigation
    - Stereo projection and vision-based position sensors
    - Vision-based motion sensors
    - Outlier rejection
  • Results and applications
  • Conclusions and outlook

Rotorcraft UAVs at PoliMI

  • Low-cost platform for development and testing of navigation and control strategies (including vision, flight envelope protection, etc.)
  • Vehicles: off-the-shelf hobby helicopters
  • On-board control hardware based on PC-104 standard
  • Bottom-up approach, everything is developed in-house:
    - Inertial Navigation System (this paper)
    - Guidance and Control algorithms (AHS UAV '07: C.L. Bottasso et al., path planning by motion primitives, adaptive flight control laws)
    - Linux-based real-time OS
    - Flight simulators
    - System identification (estimation of inertia, estimation of aerodynamic model parameters from flight test data)
    - Etc.

UAV Control Architecture

Hierarchical three-layer control architecture (Gat 1998):

  • Strategic layer: assign mission objectives (typically relegated to a human operator)
  • Tactical layer: generate vehicle guidance information, based on input from strategic layer and ambient mapping information
  • Reflexive layer: track trajectory generated by tactical layer, control, stabilize and regulate vehicle

[Diagram: the vehicle navigates among obstacles toward a target; it must sense its own state of motion (to enable planning and tracking) and sense the environment (to enable mapping)]
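As a purely illustrative sketch (not from the paper), the three layers can be read as a call hierarchy; all class and function names below are hypothetical:

```python
from abc import ABC, abstractmethod

# Hypothetical sketch of the three-layer architecture; all names are illustrative.
class StrategicLayer(ABC):
    @abstractmethod
    def mission_objectives(self):
        """Mission goals, typically supplied by a human operator."""

class TacticalLayer(ABC):
    @abstractmethod
    def plan_trajectory(self, objectives, ambient_map, state):
        """Generate guidance (a reference trajectory) from objectives and map."""

class ReflexiveLayer(ABC):
    @abstractmethod
    def track(self, trajectory, state):
        """Track the trajectory: stabilize and regulate the vehicle."""

def control_step(strategic, tactical, reflexive, ambient_map, state):
    # One pass through the hierarchy: objectives -> trajectory -> actuator commands.
    objectives = strategic.mission_objectives()
    trajectory = tactical.plan_trajectory(objectives, ambient_map, state)
    return reflexive.track(trajectory, state)
```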


Sensing of vehicle motion states

Proposed approach (this paper): recruit vision sensors for improved state estimation.

[Diagram: GPS, accelerometer, gyro, sonar altimeter, magnetometer and other sensors feed a sensor fusion algorithm that produces the state estimates]

  • Advantages:
    - Improved accuracy/better estimates, especially in proximity of obstacles
    - Tolerance to sensor loss (e.g. because of faults, or GPS loss indoors, under vegetation or in urban canyons)

Sensing of environment for mapping:

[Diagram: stereo cameras, laser scanner and other sensors feed a sensor fusion algorithm that produces the ambient map and obstacle/target recognition]


Classical Navigation System

Sensor fusion by Kalman-type filtering to account for measurement and process noise:

  • States: $x = (r^T, v^T, q^T)^T$ (position, velocity, attitude)
  • Inputs: $u = (a^T, \omega^T)^T$ (accelerometer and gyro readings)
  • Outputs: $y = h(x)$
  • Measures: $z = (r_{gps}^T,\; v_{gps}^T,\; h_{sonar},\; m_{magn}^T)^T$

Prediction (propagation of the state estimate with the process model $f$):

$\hat{x}(t_{k+1}^-) = \hat{x}(t_k) + \int_{t_k}^{t_{k+1}} f(\hat{x}(t), u(t))\, dt$

Correction (measurement update through the gain $K$):

$\hat{x}(t_{k+1}) = \hat{x}(t_{k+1}^-) + K \big( z(t_{k+1}) - h(\hat{x}(t_{k+1}^-)) \big)$
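To make the predict/correct cycle concrete, here is a minimal extended-Kalman-style sketch (illustrative, not the authors' implementation; the models f and h, their Jacobians F and H, and the noise covariances Q and R are assumed to be supplied by the caller):

```python
import numpy as np

def kalman_step(x, P, u, z, f, h, F, H, Q, R, dt):
    """One predict/correct cycle of an extended-Kalman-style filter.

    x, P : state estimate and covariance at time t_k
    u, z : input (IMU) and measurement vectors
    f, h : process and measurement models; F, H their Jacobians
    Q, R : process and measurement noise covariances
    """
    # Prediction: propagate the state with the process model (Euler step
    # standing in for the integral between t_k and t_{k+1}).
    x_pred = x + f(x, u) * dt
    Fk = np.eye(len(x)) + F(x, u) * dt      # discrete transition Jacobian
    P_pred = Fk @ P @ Fk.T + Q

    # Correction: fuse all measurements in parallel through the gain K.
    Hk = H(x_pred)
    S = Hk @ P_pred @ Hk.T + R              # innovation covariance
    K = P_pred @ Hk.T @ np.linalg.inv(S)    # Kalman gain
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = (np.eye(len(x)) - K @ Hk) @ P_pred
    return x_new, P_new
```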


Vision-Based Navigation System

Kanade-Lucas-Tomasi (KLT) tracker:

  • Tracks feature points in the scene across the stereo cameras and across time steps
  • Each tracked point becomes a vision-based motion sensor
  • Has its own internal outlier rejection algorithm

[Diagram: GPS, accelerometer, gyro, sonar altimeter, magnetometer and other sensors, plus the vision sensors (stereo cameras, KLT tracker, outlier rejection), all feed the sensor fusion algorithm that produces the state estimates]
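For illustration, a minimal sketch of KLT-style tracking using OpenCV's pyramidal Lucas-Kanade tracker, a stand-in for the in-house tracker described here; the parameter values are arbitrary:

```python
import cv2

def track_features(prev_gray, next_gray, prev_pts):
    """Track feature points from one grayscale image to the next (KLT-style)."""
    next_pts, status, err = cv2.calcOpticalFlowPyrLK(
        prev_gray, next_gray, prev_pts, None,
        winSize=(21, 21), maxLevel=3)
    ok = status.ravel() == 1              # tracker's own per-point success flag
    return prev_pts[ok], next_pts[ok]

# Seed the tracker with a corner detector, then apply it across time steps
# (or across the left/right images of the stereo pair in the same way):
# prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=100,
#                                    qualityLevel=0.01, minDistance=10)
```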


Vision-Based Position Sensor

Feature point projection (pinhole model): a point with camera-frame coordinates $c = (c_1, c_2, c_3)^T$ projects onto the image plane at

$p = \frac{f}{c_3}\,(c_1, c_2)^T$

Stereo vision: the disparity $d$ between the left and right images, computed with the Kanade-Lucas-Tomasi (KLT) algorithm, gives the point depth through the baseline $b$ and focal length $f$:

$d = \frac{f\, b}{c_3} \quad \Rightarrow \quad c_3 = \frac{f\, b}{d}$

[Plot: effect of a one-pixel disparity error on the estimated distance (BumbleBee X3 camera)]

Remark: stereo vision information from low-resolution cameras is noisy and must be handled with care.
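A small numerical sketch of the depth equation and of the effect of a one-pixel disparity error; the focal length and baseline below are placeholder values, not the BumbleBee X3 specifications:

```python
# Depth from stereo disparity, c3 = f*b/d, and the effect of a 1-pixel error.
f_px = 400.0   # focal length in pixels (placeholder value)
b = 0.12       # stereo baseline in meters (placeholder value)

def depth(disparity_px):
    return f_px * b / disparity_px

for d in (40.0, 10.0, 4.0):                  # large disparity = near point
    err = depth(d - 1.0) - depth(d)          # extra depth for a 1 px error
    print(f"disparity {d:5.1f} px -> depth {depth(d):5.2f} m, "
          f"1 px error -> +{err:.2f} m")
# The error grows roughly as c3**2 / (f*b): distant points are much noisier.
```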


Feature Point Tracking

[Figure: left and right camera views at times k and k+1, showing feature points tracked both across the two cameras and across time steps]


Vision-Based Motion Sensor

Vector closure: a scene point $P$ with inertial position $d$, vehicle position $r$, camera offset $b$ in the body frame and camera-frame coordinates $c$ satisfies

$d = r + R\,(b + c)$

where $R$ is the body-to-inertial rotation.

Differentiate the vector closure expression; since the point is fixed in the scene, $\dot d = 0$:

$0 = v + R\,(\omega \times (b + c)) + R\,\dot c$

Apparent motion of the feature point on the image plane (motion sensor): projecting $\dot c$ through the camera model makes $\dot p$ a function of the attitude $R$, the linear velocity $v$ and the angular velocity $\omega$ of the vehicle.
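A minimal numerical sketch of this chain, assuming for simplicity that the camera axes are aligned with the body frame and using the pinhole model from the previous slide:

```python
import numpy as np

def apparent_velocity(R, v, omega, b, c, f_px):
    """Predicted image-plane velocity of a scene-fixed feature point.

    R      : 3x3 body-to-inertial rotation matrix (attitude)
    v      : vehicle linear velocity, inertial frame
    omega  : vehicle angular velocity, body frame
    b, c   : camera offset (body frame) and point position (camera frame)
    f_px   : focal length in pixels
    """
    # Differentiated vector closure solved for the point rate in camera axes:
    # 0 = v + R (omega x (b + c)) + R c_dot  =>  c_dot = -R^T v - omega x (b + c)
    c_dot = -R.T @ v - np.cross(omega, b + c)

    # Jacobian of the pinhole projection p = (f/c3) * (c1, c2):
    c1, c2, c3 = c
    J = np.array([[f_px / c3, 0.0, -f_px * c1 / c3**2],
                  [0.0, f_px / c3, -f_px * c2 / c3**2]])
    return J @ c_dot   # p_dot, the output of the vision-based motion sensor
```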


Vision-Based Motion Sensor

  • For all tracked feature points, write the motion sensor equation
  • This defines a new output of the vehicle states
  • Measure the apparent motion of each feature point by finite-differencing its image-plane position (see below)
  • Fuse in parallel with all other sensors using Kalman filtering

Measured apparent velocity (due to vehicle motion), over the time step $h = t_{k+1} - t_k$:

$\dot p_i(t_{k+1}) \approx \frac{p_i(t_{k+1}) - p_i(t_k)}{h}$

This defines a new augmented measurement vector: GPS, gyro, accelerometer, magnetometer and altimeter readings, plus two vision sensors (left and right cameras) per tracked feature point:

$z = (r_{gps}^T,\; v_{gps}^T,\; h_{sonar},\; m_{magn}^T,\; \dot p_{vsn,L,1}^T,\; \dot p_{vsn,R,1}^T,\; \ldots)^T$
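A sketch of how the augmented measurement vector might be assembled at each filter step; the function and argument names are illustrative:

```python
import numpy as np

def stack_measurements(z_classical, pdot_pairs):
    """Assemble the augmented measurement vector for one filter step.

    z_classical : classical sensor readings (GPS, magnetometer, altimeter, ...)
    pdot_pairs  : list of (left, right) measured apparent velocities,
                  one pair of 2-vectors per tracked feature point
    """
    vision = [np.asarray(p) for pair in pdot_pairs for p in pair]
    return np.concatenate([np.asarray(z_classical)] + vision)

# The measurement model h(x) is stacked in the same order, so the Kalman
# correction fuses everything in parallel; the vector length changes with
# the number of currently tracked points.
```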


Outlier Rejection

  • Outlier:
    - A point that is not fixed with respect to the scene
    - A false positive of the KLT tracking algorithm
  • Outliers give false information on the state of motion and must be discarded from the process
  • Two-stage rejection (see the sketch after this list):
    - KLT-internal rejection
    - Vehicle motion compatibility check: compare the apparent point velocity predicted from the estimated vehicle motion with the measured apparent velocity, and drop the tracked point if the two vectors differ too much in length and direction
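A minimal sketch of the compatibility check; the tolerance values are placeholders, not from the paper:

```python
import numpy as np

def is_outlier(pdot_predicted, pdot_measured,
               max_length_ratio=1.5, max_angle_rad=0.5):
    """Motion compatibility check: flag a tracked point whose measured
    apparent velocity disagrees with the one predicted from the estimated
    vehicle motion (thresholds are placeholder values)."""
    a = np.asarray(pdot_predicted, dtype=float)
    b = np.asarray(pdot_measured, dtype=float)
    na, nb = np.linalg.norm(a), np.linalg.norm(b)
    if na < 1e-9 or nb < 1e-9:
        return False                      # (nearly) static: test is meaningless
    length_ok = max(na, nb) / min(na, nb) <= max_length_ratio
    cos_angle = np.clip(a @ b / (na * nb), -1.0, 1.0)
    angle_ok = np.arccos(cos_angle) <= max_angle_rad
    return not (length_ok and angle_ok)
```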


Examples: Pirouette Around Point Cloud

  • Cloud of about 100 points
  • Temporary loss of GPS signal (for 100 s < t < 200 s)
  • To show conservative results:
    - Only ≤ 20 tracked points at each instant of time
    - Vision sensors operating at 1 Hz

Examples: Pirouette Around Point Cloud

[Plots: state estimates for the classical non-vision-based IMU and for the vision-based IMU, with the filter warm-up and the beginning and end of the GPS signal loss marked]

Remark: no evident effect of GPS loss on state estimation for the vision-augmented IMU.


Examples: Flight in a Village

  • Rectangular flight path in a village at 2 m altitude
  • Three loops:
    - With GPS
    - Without GPS
    - With GPS

[Figure: scene environment and image acquisition based on the Gazebo simulator]


Examples: Flight in a Village

[Plots: estimated states over the three loops, with the filter warm-up and the beginning and end of the GPS signal loss marked]

Remarks:

  • Some degradation of velocity estimates without GPS
  • No evident effect of GPS loss on attitude estimation


Conclusions

  • Proposed a novel inertial navigation system using vision-based motion sensors
  • Basic concept demonstrated in a high-fidelity virtual environment
  • Observed facts:
    - Improved observability of vehicle states
    - No evident transients during loss and reacquisition of sensor signals
    - Higher accuracy when close to objects and for increasing numbers of tracked points
    - Computational cost compatible with on-board hardware (PC-104 Pentium III)
  • Outlook:
    - Testing in the field
    - Adaptive filtering: better robustness/tuning
    - Recruitment of additional sensors (e.g. stereo laser-scanner)

[Image: BumbleBee X3 camera]