
Page 1: Autonomous, Agile, Vision-controlled Drones (rpg.ifi.uzh.ch/docs/scaramuzza/2018.06.12 - Aerial Symposium...)

Davide Scaramuzza

Autonomous, Agile, Vision-controlled Drones:

From Frame to Event Vision

Institute of Informatics – Institute of Neuroinformatics

My lab homepage: http://rpg.ifi.uzh.ch/

Publications: http://rpg.ifi.uzh.ch/publications.html

Software & Datasets: http://rpg.ifi.uzh.ch/software_datasets.html

YouTube: https://www.youtube.com/user/ailabRPG/videos

Page 2

My Goal: Flying Robots to the Rescue!

Page 3

First visual-SLAM-based autonomous navigation, based on PTAM. Running online but offboard (using a 20 m long USB cable).

1. Bloesch, ICRA 2010

2. Weiss, JFR’11 https://www.youtube.com/watch?v=51SYZpd1s8I

Page 4

Vision-based autonomous flight in GPS-denied Environments with onboard sensing and computing (based on modified PTAM, running @30Hz on Intel Atom)

Project outcomes: PX4 Autopilot (popularized by 3DR; today used in over half a million drones)

Firefly vision-based hexacopter, by AscTec (acquired by Intel in 2016)

Scaramuzza, Fraundorfer, Pollefeys, Siegwart, Weiss, et al., Vision-Controlled Micro Flying Robots:

from System Design to Autonomous Navigation and Mapping in GPS-denied Environments, RAM’14

https://youtu.be/_-p08o_oTO4

Page 5

Autonomous Flight

Event-based Vision for Low-latency Control

Visual-Inertial State Estimation (~SLAM)

Overview of my Research

Learning-aided Autonomous Navigation

Real-time, Onboard Computer Vision and Control for Autonomous, Agile Drone Flight

http://rpg.ifi.uzh.ch/research_mav.html

http://rpg.ifi.uzh.ch/research_vo.html

http://rpg.ifi.uzh.ch/research_learning.html

http://rpg.ifi.uzh.ch/research_dvs.html

Page 6

Position error: 5 mm, height: 1.5 m – Down-looking camera

[ICRA’10-17, AURO’12, RAM’14, JFR’15, RAL’17]

Automatic recovery from aggressive flight [ICRA’15]

Robustness to dynamic scenes (down-looking camera)

Speed: 4 m/s, height: 3 m – Down-looking camera

https://youtu.be/LssgKdDz5z0

https://youtu.be/fXy4P3nvxHQ

https://youtu.be/pGU1s6Y55JI

Page 7

Davide Scaramuzza – University of Zurich – [email protected]

Autonomous, Live, Dense Reconstruction

REMODE: probabilistic, REgularized, MOnocular DEnse reconstruction in real time [ICRA’14]

State estimation with SVO 2.0

1. Pizzoli et al., REMODE: Probabilistic, Monocular Dense Reconstruction in Real Time, ICRA’14
2. Forster et al., Appearance-based Active, Monocular, Dense Reconstruction for Micro Aerial Vehicles, RSS’14
3. Forster et al., Continuous On-Board Monocular-Vision-based Elevation Mapping Applied ..., ICRA’15
4. Faessler et al., Autonomous, Vision-based Flight and Live Dense 3D Mapping ..., JFR’15

Open Source: https://github.com/uzh-rpg/rpg_open_remode

Running live at 50 Hz on a laptop GPU (HD resolution). Running at 25 Hz onboard (Odroid U3, low resolution).

https://youtu.be/phaBKFwfcJ4 https://youtu.be/7-kPiWaFYAc

Page 8

What’s next?

Page 9

My Dream Robot: Fast, Lightweight, Autonomous!

WARNING! There are 50 drones in this video, but 40 are CGI and 10 are controlled via a Motion Capture System.

LEXUS commercial, 2013 – Created by Kmel, now Qualcomm

Video: https://youtu.be/JDvcBuRSDUU

Page 10

But this is just a vision! How to get there?

Page 11

Open Challenges

Perception algorithms are mature but not robust

• Problems with low texture, HDR scenes, motion blur

• Algorithms and sensors have big latencies (50-200 ms) → need faster sensors

• Need accurate models of the sensors and the environment

• Control & Perception are often considered separately (e.g., perception, state estimation, and planning are treated as separate blocks)

Page 12

Standard Perception-Action Loops: Loosely Coupled

[Block diagram: Perception (Observations → State Estimation, Mapping, Obstacle Detection) → Planning → Low-level Control → Motor Commands (Action) → Physical World (robot & environment) → back to Perception]

Page 13

Tight Coupling of Perception-Action Loops

[Block diagram: the same Observations → State Estimation, Mapping, Obstacle Detection → Planning → Low-level Control → Motor Commands → Physical World loop, but with the Perception and Action blocks tightly coupled]

Examples of tight coupling of perception and action loops:

- favor texture-rich areas [Costante, Arxiv’16]

- minimize speed-to-height ratios (to reduce motion blur)

- focus on points of interest
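The speed-to-height bullet can be made concrete: for a down-looking camera over flat ground, apparent image motion scales with v/h. A minimal sketch, where the focal length and exposure time are assumed illustrative values, not numbers from the slides:

```python
# Hypothetical numbers: apparent image motion for a down-looking camera
# over flat ground scales with the speed-to-height ratio v/h.
def pixel_flow(speed_mps, height_m, focal_px=300.0, exposure_s=0.005):
    """Image-plane displacement during one exposure, in pixels.

    Translational flow of the ground plane is approximately
    focal * (speed / height); multiplying by the exposure time gives
    the motion-blur length in pixels.
    """
    return focal_px * (speed_mps / height_m) * exposure_s

# Flying twice as high at the same speed halves the motion blur:
blur_low  = pixel_flow(4.0, 1.5)   # 4 m/s at 1.5 m
blur_high = pixel_flow(4.0, 3.0)   # 4 m/s at 3.0 m
assert abs(blur_low - 2.0 * blur_high) < 1e-9
```

This is why a perception-aware planner may prefer flying higher (or slower) when feature tracking starts to degrade.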

Page 14

Goal: Bridge the gap between perception and control

● Finds optimal trajectories respecting:

○ system dynamics

○ action objectives (e.g., tracking)

○ perception objectives (e.g., maximize visibility)

○ input limits (e.g., thrust, actuator saturation)

● Capable of replanning heading and trajectory to improve perception

[Diagram: Plan → Execute → Sense loop]

Falanga, Foehn, Peng, Scaramuzza, PAMPC: Perception-Aware Model Predictive Control, IROS 2018.

PAMPC: Perception Aware Model Predictive Control
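As a rough illustration of how such objectives can be combined, the sketch below evaluates a stage-wise cost over a discretized trajectory: a tracking term, a perception term penalizing the point of interest drifting from the image center, and an input penalty. The toy down-looking pinhole model and all weights are assumptions for illustration, not the PAMPC formulation itself:

```python
import numpy as np

def project(p_world, p_cam, f=300.0):
    """Toy pinhole projection: camera at p_cam looking straight down
    (identity rotation, assumed); returns image coordinates in pixels,
    or None if the point is not in front of the camera."""
    d = p_world - p_cam
    if d[2] >= -1e-6:
        return None
    return f * d[:2] / -d[2]

def pampc_cost(traj, inputs, ref, poi, w_track=1.0, w_percep=0.1, w_u=0.01):
    """Stage-wise cost: tracking error + distance of the projected point
    of interest (poi) from the image center + control effort."""
    cost = 0.0
    for x, u, r in zip(traj, inputs, ref):
        cost += w_track * np.sum((x - r) ** 2)   # action objective
        uv = project(poi, x)
        if uv is None:
            cost += w_percep * 1e3               # POI invisible: large penalty
        else:
            cost += w_percep * np.sum(uv ** 2)   # perception objective
        cost += w_u * np.sum(u ** 2)             # input penalty
    return cost
```

An optimizer (in PAMPC, sequential quadratic programming) would minimize such a cost subject to the system dynamics; here hovering directly above the point of interest scores lower than hovering off to the side.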

Page 15

Philipp Foehn – Perception-Aware Model Predictive Control

Manipulates trajectories to increase visibility of points of interest

Controls heading to look towards point of interest

Robustly tracks reference jumps or trajectories

Execution time: 3.5 ms on a single core of a low-power ARM processor (Qualcomm Snapdragon or Odroid). Runs online at 100 Hz.

PAMPC: Perception Aware Model Predictive Control

Falanga, Foehn, Peng, Scaramuzza, PAMPC: Perception-Aware Model Predictive Control, IROS 2018.

Page 16

PAMPC: Methodology

Sequential Quadratic Programming (1 iteration per loop at 100 Hz)

[Diagram: the camera projection defines the perception states and costs of the optimization problem; the optimal trajectory x0, x1, …, x10, xe is planned along the reference]

Falanga, Foehn, Peng, Scaramuzza, PAMPC: Perception-Aware Model Predictive Control, IROS 2018.

Page 17

Open-Source MPC: https://github.com/uzh-rpg/rpg_quadrotor_mpc

PAMPC: Perception Aware Model Predictive Control

Falanga, Foehn, Peng, Scaramuzza, PAMPC: Perception-Aware Model Predictive Control, IROS 2018.

https://www.youtube.com/watch?v=9vaj829vE18


Page 21

Deep-Learning-based Navigation

Page 22

DroNet: Learning to Fly by Driving [RAL’18]

The network infers steering angle and collision probability directly from input images.

The steering angle is learned from the Udacity car dataset, the collision probability from a bicycle dataset.

A shallow 8-layer CNN specifically designed to run at 30 Hz on a CPU (Intel i7, 2 GHz; no GPU).

Code & Datasets: http://rpg.ifi.uzh.ch/dronet.html

Loquercio et al., DroNet: Learning to Fly by Driving, IEEE RAL’18, PDF. Featured on IEEE Spectrum, MIT Technology Review, and Discovery Channel Global.

Video: https://youtu.be/ow7aw9H4BcA
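One plausible way to turn such network outputs into flight commands, shown here as a hedged sketch rather than the paper's exact controller: forward speed is scaled down as the predicted collision probability rises, and both outputs are low-pass filtered for smoothness. The constants `v_max` and `alpha` are assumptions:

```python
class DronetController:
    """Maps per-frame CNN outputs (steering angle, collision probability)
    to smoothed velocity commands. All constants are illustrative."""

    def __init__(self, v_max=1.0, alpha=0.7):
        self.v_max = v_max    # max forward speed in m/s (assumed)
        self.alpha = alpha    # low-pass filter coefficient (assumed)
        self.v = 0.0
        self.steer = 0.0

    def step(self, steering_angle, collision_prob):
        """One control step: steering in [-1, 1], collision prob in [0, 1].
        Returns (forward_speed, steering_command)."""
        v_raw = (1.0 - collision_prob) * self.v_max   # slow down near obstacles
        self.v = self.alpha * self.v + (1 - self.alpha) * v_raw
        self.steer = self.alpha * self.steer + (1 - self.alpha) * steering_angle
        return self.v, self.steer
```

With a certain collision (probability 1.0) the commanded speed decays toward zero; with a clear path it converges toward `v_max`.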

Page 23

Future: Autonomous Drone Racing

Page 24

Challenges in Autonomous Drone Racing

Autonomous agile flight brings up fundamental challenges in robotics research:

coping with unreliable state estimation,

reacting optimally to dynamically changing environments,

and coupling perception and action in real time under severe resource constraints

- Looking for the next gate

- Minimize motion blur (choose paths that minimize speed-to-height ratios)

An easy way to approach the problem is to use SLAM:

Build an initial global map during rehearsal

Generate a globally optimal minimum-time trajectory

Execute the path while localizing against map

But:

- SLAM may fail when localizing against a map that was created in significantly different conditions,

- May fail during periods of high acceleration (because of motion blur and loss of feature tracking)

- Cannot cope with dynamic environments

Page 25

Deep Drone Racing: Main Idea

Combine the perceptual awareness of a convolutional neural network with the speed and accuracy of a state-of-the-art trajectory generation and tracking pipeline.

No global map of the environment is required; only local VIO.

The CNN maps every new image to a robust representation in the form of a waypoint and desired speed.

The planning module then generates a short trajectory segment and corresponding motor commands to reach the desired local goal specified by the waypoint.

All computations run fully onboard.
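To make the planning module concrete, here is a hedged sketch of turning a network-predicted local waypoint and desired speed into a short trajectory segment. A cubic Hermite segment from the current position/velocity to the waypoint is an assumption chosen for simplicity; the actual trajectory generator in the paper may differ:

```python
import numpy as np

def segment(p0, v0, waypoint, desired_speed, duration=1.0, n=20):
    """Cubic Hermite segment from state (p0, v0) to the waypoint, arriving
    with speed `desired_speed` directed toward the waypoint.
    Returns n sampled positions along the segment."""
    p0, v0, wp = map(np.asarray, (p0, v0, waypoint))
    d = wp - p0
    v1 = desired_speed * d / (np.linalg.norm(d) + 1e-9)  # arrival velocity
    out = []
    for t in np.linspace(0.0, 1.0, n):
        # Standard cubic Hermite basis functions
        h00 = 2*t**3 - 3*t**2 + 1
        h10 = t**3 - 2*t**2 + t
        h01 = -2*t**3 + 3*t**2
        h11 = t**3 - t**2
        out.append(h00*p0 + h10*duration*v0 + h01*wp + h11*duration*v1)
    return np.array(out)
```

Replanning every frame with the newest waypoint is what lets the drone react to moving gates without a global map.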

Page 26

Kaufmann, Loquercio, Dosovitskiy, Ranftl, Koltun, Scaramuzza, Deep Drone Racing: Learning Agile Flight in Dynamic Environments, CoRL 2018, Best System Paper Award

Deep Drone Racing

https://youtu.be/8RILnqPxo1s

Page 27

Low-latency, Event-based Vision

Page 28

Latency and Agility are tightly coupled!

Current flight maneuvers achieved with onboard cameras are still too slow compared with those attainable by birds. We need faster sensors and algorithms!

A sparrowhawk catching a garden bird (National Geographic)

https://youtu.be/Ra6I6svXQPg

Page 29

Tasks that need to be done reliably and with low latency:
- Visual odometry (for control)
- Obstacle detection
- Recognition

Standard cameras are not good enough!

What does it take to fly like an eagle?

[Timeline diagram: between one frame and the next, each control command is delayed by a latency made up of the computation time plus the temporal discretization of the frame rate]

Event cameras promise to solve these three problems: latency, motion blur, and low dynamic range!

Page 30

What is an event camera?

Novel sensor that measures only motion in the scene

Low-latency (~ 1 μs)

No motion blur

High dynamic range (140 dB instead of 60 dB)

Well-suited for visual odometry

Traditional vision algorithms cannot be used!

Mini DVS sensor from IniVation.com. Check out their booth in the exhibition hall.
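A minimal sketch of the event-camera data model may help here: each pixel independently fires an asynchronous event (timestamp, x, y, polarity) whenever its log-intensity changes by a contrast threshold, and summing signed polarities over a short window yields an edge-like image. The resolution and the example event stream below are assumptions for illustration:

```python
import numpy as np

W, H = 240, 180   # DVS-like resolution (assumed)

def accumulate(events, t0, t1):
    """Sum signed event polarities with t0 <= t < t1 into an image.
    Each event is a tuple (t_seconds, x, y, polarity_is_positive)."""
    img = np.zeros((H, W))
    for t, x, y, pol in events:
        if t0 <= t < t1:
            img[y, x] += 1 if pol else -1
    return img

# A bright edge sweeping right fires positive events along its path,
# each timestamped with microsecond-order resolution:
events = [(1e-6 * k, 10 + k, 50, True) for k in range(5)]
frame = accumulate(events, 0.0, 1.0)
```

Note what is lost relative to a standard camera: only changes are recorded, so static, textureless regions produce no events at all, which is why traditional frame-based algorithms do not apply directly.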

Page 31

Video with DVS explanation

Camera vs Event Camera

Video: http://youtu.be/LauQ6LWTkxM

Page 32

                       Event Camera            Standard Camera
Update rate            1 MHz (asynchronous)    100 Hz (synchronous)
Dynamic range          High (140 dB)           Low (60 dB)
Motion blur            No                      Yes
Absolute intensity     No                      Yes
Contrast sensitivity   Low                     High
Research history       < 10 years              > 60 years

Our idea: combine them!

Page 33

UltimateSLAM: Visual-inertial SLAM with Events + Frames + IMU

Feature tracking from Events and Frames

Visual-inertial Fusion

Rosinol et al., Ultimate SLAM? Combining Events, Images, and IMU for Robust Visual SLAM in HDR and High Speed Scenarios, IEEE RAL’18, PDF

Page 34

Tracking by Contrast Maximization [CVPR’18]

Directly estimate the motion curves that align the events

Gallego, Rebecq, Scaramuzza, A Unifying Contrast Maximization Framework for Event Cameras, with Applications to Motion, Depth, and Optical Flow Estimation, CVPR’18, Spotlight talk

H(x; θ) = Σ_{k=1..Ne} b_k δ(x − x′_k)

where x′_k is the location of the k-th event warped according to the motion parameters θ, and b_k is its polarity.
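The intuition behind this formula can be sketched in a few lines: warp each event by a candidate motion parameter θ (here a 2D constant optical flow, an assumption made for simplicity), accumulate the warped events into an image H, and score θ by the contrast (variance) of H. The true motion aligns events along edges and maximizes contrast. The synthetic edge below is illustrative:

```python
import numpy as np

def contrast(theta, events, shape=(60, 60)):
    """Variance (contrast) of the image of events warped back to t = 0
    under the candidate constant flow theta = (vx, vy) in px/s."""
    H = np.zeros(shape)
    for x, y, t, b in events:                 # b = +/-1 event polarity
        xw = int(round(x - theta[0] * t))     # warp event back to t = 0
        yw = int(round(y - theta[1] * t))
        if 0 <= xw < shape[1] and 0 <= yw < shape[0]:
            H[yw, xw] += b
    return H.var()

# Events from a vertical edge moving right at 10 px/s: the correct flow
# candidate stacks them into one sharp column and scores highest.
events = [(30 + 10 * t, y, t, 1.0) for y in range(60) for t in (0.0, 0.5, 1.0)]
best = max((contrast((vx, 0.0), events), vx) for vx in (0.0, 5.0, 10.0))[1]
# → best == 10.0
```

A real implementation would optimize θ continuously (e.g., with gradient ascent) rather than scanning a grid, and the same objective extends to rotation, depth, and optical-flow estimation as in the paper.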

Page 35

Tightly coupled fusion. Runs in real time on a smartphone processor.

UltimateSLAM: Events + Frames + IMU

Standard camera Event camera

Video: https://youtu.be/DyJd3a01Zlw

Rosinol et al., Ultimate SLAM? Combining Events, Images, and IMU for Robust Visual SLAM in HDR and High Speed Scenarios, IEEE RAL’18, PDF

Page 36

HDR sequence / High-speed sequence

85% accuracy gain over frame-based visual-inertial odometry

Rosinol et al., Ultimate SLAM? Combining Events, Images, and IMU for Robust Visual SLAM in HDR and High Speed Scenarios, IEEE RAL’18, PDF

Page 37

UltimateSLAM: Events + Frames + IMU

Tightly coupled fusion. Runs in real time on a smartphone processor.

Video: https://youtu.be/DN6PaV_kht0

Rosinol et al., Ultimate SLAM? Combining Events, Images, and IMU for Robust Visual SLAM in HDR and High Speed Scenarios, IEEE RAL’18, PDF

Page 38

Page 39

Low-latency Obstacle Avoidance (ongoing work)

In collaboration with Insightness, a company that makes event cameras and collision-avoidance systems for drones.

Video: https://youtu.be/6aGx-zBSzRA

Page 40

Conclusions

Agile flight (like birds) is still far away (10 years?)

Perception and control need to be considered jointly!

Machine Learning can exploit context & provide robustness and invariance to nuisances

• Naturally incorporates perception awareness

• Should be combined with traditional model based approaches

Event cameras are revolutionary:

• They provide robustness to high-speed motion and high-dynamic-range scenes

• They allow low-latency control (ongoing work)

• They are intellectually challenging: standard cameras have been studied for 50 years! → a change is needed!

Page 41

Thanks!

Code, datasets, publications, videos: http://rpg.ifi.uzh.ch

My lab homepage: http://rpg.ifi.uzh.ch/

Publications: http://rpg.ifi.uzh.ch/publications.html

Software & Datasets: http://rpg.ifi.uzh.ch/software_datasets.html

YouTube: https://www.youtube.com/user/ailabRPG/videos