"Sensing Technologies for the Autonomous Vehicle," a presentation from NXP Semiconductors


Uploaded by embedded-vision-alliance, 13-Jan-2017


TRANSCRIPT

Copyright © 2016 NXP Semiconductors

Tom Wilson

May 3, 2016

Sensing Technologies for the Autonomous Vehicle


ADAS: From Safety to Autonomous

Driver Assistance → Partial Automation → Semi-Autonomous → Fully Autonomous → Eyes-Off

ADAS: >$11B in 2016, growing to >$130B in 2026, CAGR ~29% (ABI Research, 2016)
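The quoted growth figures can be sanity-checked with one line of arithmetic (a rough check; the exact endpoints behind ABI Research's ~29% figure are not given):

```python
# Compound annual growth rate implied by $11B (2016) -> $130B (2026):
# CAGR = (end / start) ** (1 / years) - 1
cagr = (130 / 11) ** (1 / 10) - 1
print(f"{cagr:.1%}")  # ≈ 28.0%, consistent with the ~29% quoted
```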

How?

“All our knowledge begins with the senses” (Immanuel Kant)


Sensing Technology Comparison

                          Camera        Radar      LiDAR     Autonomous Requirement
Object Detection          M             H          H         H
Classification            H             M          -         H
Density of Raw Data       H             M          L         H
Velocity Measurement      -             H          -         H
Lane Detection            H             -          -         H
Traffic Sign Recognition  H             -          -         H
Range of Sensor           M (150m)      H (250m)   M (100m)  Full range
Rain, Fog, Snow           L             H          L         H
Night                     -             H          H         H
Sensor Size               Small to Med  Small      Med       Mix
Cost                      H (ADAS)      L          H         Mix

Rating: H = High, M = Medium, L = Low


Car’s Eye View: Vision

[Figures: optical flow between frames at t=0s and t=0.1s for motion estimation; Histogram of Oriented Gradients for pedestrian detection]
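The Histogram of Oriented Gradients feature mentioned above can be illustrated with a minimal, pure-Python orientation histogram for a single cell (a simplified sketch; production pedestrian detectors add cell/block grouping, block normalization, and a trained classifier such as a linear SVM):

```python
import math

def hog_cell_histogram(patch, bins=9):
    """Unsigned-gradient orientation histogram for one HOG cell.

    patch: 2D list of grayscale intensities. Gradients are taken with
    simple central differences; each interior pixel votes its gradient
    magnitude into an orientation bin over [0, 180) degrees.
    """
    h, w = len(patch), len(patch[0])
    hist = [0.0] * bins
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = patch[y][x + 1] - patch[y][x - 1]   # horizontal gradient
            gy = patch[y + 1][x] - patch[y - 1][x]   # vertical gradient
            mag = math.hypot(gx, gy)
            ang = math.degrees(math.atan2(gy, gx)) % 180.0  # unsigned orientation
            hist[int(ang // (180.0 / bins)) % bins] += mag
    return hist

# A patch whose intensity changes only left-to-right has purely horizontal
# gradients (orientation 0°), so all the energy lands in the first bin.
patch = [[0, 0, 10, 10, 10] for _ in range(5)]
print(hog_cell_histogram(patch))  # [60.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
```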


Car’s Eye View: Radar

[Figures: radar returns for a pedestrian moving radially (towards or away) vs. moving laterally]

Velocity is a radar “feature” for motion estimation

Doppler is also used for classification
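The link between the Doppler shift and the velocity "feature" can be sketched in a few lines (an illustration; the 77 GHz carrier and the pedestrian numbers are assumptions, not from the slides):

```python
C = 299_792_458.0  # speed of light, m/s

def radial_velocity(doppler_shift_hz, carrier_hz=77e9):
    """Radial velocity from a radar Doppler shift: v = f_d * c / (2 * f_c).

    The factor of 2 accounts for the two-way path (out and back).
    77 GHz is the common automotive radar band.
    """
    return doppler_shift_hz * C / (2.0 * carrier_hz)

# A pedestrian closing radially at ~1.5 m/s shifts the return by roughly
# 770 Hz at 77 GHz; a purely lateral walker has near-zero radial velocity
# and hence near-zero Doppler shift, matching the figure above.
print(f"{radial_velocity(770.0):.2f} m/s")
```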


Car’s Eye View: LiDAR

360° Scanning LiDAR (image courtesy Velodyne)

Fixed-Beam LiDAR (image courtesy Leddartech)

Scan compared to a map to subtract stationary objects

Simple detection and ranging; no classification
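The map-subtraction step above can be sketched as a per-beam range comparison (a hypothetical simplification; real systems register full 3-D point clouds against the map, and the beam layout and tolerance here are invented):

```python
def moving_points(scan, background, tol=0.5):
    """Flag scan returns that differ from the stored background map.

    scan, background: range in metres per beam angle, same length.
    A return much shorter than the mapped background means something
    new (potentially moving) entered the scene; matching returns are
    treated as stationary and subtracted.
    """
    return [
        (i, r)
        for i, (r, b) in enumerate(zip(scan, background))
        if b - r > tol  # closer than the known static scene
    ]

background = [20.0, 20.0, 20.0, 20.0]   # mapped static scene
scan       = [20.0, 12.5, 12.4, 20.0]   # new object at ~12.5 m on beams 1-2
print(moving_points(scan, background))  # [(1, 12.5), (2, 12.4)]
```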


Proliferation of Sensors

[Chart: sensor count (Vision, Radar, LiDAR) grows through the Assist, Co-Pilot, and Automated stages]


Car’s Eye View: Vision

Forward-Facing Multi-Function ADAS Camera: LDW, TSR, Pedestrian Detection, FCW, IHC

Surround-View Cameras (180° FOV): Object Detection, Classification

Rear-View Camera: Scene View, Object Detection


Car’s Eye View: Radar

Long-Range/Mid-Range, Forward-Facing: AEB, ACC, FCW

Mid-Range/Short-Range: Multi-Mode “Corner Radar”

Long-Range/Mid-Range, Rear: Rear Collision Avoidance (complementing AEB)


Car’s Eye View: LiDAR (Fixed Beam)

Mid-Range

AEB, FCW

Cross traffic

Blind spot

Rear collision avoidance


The Full Sensor Suite for Autonomous


Game of “King of the Hill”

[Chart: sensing technologies competing on axes of Detection Capability vs. Cost, with Market Acceptance at the top of the hill; contenders span low cost/low detection, medium cost/medium detection, and high cost/high detection, with low cost/high detection at the summit]


Game of “King of the Hill”

[Chart: the same Detection Capability vs. Cost hill; high-detection entrants climb by decreasing cost, low-cost entrants by increasing detection]

Market winners move “up the hill”; market losers move “down the hill”.


Climbing the Autonomous Vehicle Hill

[Chart: today's sensors placed on the Detection Capability vs. Cost hill: Vision (FF-DAS Multi-Function), Scanning LiDAR, Radar, Vision (Park-Assist), and Fixed-Beam LiDAR]


Sensor Network for Fusion

[Diagram: Fusion ECU networked to the FF ADAS camera, LiDAR, surround-view cameras, and side-facing LiDAR]

Design challenge: partitioning of processing and interconnect selection
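To see why partitioning and interconnect selection matter, here is a back-of-the-envelope bandwidth comparison between shipping raw pixels and shipping a processed object list (all figures are illustrative assumptions, not from the presentation):

```python
def mbit_per_s(bits_per_item, items_per_frame, fps):
    """Link bandwidth in Mbit/s for a periodic sensor stream."""
    return bits_per_item * items_per_frame * fps / 1e6

# Raw surround-view camera: 1280x800 @ 30 fps, 16 bits/pixel (illustrative).
raw_camera = mbit_per_s(16, 1280 * 800, 30)

# Same camera pre-processed on-sensor to an object list: say 32 tracked
# objects of ~64 bytes each (position, size, class, confidence), 30 Hz.
object_list = mbit_per_s(64 * 8, 32, 30)

print(f"raw video:   {raw_camera:8.2f} Mbit/s")   # ~491.52 Mbit/s
print(f"object list: {object_list:8.2f} Mbit/s")  # ~0.49 Mbit/s
```

The three-orders-of-magnitude gap is what drives the choice between cheap low-rate links for pre-processed sensors and high-rate interconnects for raw-data sensors.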


Fusion Processing and Partitioning

• Each level assesses associations from the prior level
• Bandwidth from each level to the next depends on sensor type
• Partitioning of processing will vary by sensor

Level 0: Feature Assessment (measurements → signals/features)
Level 1: Object Assessment (signals/features → objects)
Level 2: Situation Assessment (objects → situations)
Level 3: Impact Assessment (situations → plans)
Level 4: Process Refinement (situations/plans → resources)


System Partitioning Example

• ADAS cameras may process objects or even situations
• Surround-view cameras typically send raw data
• Radar and LiDAR typically process to features (Level 0); radar extends to objects (Level 1)
• Ideally all levels are processed in the fusion ECU, though this is not always possible

[Diagram: ADAS camera, surround-view camera, radar, and LiDAR feeding a Fusion ECU, which runs the remaining levels through Levels 3 & 4]
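The partitioning described above can be sketched as a small table of which fusion levels remain for the ECU per sensor (a hypothetical model: the level assignments follow the bullets, while `ecu_levels` and the level labels are invented for illustration):

```python
# JDL-style fusion levels, abbreviated.
LEVELS = ["L0: features", "L1: objects", "L2: situations",
          "L3: impact", "L4: refinement"]

def ecu_levels(top_on_sensor):
    """Fusion levels left to the ECU, given the sensor's top on-sensor
    level (-1 means the sensor ships raw data and does no fusion level)."""
    return LEVELS[top_on_sensor + 1:]

sensors = {
    "ADAS camera":          1,   # ships objects (sometimes situations)
    "surround-view camera": -1,  # raw pixels only
    "radar":                1,   # features, often extended to objects
    "LiDAR":                0,   # typically features only
}

for name, top in sensors.items():
    print(f"{name:20s} -> ECU runs {', '.join(ecu_levels(top))}")
```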


Summary

• No single sensing technology will provide complete information coverage
• Cameras, radar, LiDAR (and ultrasound!) will all be utilized; fusion processing elaborates knowledge from the full range of sensing data
• Key challenge: how to partition fusion processing within the constraints of
  • Bandwidth / processing capability
  • Getting the most reliable impact assessments
• NXP enables autonomous driving with front-end sensors; radar, vision, and LiDAR processing; interconnect technology; fusion processing; and vehicle (V2X) comms