Sensor fusion for ADAS and autonomous vehicles


Page 1: Sensor fusion for ADAS and autonomous vehicles

From Technologies to Markets © 2021

Sensor fusion for ADAS and autonomous vehicles

February 10th, 2021

Page 2: Sensor fusion for ADAS and autonomous vehicles


AGENDA

1. Sensors in ADAS vehicles

2. Status of radars, cameras and LiDARs

3. The transformation of E/E architecture

4. Sensor fusion

5. Market forecasts for ADAS vehicle sensors

Page 3: Sensor fusion for ADAS and autonomous vehicles


SENSORS IN ADAS VEHICLES

Volume and penetration rate of AEB systems


Current AEB systems are based on a camera and a radar. Their penetration rate keeps increasing: from 18% in 2017 to 38% in 2019.

AEB systems are based on: a forward camera and a long-range radar.

Page 4: Sensor fusion for ADAS and autonomous vehicles


SENSORS IN ADAS VEHICLES

AEB still has room for improvement

• In October 2019, the American Automobile Association (AAA) conducted a series of tests using vehicles with AEB and pedestrian detection alerts on a closed course with dummy pedestrians.

• Several tests were made at different speeds, mixing adult and child 'pedestrians', during daylight and night conditions.

• Tests were performed on the Chevy Malibu, Honda Accord, Tesla Model 3 and Toyota Camry.


Cars with high-tech safety systems are still really bad at not running people over.

• 60% of the time, the vehicle struck the adult crossing the road during daylight at 20 mph.

• 89% of the time, the vehicle struck the child crossing the road during daylight at 20 mph.

• When encountering an adult pedestrian at night, these systems were ineffective.

• 100% of the time, the vehicle struck the adult after a right turn at 15 mph.

A dummy pedestrian struck by a vehicle during the test.

Source: AAA

New sensors like LiDAR or thermal cameras will need to be implemented.

Page 5: Sensor fusion for ADAS and autonomous vehicles


RADAR TRENDS

The road to high resolution


Leveraging military and telecom technologies, advanced radar technology fully supports up to level-4 autonomy, with high resolution and radar imaging reaching the market with low C-SWAP.

[Chart: the road to high-resolution radar, from 2000 to beyond 2021. Axes: channel number / radar aperture vs. resolution, classification, and object-tracking capability.]

• 2D ADAS basic (~2000): X, Y, Doppler

• 2D ADAS improved (~2010): X, Y, Doppler

• 3D (2019): X, Y, Z, Doppler (e.g., LRR 4th gen)

• 4D HR, high resolution (from 2021): X, Y, Z, Doppler, depth (e.g., ARS 5th gen, WARLORD™)

• 4D UHR, ultra-high resolution: X, Y, Z, Doppler, depth (e.g., Icon Ultres/Phoenix)

Enabling technologies: digital beam-forming and MIMO, analog beam-forming and metamaterials, and AI/deep learning for radar imaging.

C-SWAP: cost, size, weight, and power

Yole Développement © April 2020

Page 6: Sensor fusion for ADAS and autonomous vehicles


THE PROLIFERATION OF CAMERA MODULES

The ‘Ten-plus cameras per car’ roadmap

The market is driven by camera proliferation.

[Chart: number of cameras per car, 2012 to 2024e (Yole Développement © April 2020). The count rises from 1 camera in 2012 to 11-12 cameras by 2024e.]

• Standard car with forward and rear cameras.

• Surround-view is the new must-have feature.

• Multiple forward ADAS cameras, night vision, e-mirror replacement, and driver monitoring are next to be introduced.

• Increasing demand for ADAS will bring more cameras per car.

Camera types: forward ADAS, surround-view, e-Mirror, driver monitoring, night vision, rear view.

Page 7: Sensor fusion for ADAS and autonomous vehicles


THE EMERGENCE OF LIDAR

Adoption of LiDAR in ADAS vehicles

LiDAR is penetrating the ADAS market in the F-segment (full-size luxury cars). As the price decreases, it will be implemented in lower segments.

Year | Vehicles equipped with LiDAR | Share of total vehicles | LiDAR ASP (LR) | LiDAR ASP (SR/MR)
2020 | 70k | 0.1% | $570 | N/A
2025 | 2.4M | 2.3% | $500 | $122
2032 | 11M | 11% | $380 | $99

(LR: long range; SR/MR: short/mid range; ASP: average selling price)


Page 8: Sensor fusion for ADAS and autonomous vehicles


THE EMERGENCE OF LIDAR

ADAS LiDAR roadmap

2017: Only available in the Audi A8 and A7. Valeo is the only tier-1 with automotive grade at this time. The typical use case is traffic-jam assistance.

2020: LiDAR is introduced by more car manufacturers and now includes short/mid-range LiDAR along with long-range LiDAR. New LiDAR technologies include MEMS, fiber laser, and SPAD arrays for flash LiDAR.

2025: New technologies are introduced in LiDAR, including SiPM and FMCW.

2030+: In this full-automation configuration, the vehicle can perform all driving functions under all conditions. The driver may have the option to control the vehicle.

FMCW: frequency-modulated continuous wave; SiPM: silicon photomultiplier; SPAD: single-photon avalanche diode

[Chart legend: number of LiDARs embedded per vehicle, increasing from 1 in 2017.]

Page 9: Sensor fusion for ADAS and autonomous vehicles


THE ADDITION OF SENSORS

Impact of adding sensors in ADAS vehicles


The multiplication of sensors in vehicles directly impacts the complexity of the electronic (E/E) architecture.

[Diagram: a growing sensor set (ultrasound sensors, cameras, long-range radar, short-range radars, LiDAR) leads to:]

• More ECUs required

• More computing power

• More software required

• A more complex E/E architecture

Page 10: Sensor fusion for ADAS and autonomous vehicles


THE TRANSFORMATION OF E/E ARCHITECTURE

E/E architecture evolution - Roadmap

• Today, OEMs are still using a distributed E/E architecture with roughly one ECU per function.


Development from a distributed architecture to a centralized architecture.

2020: Distributed architecture. >100 ECUs; 50-100M lines of code. The 'one function for one ECU' principle will evolve toward the merging of some ECUs, limited to a few functions.

2025: Domain centralization. 30-60 ECUs; 200-300M lines of code. Creation of specific domain controllers: infotainment, comfort, powertrain, sensing, safety.

2030-2035: Vehicle centralization. 20-45 ECUs; >300M lines of code. All domain controllers centralized into one super-computer.

The amount of software increases throughout.

Yole Développement © April 2020

Page 11: Sensor fusion for ADAS and autonomous vehicles


THE TRANSFORMATION OF E/E ARCHITECTURE

zFAS module from Audi – Teardown from S+C


This module has been developed to integrate data coming from radars, cameras, LiDARs and ultrasonic sensors.

Aptiv is the supplier of this module.

Nvidia Tegra K1 SoC
• Image processing for parking
• Processes four camera feeds
• Driver monitoring

Mobileye EyeQ3
• Front-camera image processing
• AEB for cars and pedestrians

Cyclone V SoC
• Responsible for sensor fusion
• Pre-processing of ultrasonic sensors
• Internal gateway

32-bit Aurix MCU
• Hosts different functions
• ASIL D compliant
• Interface to the car architecture

Courtesy of System Plus Consulting

Page 12: Sensor fusion for ADAS and autonomous vehicles


E/E ARCHITECTURE AND COMPUTING

Data fusion for automated driving

2020 (distributed or domain-centralized E/E architecture): functions run separately, each with its own sensors: Lane Keeping Assist, ACC with Stop & Go, Blind Spot Monitoring, Parking Assist, High Beam Assist, Traffic Sign Recognition.

2030 (domain- or vehicle-centralized E/E architecture): data fusion across sensor types (ADAS camera, viewing camera, thermal camera, LiDAR, radar, ultrasonic sensor) supports the same functions plus AEB, Parking Valet, Traffic Jam Pilot, Highway Pilot, and more functions.

Page 13: Sensor fusion for ADAS and autonomous vehicles


SENSOR FUSION

What does a car need to see?


Objects, lanes, and what can or cannot be crossed.

[Illustration: scene elements to perceive: lanes impossible to cross, crossable lanes, the drivable area, the lane of the vehicle, vehicles, cyclists, and pedestrians.]
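As an illustration (class names invented, not from the source), these scene elements map naturally onto a small set of semantic classes for a perception stack:

```python
from enum import IntEnum, auto

class SceneClass(IntEnum):
    """Semantic classes a car must distinguish: per-pixel for
    segmentation, per-object for detection and tracking."""
    DRIVABLE_AREA = auto()
    EGO_LANE = auto()               # the lane of the vehicle
    CROSSABLE_LANE = auto()         # e.g., dashed marking
    UNCROSSABLE_LANE = auto()       # e.g., solid marking, barrier
    VEHICLE = auto()
    CYCLIST = auto()
    PEDESTRIAN = auto()

print([c.name for c in SceneClass])
```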

Page 14: Sensor fusion for ADAS and autonomous vehicles


SENSOR FUSION

Computing unit – ADAS system overview


As more and more sensors are used for ADAS, computing will become critical. Data processing and computing power will become differentiating parameters.

[Diagram: Sense → Compute → Actuate. Compute: graphics processor, sensor-fusion µC, safety domain controller, powertrain domain controller. Actuate: transmission, engine, braking, steering. Yole Développement © April 2020.]
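A toy sketch of the sense-compute-actuate cycle described above; all function names are placeholders, and a real stack would split these stages across domain controllers:

```python
from dataclasses import dataclass

@dataclass
class Command:
    steering: float  # rad
    braking: float   # 0..1

def control_step(sensor_frames, fuse, plan, actuate):
    """One sense -> compute -> actuate cycle (names are placeholders)."""
    environment = fuse(sensor_frames)   # Compute: sensor fusion
    command = plan(environment)         # Compute: decision/planning
    actuate(command)                    # Actuate: braking, steering, ...

# Toy usage: brake if any fused detection is closer than 10 m.
fuse = lambda frames: min(d for frame in frames for d in frame)  # nearest object, m
plan = lambda nearest: Command(steering=0.0, braking=1.0 if nearest < 10 else 0.0)
actuate = lambda cmd: print(f"steer={cmd.steering:.2f} brake={cmd.braking:.1f}")

control_step([[35.0, 8.2], [12.5]], fuse, plan, actuate)  # -> steer=0.00 brake=1.0
```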

Page 15: Sensor fusion for ADAS and autonomous vehicles


SENSOR FUSION

The parallax issue


Calibration will be crucial to combine different perspectives.

When sensors are positioned apart from each other, they will have different perspectives on the vehicle's environment (parallax), which may lead to a situation in which one sensor can perceive an object while it is occluded for the other sensor.

Courtesy of Outsight
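A minimal numpy sketch of what calibration provides (values illustrative): a rigid-body transform that maps points from one sensor's frame into another's, so detections can be compared in a common frame despite parallax.

```python
import numpy as np

# Extrinsics from calibration: rotation R and translation t mapping points
# from the LiDAR frame into the camera frame (illustrative values only).
R = np.eye(3)                    # sensors mounted with parallel axes
t = np.array([0.0, -0.3, 1.2])   # offset between mounting points, m

def lidar_to_camera(p_lidar: np.ndarray) -> np.ndarray:
    """Apply the rigid-body transform: p_cam = R @ p_lidar + t."""
    return R @ p_lidar + t

# A point 20 m ahead of the LiDAR appears shifted in the camera frame:
print(lidar_to_camera(np.array([20.0, 0.0, 0.0])))  # -> [20.  -0.3  1.2]
```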

Page 16: Sensor fusion for ADAS and autonomous vehicles


SENSOR FUSION

Dream vs reality


Gathering all the sensors in one place is not possible in ADAS vehicles.

The ideal case: all the sensors are aligned and close to one another. The parallax issue is limited, and data fusion is easier.

The reality: not enough space. Most of the time, it is difficult to integrate more than two sensors in one place due to several constraints.

Source: Paul-Henri Matha, DVN LiDAR conference, 2019. Courtesy of System Plus Consulting.

Page 17: Sensor fusion for ADAS and autonomous vehicles


SENSOR FUSION

The expected solution from Volvo


Main sensors for automated driving will be perfectly aligned.

[Illustrations (for illustration purposes only): the long-range LiDAR (Iris LiDAR from Luminar), forward ADAS camera, and long-range radar are aligned.]

Page 18: Sensor fusion for ADAS and autonomous vehicles


SENSOR FUSION

Data fusion – two possibilities

Late fusion (traditional OEMs): the LiDAR, ADAS camera, and radar each produce an object list, and the fusion platform / domain controller fuses those object lists.

Early fusion (new OEMs): the LiDAR, ADAS camera, and radar send raw data to the fusion platform / domain controller, which fuses the raw data directly.
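A schematic sketch of the two dataflows (all names and thresholds invented): late fusion associates per-sensor object lists after detection, while early fusion merges raw measurements before any detection runs.

```python
import numpy as np

def late_fusion(camera_objects, radar_objects, gate=2.0):
    """Late fusion: each sensor already produced an object list of (x, y);
    associate detections within a gating distance and average them."""
    fused = []
    for c in camera_objects:
        for r in radar_objects:
            if np.hypot(c[0] - r[0], c[1] - r[1]) < gate:
                fused.append(((c[0] + r[0]) / 2, (c[1] + r[1]) / 2))
    return fused

def early_fusion(lidar_points, radar_points):
    """Early fusion: merge raw point measurements into one cloud first;
    a single detector then runs on the combined data."""
    return np.vstack([lidar_points, radar_points])

print(late_fusion([(10.0, 0.5)], [(10.4, 0.2)]))              # one fused object
print(early_fusion(np.zeros((3, 2)), np.ones((2, 2))).shape)  # -> (5, 2)
```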

Page 19: Sensor fusion for ADAS and autonomous vehicles


SENSOR FUSION

From late fusion to early fusion

From partial automation to full automation

Level 2 (late fusion): the driver is assisted in the driving task by the system. The driver is not allowed to do secondary tasks and keeps focusing on the road.

Levels 3-4 (early fusion or both methods): in defined use cases, the driver can transfer the driving task to the system. Side activities can be permitted. The driver has to take over within a specified time (level 3) or when he wants to leave the domain (level 4). These are intermediate levels, as OEMs decide under which conditions automated driving will be enabled.

Level 5 (early fusion or both methods): full automation.

Page 20: Sensor fusion for ADAS and autonomous vehicles


SENSOR FUSION

Mobileye roadmap to 2025

• At the last CES, Mobileye showed its vision toward 2025 and the launch of consumer autonomous vehicles.

• Mobileye is advocating "internal redundancy" on a single sensor modality, instead of promoting "low-level redundancy" between different sensor modalities.

• A year ago, at CES, Mobileye disclosed its unique approach to internal redundancy by running different algorithms on camera-only systems.

• This year, Mobileye is unveiling plans for a separate subsystem consisting of a front-facing LiDAR plus a cocoon of radars performing at the same level of safety accomplished by the camera-only system.

• By fusing the two subsystems (each independently developed and validated), a robotaxi should be able to achieve "true redundancy," claimed Mobileye.

• For phase-2 sensor fusion in robocars, which starts in 2025, Mobileye will be ready with a home-grown LiDAR, using frequency-modulated continuous-wave (FMCW) technology.

RSS: responsibility-sensitive safety

[Timeline: from late fusion to early fusion.]

Page 21: Sensor fusion for ADAS and autonomous vehicles


MARKET FORECASTS

Overview of sensors and computing market revenue

A fast-growing market, at a 21% CAGR over 2020-2025: from $8.6B in 2020 to $22.4B in 2025.

• LiDAR: $0.04B → $1.7B (CAGR: 113%)

• Radar: $1.3B → $3.5B (CAGR: 22%)

• Computing ADAS: $3.5B → $8.1B (CAGR: 18%)

• Camera module: $3.8B → $9.1B (CAGR: 19%)
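As a sanity check, the headline figure follows from the compound-annual-growth-rate formula applied to the totals above:

```python
cagr = (22.4 / 8.6) ** (1 / 5) - 1   # five-year CAGR from 2020 to 2025 totals
print(f"{cagr:.0%}")                 # -> 21%
```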


Page 22: Sensor fusion for ADAS and autonomous vehicles

From Technologies to Markets © 2021

Thank you for your attention!

[email protected]