Automotive Cybersecurity Challenges for Automated Vehicles: Jonathan Petit


TRANSCRIPT

Page 1: Automotive Cybersecurity Challenges for Automated Vehicles: Jonathan Petit

CYBERSECURITY CHALLENGES FOR AUTOMATED VEHICLES

Jonathan Petit

[email protected]

Page 2: Automotive Cybersecurity Challenges for Automated Vehicles: Jonathan Petit

J. Petit - Automated Vehicles Symposium - July 20, 2016

Technical Challenges

Governance Challenges

Page 3: Automotive Cybersecurity Challenges for Automated Vehicles: Jonathan Petit


Technical Challenges

Scalable electronics driving autonomous vehicle technologies, 5 April 2014 (excerpt):

…vehicle-to-infrastructure (V2I) communications. On-board maps and associated cloud-based systems offer additional inputs via cellular communications.

The outputs from all the sensor blocks are used to produce a 3D map of the environment around the vehicle. The map includes curbs and lane markers, vehicles, pedestrians, street signs and traffic lights, the car's position in a larger map of the area, and other items that must be recognized for safe driving. This information is used by an "action engine," which serves as the decision maker for the entire system. The action engine determines what the car needs to do and sends activation signals to the lock-step, dual-core MCUs controlling the car's mechanical functions, and messages to the driver. Other inputs come from sensors within the car that monitor the state of the driver, in case there is a need for an emergency override of the rest of the system.

Finally, it is important to inform the driver visually about what the car "understands" of its environment. Displays that help the driver visualize the car and its environment can warn about road conditions and play a role in gaining acceptance of new technology. For instance, when drivers can see a 3D map that the vehicle uses for its operations, they will become more confident about the vehicle's automated control, and begin to rely on it.

Algorithms and system scaling

With its heavy reliance on cameras, radar, lidar and other sensors, autonomous vehicle control requires a great deal of high-performance processing, which by nature is heterogeneous. Low-level sensor processing, which handles massive amounts of input data, tends to use relatively simple repetitive algorithms that operate in parallel. High-level…

[Figure: cameras, radars, 3D scanning lidars, and ultrasound sensors each feed a sensor-processing block (raw data → object parameters: time stamp, dimensions, position/velocity). Sensor fusion combines these with V2V/V2I communication, GPS/IMS, a-priori "maps" data, and driver state into the 3D map consumed by the action engine (actions: do nothing, warn, complement, control), which drives the vehicle controls (brake/acceleration, steering, etc.) and a visualization/display sub-system (compressed data). Pipeline: Sense → Understand → Act.]

Figure 1. A functional view of the data flow in an autonomous car's sensing and control system.

Credit: F. Mujica, "Scalable electronics driving autonomous vehicle technologies," technical report, Autonomous Vehicles R&D, Kilby Labs, Texas Instruments, 2014.
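The Sense → Understand → Act flow of Figure 1 can be sketched in a few lines. Every class, field name, and threshold below is illustrative, not taken from the TI report:

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    # "Object parameters" from Figure 1: time stamp, dimensions, position/velocity
    timestamp: float
    dimensions: tuple     # (length, width, height) in meters
    position: tuple       # (x, y) in a vehicle-centric frame, x = ahead
    velocity: tuple       # (vx, vy) relative to the ego vehicle
    source: str           # "camera", "radar", "lidar", or "ultrasound"

def sensor_fusion(detections, v2x_msgs, a_priori_map):
    """Understand: merge per-sensor object lists into one world model (3D map)."""
    return {"objects": list(detections), "map": a_priori_map, "v2x": v2x_msgs}

def action_engine(world, driver_state):
    """Act: choose one of Figure 1's actions (do nothing / warn / control)."""
    if driver_state == "override":
        return "do nothing"                      # driver has taken over
    closing = [o for o in world["objects"]
               if o.position[0] < 10 and o.velocity[0] <= 0]
    if not closing:
        return "do nothing"
    return "warn" if min(o.position[0] for o in closing) > 5 else "control"

# Sense: one camera detection 4 m ahead and closing at 2 m/s
obj = DetectedObject(0.0, (4.5, 1.8, 1.5), (4.0, 0.0), (-2.0, 0.0), "camera")
print(action_engine(sensor_fusion([obj], [], {}), "attentive"))   # control
```

Each stage is a separate function on purpose: the security challenges on the following slides attach to specific stages of exactly this pipeline.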

Page 4: Automotive Cybersecurity Challenges for Automated Vehicles: Jonathan Petit

Technical Challenges

[Same functional diagram as Page 3.]

On-Board Unit

Data storage

Page 5: Automotive Cybersecurity Challenges for Automated Vehicles: Jonathan Petit

Technical Challenges

[Same functional diagram as Page 3, grouped under a "Vehicle" boundary.]

Cloud services

On-Board Unit

Data storage

Page 6: Automotive Cybersecurity Challenges for Automated Vehicles: Jonathan Petit

Technical Challenges

[Same functional diagram as Page 3, grouped under a "Vehicle" boundary.]

Cloud services

On-Board Unit

Data storage

Infrastructure

Page 7: Automotive Cybersecurity Challenges for Automated Vehicles: Jonathan Petit

Technical Challenges

[Same functional diagram as Page 3, grouped under a "Vehicle" boundary.]

Cloud services

On-Board Unit

Data storage

Infrastructure

Constraints
• Cost (money)
• Computation overhead
• Space
• Energy
• Communication overhead

Page 8: Automotive Cybersecurity Challenges for Automated Vehicles: Jonathan Petit

Technical Challenges

[Same functional diagram as Page 3, grouped under a "Vehicle" boundary.]

Cloud services

On-Board Unit

Data storage

Infrastructure

Challenges
• HW security
• SW security
• OS security
• Network security
• Privacy

Constraints
• Cost (money)
• Computation overhead
• Space
• Energy
• Communication overhead

Page 9: Automotive Cybersecurity Challenges for Automated Vehicles: Jonathan Petit

Technical Challenges

[Same functional diagram as Page 3.]

Cloud services

Data storage

Challenges
• Pen-testing: assess security level / "score"
• Protection: learn from military applications
• Software/firmware OTA update is critical
• Can the sensors play an active role in the defense system?
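OTA updating is also a prime attack path, so images are typically accepted only after checking a signed manifest. A minimal integrity-check sketch follows; it does digest and size checks only, a real updater would additionally verify an asymmetric vendor signature over the manifest (omitted here), and the payload bytes are made up:

```python
import hashlib

def verify_firmware(image: bytes, manifest: dict) -> bool:
    """Accept an OTA image only if it matches the manifest.

    `manifest` stands in for vendor-signed update metadata; verifying
    the vendor's signature over it is assumed to have happened already.
    """
    digest = hashlib.sha256(image).hexdigest()
    return digest == manifest["sha256"] and len(image) == manifest["size"]

image = b"\x7fELF...firmware-v2.3"          # placeholder firmware payload
manifest = {"sha256": hashlib.sha256(image).hexdigest(), "size": len(image)}
print(verify_firmware(image, manifest))              # True
print(verify_firmware(image + b"\x00", manifest))    # False: tampered image
```

The point of the manifest split is that the large image can be streamed and hashed incrementally while only the small metadata blob needs a signature check.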

Page 10: Automotive Cybersecurity Challenges for Automated Vehicles: Jonathan Petit

Technical Challenges

[Same functional diagram as Page 3.]

Cloud services

Data storage

Sensor processing
• Processing: filtering, normalization
• Sometimes integrated into the sensor itself; otherwise data is sent on the bus

Challenges
• HW-based security, host-based security with minimal overhead
• Side-channel protection
• Middleware security
• Sensor authentication
• In-vehicle network security
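Sensor authentication and in-vehicle network security run into CAN's core limitation: frames carry no sender authentication and only 8 data bytes. One common shape of mitigation (AUTOSAR SecOC works roughly this way) is a truncated MAC plus a freshness counter squeezed into the payload. Key length, counter width, and MAC truncation below are illustrative choices, not a standard-conformant layout:

```python
import hmac, hashlib, struct

KEY = b"\x01" * 16          # pre-shared per-ECU key (provisioning not shown)
MAC_LEN = 4                 # truncated MAC: CAN payloads are only 8 bytes

def protect(can_id: int, data: bytes, counter: int) -> bytes:
    """Append a freshness counter and truncated MAC to a CAN payload."""
    msg = struct.pack(">IB", can_id, counter) + data
    tag = hmac.new(KEY, msg, hashlib.sha256).digest()[:MAC_LEN]
    return data + bytes([counter]) + tag

def verify(can_id: int, frame: bytes, last_counter: int) -> bool:
    """Reject replayed or forged frames."""
    data, counter, tag = frame[:-MAC_LEN - 1], frame[-MAC_LEN - 1], frame[-MAC_LEN:]
    if counter <= last_counter:              # stale counter: replayed frame
        return False
    msg = struct.pack(">IB", can_id, counter) + data
    expected = hmac.new(KEY, msg, hashlib.sha256).digest()[:MAC_LEN]
    return hmac.compare_digest(tag, expected)

frame = protect(0x123, b"\x10\x27", counter=5)   # e.g. a speed reading
print(verify(0x123, frame, last_counter=4))      # True
print(verify(0x123, frame, last_counter=5))      # False: replay detected
```

Binding the CAN ID into the MAC input is what stops a compromised ECU from replaying a valid frame under a different, safety-critical ID.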

Page 11: Automotive Cybersecurity Challenges for Automated Vehicles: Jonathan Petit

Technical Challenges

[Same functional diagram as Page 3.]

Cloud services

Data storage

Challenges
• Often performed on fast CPU/GPU → HW security
• Proposal: use a "sensor security score"
• How to harden against adversarial input?
• How to differentiate attacks from a faulty sensor?
• How to obfuscate "weights"?
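One way to picture the "sensor security score" proposal: weight each sensor's contribution to fusion by how hard it is to spoof, and use cross-sensor consistency to flag outliers. All scores and thresholds below are invented, and note that the consistency check only flags an inconsistent sensor; telling an attack apart from a fault, as the slide asks, needs further context:

```python
import statistics

# Hypothetical per-sensor security scores in [0, 1]: how hard the
# sensor is to spoof remotely (values invented for illustration).
SECURITY_SCORE = {"camera": 0.4, "radar": 0.6, "lidar": 0.5, "ultrasound": 0.7}

def suspicious(readings: dict, tol: float = 2.0) -> set:
    """Flag sensors whose range deviates from the median by > tol meters."""
    med = statistics.median(readings.values())
    return {s for s, r in readings.items() if abs(r - med) > tol}

def fused_range(readings: dict) -> float:
    """Security-score-weighted mean range over the sensors that agree."""
    ok = {s: r for s, r in readings.items() if s not in suspicious(readings)}
    total = sum(SECURITY_SCORE[s] for s in ok)
    return sum(SECURITY_SCORE[s] * r for s, r in ok.items()) / total

readings = {"camera": 20.0, "radar": 19.5, "lidar": 5.0}   # lidar spoofed?
print(sorted(suspicious(readings)))      # ['lidar']
print(round(fused_range(readings), 2))   # 19.7: camera/radar weighted mean
```

The weighting means a cheap-to-spoof sensor can shift the fused estimate less, which is exactly the property a security score is meant to buy.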

Page 12: Automotive Cybersecurity Challenges for Automated Vehicles: Jonathan Petit

Technical Challenges

[Same functional diagram as Page 3.]

Cloud services

Data storage

External data
• V2X
• Maps
• Cloud service

Challenges
• How much trust in external data?
• Consensus in the V2X network
• Map integrity and freshness
• Secure cloud services
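The trust and consensus questions above can be sketched as voting among independent V2X reporters, with each report first passing a plausibility filter. The speed bound, quorum, and message fields are invented for illustration:

```python
def plausible(report: dict) -> bool:
    """Reject physically impossible claims before they enter the vote."""
    return 0.0 <= report["speed_mps"] <= 70.0     # ~250 km/h upper bound

def consensus(reports: list, quorum: float = 0.5) -> bool:
    """Accept an event (here: 'icy road ahead') only if more than a
    quorum of distinct, plausible senders agree."""
    voters = {r["sender"] for r in reports if plausible(r)}
    claims = {r["sender"] for r in reports if plausible(r) and r["icy"]}
    return len(voters) > 0 and len(claims) / len(voters) > quorum

reports = [
    {"sender": "veh_a", "speed_mps": 25.0, "icy": True},
    {"sender": "veh_b", "speed_mps": 22.0, "icy": True},
    {"sender": "veh_c", "speed_mps": 500.0, "icy": False},  # implausible, dropped
]
print(consensus(reports))   # True: 2 of 2 plausible senders agree
```

Counting distinct senders rather than raw messages is the minimal defense against one attacker flooding the channel, though Sybil attacks (one attacker, many identities) still require the V2X credential system to hold.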

Page 13: Automotive Cybersecurity Challenges for Automated Vehicles: Jonathan Petit

Technical Challenges

[Same functional diagram as Page 3.]

Cloud services

DatastorageChallenges• Decide what is best for the ego

vehicle and the community (game theory)

• Build a context-aware misbehavior engine (prediction, detection, reaction)
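A misbehavior engine must judge whether incoming V2V data is plausible before the vehicle acts on it. A minimal sketch of one such check, consistency between a neighbor's successive position reports and its claimed speed, is below; the thresholds, field names, and function are illustrative, not from the talk:

```python
import math

# Illustrative thresholds; real values would come from vehicle dynamics.
MAX_SPEED_MPS = 70.0        # ~250 km/h upper bound for a road vehicle
POSITION_TOLERANCE_M = 5.0  # slack for GPS error

def plausible_update(prev, curr):
    """Plausibility check on two successive V2V position reports.

    Each report is a dict with 'x', 'y' (meters), 'speed' (m/s), 't' (s).
    Returns False if the implied motion is physically implausible.
    """
    dt = curr["t"] - prev["t"]
    if dt <= 0:
        return False  # stale or replayed message
    if curr["speed"] > MAX_SPEED_MPS:
        return False  # claimed speed beyond any road vehicle
    # Distance travelled must be achievable within the elapsed time.
    dist = math.hypot(curr["x"] - prev["x"], curr["y"] - prev["y"])
    return dist <= MAX_SPEED_MPS * dt + POSITION_TOLERANCE_M

ok = plausible_update(
    {"x": 0.0, "y": 0.0, "speed": 20.0, "t": 0.0},
    {"x": 21.0, "y": 0.0, "speed": 20.0, "t": 1.0},
)
teleport = plausible_update(
    {"x": 0.0, "y": 0.0, "speed": 20.0, "t": 0.0},
    {"x": 500.0, "y": 0.0, "speed": 20.0, "t": 1.0},
)
print(ok, teleport)  # True False
```

A context-aware engine would go further, using the 3D map (prediction), flagging outliers (detection), and down-weighting or ignoring the sender (reaction).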

Page 14: Automotive Cybersecurity Challenges for Automated Vehicles: Jonathan Petit


Technical Challenges

Governance Challenges

Page 15: Automotive Cybersecurity Challenges for Automated Vehicles: Jonathan Petit


Governance Challenges

Encourage security by design (upgradability, crypto-agility) and security testing (C/I/A analysis, security rating)

Data ownership will affect privacy and security

Forensics capability

Role of road operators

Page 16: Automotive Cybersecurity Challenges for Automated Vehicles: Jonathan Petit



[Figure 1 repeated (credit: F. Mujica, Kilby Labs, Texas Instruments, 2014), annotated with five security measures across the Sense → Understand → Act pipeline:]

1. Pen-test sensors and harden them
2. HW/host-based security
3. Use "security level" as weight
4. Secure external (contextual) data
5. Misbehavior detection
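Measure 3, using a sensor's "security level" as a weight, can be sketched as a weighted fusion of redundant estimates, so that harder-to-spoof sensors count for more. The weights, sensor assignments, and function below are illustrative assumptions, not values from the talk:

```python
def fuse_weighted(readings):
    """Fuse redundant distance estimates, weighting by security level.

    readings: list of (value, security_level) pairs, where security_level
    in (0, 1] reflects how resistant the sensor is to spoofing/jamming.
    Returns the security-weighted average.
    """
    total_w = sum(w for _, w in readings)
    if total_w == 0:
        raise ValueError("no trusted readings to fuse")
    return sum(v * w for v, w in readings) / total_w

# Illustrative: radar judged harder to spoof than lidar or camera here.
readings = [
    (10.2, 0.9),  # radar distance (m), high security level
    (10.0, 0.6),  # lidar
    (14.0, 0.2),  # camera estimate, possibly fooled by a spoofed image
]
print(round(fuse_weighted(readings), 2))  # 10.58
```

The fused estimate stays close to the high-security radar and lidar readings even though the low-security camera reading is far off, which is the point of the weighting.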

Page 17: Automotive Cybersecurity Challenges for Automated Vehicles: Jonathan Petit



Plan for Actions

1. Organize workshops with subject matter experts to establish cybersecurity guidelines for AV

2. Address technical challenges: HW, SW, system, network, privacy AND build a full automation system simulator (open source)

3. Address governance challenges: policies, forensics, data ownership, data sharing, security rating

Page 18: Automotive Cybersecurity Challenges for Automated Vehicles: Jonathan Petit

Questions & Answers

Jonathan Petit

[email protected]

For more info, check my ESCAR US 2016 presentation and publications list!

Security Innovation Automotive Offerings:
• Aerolink Security for Vehicles
• Automotive Training Courses
• Automotive Consulting

Learn more about our automotive expertise!