Accelerating Automated Vehicle Acceptance (onlinepubs.trb.org/onlinepubs/webinars/200714.pdf)


TRANSCRIPT

TRANSPORTATION RESEARCH BOARD

@NASEMTRB #TRBwebinar

Accelerating Automated Vehicle Acceptance

July 14, 2020

The Transportation Research Board has met the standards and requirements of the Registered Continuing Education Providers Program. Credit earned on completion of this program will be reported to RCEP. A certificate of completion will be issued to participants that have registered and attended the entire session. As such, it does not include content that may be deemed or construed to be an approval or endorsement by RCEP.

PDH Certification Information:

• 2.0 Professional Development Hours (PDH) – see follow-up email for instructions
• You must attend the entire webinar to be eligible to receive PDH credits
• Questions? Contact Reggie Gillum at RGillum@nas.edu

#TRBwebinar

Learning Objectives

#TRBwebinar

1. Identify current AV practices

2. Discuss how data, policies, and trust impact the pace of automated system technologies

Are We There Yet?
Building on TRB Advancing Automated Vehicle Adoption Workshop

Valerie Shuman
Principal, SCG, LLC

https://connectedautomateddriving.eu/event/computers-wheels-whos-going-keep-track-driverless-vehicles/

Overview

• Key Questions
• Roundtable Insights

Are We There Yet?

• What is an AV anyway & who sets that definition?

• Who consistently captures & reports this data for this population (the same way that NHTSA does for the driving public as a whole)?

• How do they do this?

• Roundtable Question: What AV metrics can we implement within 12 months (or sooner)?

https://crashstats.nhtsa.dot.gov/Api/Public/ViewPublication/812826

Tracking AV Capability/Performance Trends

• What are the key tasks and metrics (top 10? top 20?)?

• How do we regularly review these metrics as an industry to ensure we're trending in the right direction and report progress?

• Roundtable Question: Propose a set of key driving tasks and metrics that we should be monitoring, and a national level solution for monitoring them
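As a concrete, hypothetical illustration of the kind of fleet-level metrics the roundtable question asks about, the sketch below rolls a made-up trip log into disengagements per 1,000 miles and crashes per million miles. The field names, records, and numbers are invented for illustration; they are not from the webinar.

```python
# Minimal sketch: aggregating a hypothetical AV trip log into fleet-level metrics.
# All field names and numbers are invented for illustration; they are not webinar data.
from dataclasses import dataclass

@dataclass
class TripRecord:
    miles: float          # miles driven with the automated system engaged
    disengagements: int   # takeovers by the safety driver or the system
    crashes: int          # police-reportable crashes on the trip

def fleet_metrics(trips):
    """Roll per-trip records up into the kinds of rates a national program might track."""
    miles = sum(t.miles for t in trips)
    return {
        "total_miles": miles,
        "disengagements_per_1k_miles": 1_000 * sum(t.disengagements for t in trips) / miles,
        "crashes_per_million_miles": 1_000_000 * sum(t.crashes for t in trips) / miles,
    }

if __name__ == "__main__":
    sample = [TripRecord(120.5, 2, 0), TripRecord(48.0, 0, 0), TripRecord(310.2, 5, 1)]
    print(fleet_metrics(sample))
```

Any national-level solution would, of course, need standardized definitions of "disengagement" and "reportable crash" before rates like these are comparable across companies.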

What Did We Learn?

• What is an AV?
  – AV is L3 or L4 and above
  – Initial focus should be metrics for ADAS and HAV systems (L1/L2)

• What types of metrics?
  – Focus on efficiency, safety, and equity metrics
  – Look at scenario-based data and outcome metrics to understand the status of the overall fleet
  – Develop metrics for each mode
  – Consider regional requirements, like different levels of AV functionality (e.g., rural)
  – Need to look at crashes and near misses; collect data on what's working and what's failing

Specific Metrics (1)

• Performance along a "familiar" route
• Route performance and adjustments
• Takeover time controls in various road conditions and situations
• Interaction with local traffic "culture" – is the AV a good citizen?
  – Consider overtaking distance (especially for bikes)

• Operational Design Domains (ODDs)
  – How much driving is done in and outside of the ODD?
  – At L4, testing on all intended ODDs for a given road must be "green" before the AV can use that road

• Moving object detection (including the speed at which detection is made)

• Develop third-party testing standards

Specific Metrics (2)

• Signal detection analysis for certain crash types in various scenarios (a generic d' sketch follows this list)
• Functional testing
• Secondary crashes

• Consider contributing factors/context
  – Develop a list of factors, design scenarios, and test to understand crashes per mile

• Environmental data
• Takeover requests (planned/unplanned)
• Post-crash: what L1/L2 features were turned on (or not)?
  – Was the driver aware of the feature?
• Biometrics of the person in the car: is the person in a good state and able to reengage?
• Vehicle kinematics
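The signal detection bullet above names a standard analysis without spelling it out. One generic way to apply it (not a method prescribed by the panel) is to treat "hazard present / system responded" outcomes as hits, misses, false alarms, and correct rejections, and summarize sensitivity with d'. The counts below are invented for illustration.

```python
# Minimal sketch of a signal detection analysis (d-prime) for a crash-scenario test matrix.
# Counts are invented for illustration; they are not webinar data.
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """d' = Z(hit rate) - Z(false alarm rate), with a small correction to avoid 0/1 rates."""
    z = NormalDist().inv_cdf
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    return z(hit_rate) - z(fa_rate)

if __name__ == "__main__":
    # e.g., a crossing-pedestrian scenario: the system braked for 45 of 50 true hazards
    # and braked unnecessarily in 3 of 50 hazard-free runs.
    print(round(d_prime(hits=45, misses=5, false_alarms=3, correct_rejections=47), 2))
```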

How Do We Implement?

• Partner with the insurance industry, OEMs, hospitals, and public- and private-sector tracking efforts

• Carefully consider the model used to encourage private-sector sharing – anything too "regulatory" will be a challenge

• Develop nationally consistent/standardized metrics to allow data-sharing and confidence

• Beware of unintended consequences from metrics (incentivize proper design targets)

In Summary

• There is a lot of nuance to consider

• Making choices is going to be tough

• Trust is less important than trustworthiness

https://www.automatedvehiclessymposium.org/register

Trusting Increasingly Autonomous Vehicle Technology

John D. Lee

University of Wisconsin—Madison jdlee@engr.wisc.edu

Lee, J. D., & See, K. A. (2004). Trust in automation: Designing for appropriate reliance. Human Factors, 46(1), 50–80.
Lee, J. D., & Kolodge, K. (2019). Exploring trust in self-driving vehicles with text analysis. Human Factors. doi:10.1177/0018720819872672
Lee, J. D. (2020). Trust in automated, intelligent, and connected vehicles. In D. L. Fisher, W. J. Horrey, J. D. Lee, & M. A. Regan (Eds.), Handbook of Human Factors for Automated, Connected, and Intelligent Vehicles. CRC Press.
Chiou, E. K., & Lee, J. D. (in review). Trusting automated agents: Designing for appropriate cooperation. Human Factors.

Transportation as a multi-echelon network with many trust relationships

Echelons (figure): Policy, Standards, and Societal Infrastructure; Remote Infrastructure and Traffic; Negotiated Road Situations; Driving Functions and Activities; Vehicle Sensors and Controls.

Example trust relationships: driver trust in the automated vehicle; person trust in policymakers; pedestrian trust in the AV; remote operator trust in the AV; engineers' trust in sensors; driver trust in the sensor system.

NTSB Finds Overreliance in Tesla Crash

NTSB Highway Accident Report, p. 7

…trim from the car was found entangled within the forward-most area of contact damage on the semitrailer. Figure 6 shows a postcrash photograph of the semitrailer, and figure 7 focuses on the damage to the semitrailer.

Figure 6. Damaged right side of the Utility semitrailer.

Figure 7. Closeup view of impact damage to the right side of the Utility semitrailer. The arrow indicates a segment of front windshield trim from the Tesla entrapped in the forward-most area of damage.

NTSB Highway Accident Report, p. 15

Figure 11. Chart showing how much time during the 41-minute crash trip that, while Autopilot was active, the driver had his hands on the steering wheel. Visual and auditory warnings are also indicated. (Timing provided is based on vehicle data and is approximate and relative.)

System Performance Data. The vehicle performance data revealed the following:

• The crash-involved Tesla's last trip began at 3:55:23 p.m. The car was stopped or nearly stopped about 4:19 p.m. and again about 4:30 p.m. The collision with the truck occurred at 4:36:12.7, as indicated by fault codes and system disruptions.

• The last driver input before the crash was to increase the TACC speed to 74 mph at 4:34:21, which was 1 minute 51 seconds before the crash. After that input, there was no driver interaction with Autopilot, no change in steering angle, and no brake lamp switch activation until the collision.

• During the last trip, TACC detected a vehicle ahead of the Tesla seven times. For the final 1 minute 35 seconds preceding the crash, TACC detected no lead vehicle in front of the Tesla.

• About 9.7 seconds before the collision, the motor torque demand steadily decreased (indicating that the vehicle was on a descending grade). The reported torque demand dropped to zero at the time of the first fault report.

• No brakes were applied before or during the collision.

• Vehicle headlights were not on at the time of the collision.

• The driver was wearing his seat belt during the trip.

• Throughout the approach to the collision with the truck, the electronic power assist steering exhibited no substantial changes in steering angle.

• There was no record indicating that the Tesla's automation system identified the truck that was crossing in the car's path or that it recognized the impending crash. Because the system did not detect the combination vehicle—either as a moving hazard or as a stationary object—Autopilot did not reduce the vehicle's speed, the FCW did not …
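A quick arithmetic check of the excerpt's timeline, using only the clock times quoted above (the date in the snippet is a placeholder):

```python
# Verifying the report's timeline arithmetic with Python's datetime module.
from datetime import datetime

DAY = (2020, 1, 1)  # placeholder date; only the clock times from the excerpt matter
last_input = datetime(*DAY, 16, 34, 21)           # TACC set to 74 mph at 4:34:21 p.m.
collision  = datetime(*DAY, 16, 36, 12, 700_000)  # collision at 4:36:12.7 p.m.

gap = (collision - last_input).total_seconds()
print(gap)  # 111.7 s, i.e. 1 minute 51.7 seconds, matching the report's "1 minute 51 seconds"
```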

Trust Calibration is Critical

Figure: trust plotted against trustworthiness. Trust above the trustworthiness line is overtrust; trust below it is undertrust.

Topic modeling of ~10k responses to the J.D. Power TechChoice survey (a rough illustrative sketch of the technique follows the topic list)

Topics identified (figure): technology improving; tested for a long time; works good?; self-driving accidents; trust when mature; hacking & glitches; errors & failures; many things go wrong; scary drivers and robots; control until proven; safer than human; computers make mistakes; feel uncomfortable.

Lee, J. D., & Kolodge, K. (2019). Exploring trust in self-driving vehicles with text analysis. Human Factors. doi:10.1177/0018720819872672
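The topic labels above came from topic modeling of open-ended survey responses. As a rough illustration of the general technique only (not Lee & Kolodge's actual pipeline or data), the sketch below fits an LDA topic model to a handful of made-up responses with scikit-learn.

```python
# Minimal topic-modeling sketch with scikit-learn LDA; illustrative only,
# not the text-analysis pipeline used in Lee & Kolodge (2019).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

responses = [  # stand-ins for free-text survey answers
    "I would trust it once the technology has been tested for a long time",
    "hacking and software glitches scare me",
    "computers make mistakes and errors happen",
    "probably safer than a human driver in the end",
    "I want to stay in control until it is proven",
    "self driving accidents in the news make me uncomfortable",
]

vectorizer = CountVectorizer(stop_words="english")
doc_term = vectorizer.fit_transform(responses)

lda = LatentDirichletAllocation(n_components=3, random_state=0).fit(doc_term)

terms = vectorizer.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = [terms[i] for i in topic.argsort()[-4:][::-1]]  # four highest-weight words per topic
    print(f"topic {k}: {', '.join(top)}")
```

The real survey responses and the modeling choices in the published study differ; this only shows the mechanics of turning free text into topics.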

Psychophysics of Dread Risk Undermine Trust

• Dread risk perceived as 1000 times greater than controlled risk (Slovic, P. (1987). Perception of risk. Science, 236(4799), 280–285)

• Dread risk guides society to risky outcomes(Gigerenzer, G. (2004). Dread risk, September 11, and fatal traffic accidents. Psychological Science, 15(4))

• Trust declines with surprisingly poor behavior: easy misses (Madhavan, P., Wiegmann, D. A., & Lacson, F. C. (2006). Automation failures on tasks easily performed by operators undermine trust in automated aids. Human Factors, 48(2), 241–256)

Evtimov, I., Eykholt, K., Fernandes, E., Kohno, T., Li, B., Prakash, A., Song, D. (2017). Robust physical-world attacks on machine learning models.

Small changes produced 100% misclassification
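Evtimov et al. attacked real stop signs with physical stickers; the underlying idea, that a small, targeted change to the input can flip a classifier's output, can be illustrated with a gradient-sign (FGSM-style) perturbation against a toy logistic-regression model. Everything below (data, model, epsilon) is an invented toy for illustration, not the authors' method.

```python
# Toy FGSM-style adversarial perturbation against a logistic-regression classifier (NumPy only).
# Illustrates "small changes cause misclassification"; NOT the physical-world attack of Evtimov et al.
import numpy as np

rng = np.random.default_rng(0)
d = 20

# Two overlapping Gaussian classes and a logistic-regression model trained by gradient descent.
X = np.vstack([rng.normal(-0.3, 1.0, (500, d)), rng.normal(0.3, 1.0, (500, d))])
y = np.array([0] * 500 + [1] * 500)
w, b = np.zeros(d), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= 0.5 * X.T @ (p - y) / len(y)
    b -= 0.5 * np.mean(p - y)

def accuracy(inputs):
    preds = (1.0 / (1.0 + np.exp(-(inputs @ w + b))) > 0.5).astype(int)
    return np.mean(preds == y)

# FGSM: step each input by epsilon in the signed direction that increases its own loss.
p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
grad_X = (p - y)[:, None] * w          # d(cross-entropy)/dx for each example
X_adv = X + 0.6 * np.sign(grad_X)      # perturbation of 0.6 on features with unit variance

print("clean accuracy:      ", round(accuracy(X), 3))
print("adversarial accuracy:", round(accuracy(X_adv), 3))
```

The printed accuracy on the perturbed inputs falls far below the clean accuracy, echoing the "small changes" finding at a toy scale.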

Help People See Benefits

Conceptual model (figure): bases of trust (societal, relational, experiential) correspond to dimensions of trust (purpose–betrayal, process–violation, performance–disappointment); trust, together with perceived risk and sense of control, shapes acceptance.

Trusting Vehicle Technology

Lee, J. D. (2020). Trust in automated, intelligent, and connected vehicles. In D. L. Fisher, W. J. Horrey, J. D. Lee, & M. A. Regan (Eds.), Handbook of Human Factors for Automated, Connected, and Intelligent Vehicles. CRC Press.

Trusting in Increasingly Autonomous Vehicles

• Trusted and trustworthy technology: calibrated with capability and aligned with goals

• Trust in multi-echelon networks: trusters beyond the direct users to include other road users; trust basis beyond vehicle sensors to include policy makers

• Trust based on societal, relational, and experiential factors: avoid dread risk with transparency and control

John D. Lee
University of Wisconsin
jdlee@engr.wisc.edu
Twitter: jdlee888

Data for AV Integration

ARIEL GOLD

DATA PROGRAM MANAGER

U.S. DEPARTMENT OF TRANSPORTATION (USDOT)

INTELLIGENT TRANSPORTATION SYSTEMS (ITS) JOINT PROGRAM OFFICE (JPO)

JULY 14, 2020

Accelerating Automated Vehicle Acceptance

https://www.transportation.gov/av/data/wzdx

AV 4.0 & Data

• Builds upon AV 3.0 by expanding the scope to 38 relevant United States Government (USG) components

• Highlights crosscutting data-related items:
  – Privacy and data security
  – Consistent standards and policies
  – Multipronged approach to advance AI
  – Connectivity and data exchanges

• Features efforts aimed at enabling voluntary data exchanges

Source: USDOT https://www.transportation.gov/av/4


U.S. DOT’s Data for AV Integration (DAVI) Initiative

https://www.transportation.gov/av/data

Source: USDOT https://www.transportation.gov/av/data


DAVI Website

Sections: DAVI Overview, Guiding Principles, DAVI Framework

Source: USDOT https://www.transportation.gov/av/data


DAVI Framework

Source: USDOT https://www.transportation.gov/av/data


The Work Zone Data eXchange (WZDx)

Source: Work Zone Data Working Group https://github.com/usdot-jpo-ode/jpo-wzdx


WZDx Demonstration Grants

• Total funding: $2.4M
• Number of awards: up to 12
• Potential award amounts: up to $200,000 each
• Period of performance: 12 months
• Cost share: 20% non-federal share
• Federal involvement: performance monitoring, technical guidance, and participation in status meetings, workshops, and technical group discussions

https://www.grants.gov/web/grants/view-opportunity.html?oppId=327731

Source: ITS JPO


Utilizing Common Work Zone Event Data for V2X and Cooperative ADS Applications

Diagram summary: IOO work zone field data collection feeds several types of content needs (road network, road furniture, dynamic environment data) that answer driving-task questions such as "Where am I relative to my environment?", "What are the rules of the road that affect my path?", and "What's changed from what I already know?" Related content includes road usage restrictions, signal and VMS status/translation, speed limit changes, geometry changes, work zones, lane closures, and signal locations, all feeding data fusion and decision making. Program activities: develop tools to collect spatial data from the field to support work zone data collection; develop software translators for V2X and cooperative ADS applications; identify improvements to spatial data elements for work zone events. Outputs feed improved data specifications and tools, including the WZDI Program, CARMA3, and the WZDx Specification.

Source: FHWA


Cooperative ADS as a Component of Work Zone Event Data


• TMC Operator/IOOs enter basic information about work zone.

• Data is consistent with the WZDx v2.0 specification (a simplified, illustrative feed sketch appears after this list).

• Data collection automatically starts/ends when set starting/ending locations are reached.

• User interface to select current state of road/work zone.

Copyright: https://github.com/TonyEnglish/V2X-manual-data-collection
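For readers unfamiliar with the feed format, the sketch below assembles a single, heavily simplified work-zone record in the GeoJSON style that WZDx uses. The field names are illustrative approximations only; the normative v2.0 schema lives in the specification repository referenced earlier (https://github.com/usdot-jpo-ode/jpo-wzdx).

```python
# Minimal, illustrative sketch of a WZDx-style work zone feed entry (GeoJSON flavor).
# Field names are approximations for illustration only; consult the WZDx v2.0 schema
# in the usdot-jpo-ode/jpo-wzdx repository for the normative structure.
import json
from datetime import datetime, timezone

def work_zone_feature(event_id, road_name, start, end, reduced_speed_mph, coordinates):
    """Build one simplified work-zone road event as a GeoJSON Feature."""
    return {
        "type": "Feature",
        "properties": {
            "road_event_id": event_id,
            "road_name": road_name,
            "start_date": start,
            "end_date": end,
            "reduced_speed_limit": reduced_speed_mph,
            "event_status": "active",
        },
        "geometry": {"type": "LineString", "coordinates": coordinates},
    }

feed = {
    "feed_info": {
        "publisher": "Example IOO / TMC",
        "update_date": datetime.now(timezone.utc).isoformat(),
        "version": "2.0-style (illustrative)",
    },
    "type": "FeatureCollection",
    "features": [
        work_zone_feature("wz-2020-001", "I-70 EB", "2020-07-14T06:00:00Z",
                          "2020-07-20T18:00:00Z", 45,
                          [[-83.12, 39.96], [-83.10, 39.96]]),
    ],
}

print(json.dumps(feed, indent=2))
```

Serialized this way, a record of this general shape is what a TMC back office could publish for 511 systems and third parties after verification, as described on the next slide.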


Cooperative ADS as a Component of Work Zone Event Data


• Received information is used to generate a work zone with new geospatial details in the back office (cloud) for validation.

• Overlay WZ Map information.

• TMC Operator verifies accuracy of recorded work zone.

• TMC Operator publishes verified work zones, making them available to 3rd parties, 511, etc.

• Repository available at https://github.com/TonyEnglish/V2X-manual-data-collection

Copyright: https://github.com/TonyEnglish/V2X-manual-data-collection


Open Training Data Sets

• Many outside the federal government are contributing open training data sets that assist with computer vision and other core Automated Driving System (ADS) functions

• Open training data sets:
  – BDD100K from the University of California at Berkeley
  – Waymo Open Dataset
  – Lyft Level 5 Dataset
  – Audi AEV Autonomous Driving Dataset
  – Ford Autonomous Vehicle Dataset

• Contact avdx@dot.gov to share other examples of open training data sets

Source: University of California at Berkeley (https://bdd-data.berkeley.edu/)


For More Information

Ariel Gold
Data Program Manager

U.S. Department of Transportation

Intelligent Transportation Systems Joint Program Office

Ariel.Gold@dot.gov

Twitter: @ITSJPODirector

Website: www.its.dot.gov

Facebook: www.facebook.com/DOTRITA

iihs.org

ADAS Ratings for Consumer Information Program
Accelerating Automated Vehicle Acceptance

David Harkey
Insurance Institute for Highway Safety

TRB Webinar
July 14, 2020

IIHS consumer ratings

• 200+ 2019 models rated
• 10 evaluations per model
• 460 new ratings in 2019

IIHS crash testing programs

• 1995: Moderate overlap front
• 2003: Side impact
• 2004: Rear (whiplash mitigation)
• 2009: Roof strength
• 2012: Small overlap front, driver-side
• 2017: Small overlap front, passenger-side

Effect of crash avoidance systems on claim frequency (results pooled across automakers)

Chart: percentage change in claim frequency (roughly -40% to +40% scale) for forward collision warning, front autobrake, curve-adaptive headlights, lane departure warning, blind spot warning, parking sensors, rear camera, and rear autobrake, broken out by coverage type (collision, property damage liability, bodily injury liability, MedPay, PIP).

Most crash avoidance technologies are living up to expectations: effects on relevant police-reported crash types

Chart: percentage change in relevant police-reported crashes (roughly -80% to +20% scale) for forward collision warning, low-speed autobrake, FCW with autobrake, lane departure warning, and side-view assist (blind spot), shown for all severities and injury crashes, with statistically significant effects flagged.

2020 TOP SAFETY PICK requirements

• Good ratings in the driver-side small overlap front, passenger-side small overlap front, moderate overlap front, side, roof strength, and head restraint tests
• Advanced or superior rating for pedestrian AEB as optional equipment
• Advanced or superior rating for front crash prevention as optional equipment
• Good or acceptable headlights (standard equipment for the higher award tier; optional equipment for the base award)

Speed reduction in 12 and 24 mph tests

• Volvo S60 (2 points, advanced): 12 mph reduction in the 12 mph test; 2 mph reduction in the 24 mph test
• Dodge Durango (3 points, advanced): 6 mph reduction in the 12 mph test; 9 mph reduction in the 24 mph test
• Subaru Outback (6 points, superior): 12 mph reduction in the 12 mph test; 12 mph reduction in the 24 mph test

Front crash prevention: vehicle-to-vehicle ratings, 2013–2020 models (as of July 2020)

Chart: distribution of ratings (0–100% of rated models) by model year, 2013 through 2020.

Pedestrian test scenarios

Pedestrian crash prevention ratings (as of November 2019)

Chart: number of models (0–8) rated superior, advanced, basic, or no credit, shown separately for small SUVs and midsize sedans.

Test conditions shown: adult walking from the right side (25 mph condition); child running from the right side (12 mph condition).

Rear crash prevention ratings

• Rear parking sensors
• Rear cross-traffic alert
• Rear autobrake

Test scenarios: reversing car-to-car with 16-inch overlap; reversing car-to-car at a 45° angle; reversing car-to-car at a 10° angle; reversing toward a fixed pole.

Functional performance and user experience

• 2017 BMW 5 series with Driving Assistant Plus
• 2017 Mercedes E-Class with Drive Pilot
• 2016 Tesla Model S with Autopilot (software ver. 7.1)
• 2018 Volvo S90 with Pilot Assist
• 2018 Tesla Model 3 with Autopilot (software ver. 8.1)

Lane keeping in curves

Chart: for each vehicle (BMW 5 series, n=16; Volvo S90, n=17; Mercedes E-Class, n=17; Tesla Model S, n=18; Tesla Model 3, n=18), the percentage of trials (0–100%) in which the system disengaged, crossed the lane line, touched the lane line, or remained in the lane.

Lane keeping on hills

Chart: for each vehicle (BMW 5 series, n=14; Tesla Model S, n=18; Volvo S90, n=17; Mercedes E-Class, n=18; Tesla Model 3, n=18), the percentage of trials (0–100%) in which the system disengaged, crossed the lane line, touched the lane line, or remained in the lane.

Adaptive cruise control trusted more than active lane keeping

Chart: percentage of drivers (0–100) who agreed or strongly agreed with "I trust the automation to maintain speed and distance to the vehicle ahead" and "I trust the automation to keep me in the center of the lane," for the Tesla Model S (Autopilot), Volvo S90 (Pilot Assist), BMW 5 series (Driving Assistant Plus), Infiniti QX50 (ProPilot Assist), and Mercedes E-Class (Drive Pilot).

IIHS consumer information does more… Since 1995:

• Empower consumers
• Identify gaps in safety regulations and testing programs
• Encourage automakers
• Accelerate technology integration!

Thank you

Today's Panelists

• Moderator: Cynthia Jones, DriveOhio
• John Lee, University of Wisconsin-Madison
• Valerie Shuman, Shuman Consulting Group
• Ariel Gold, US DOT
• David Harkey, Insurance Institute for Highway Safety

Get Involved with TRB

#TRBwebinar

Receive emails about upcoming TRB webinars: https://bit.ly/TRBemails

Find upcoming conferences: http://www.trb.org/Calendar

Get Involved with TRB

Be a Friend of a Committee: bit.ly/TRBcommittees
– Networking opportunities

– May provide a path to Standing Committee membership

Join a Standing Committee bit.ly/TRBstandingcommittee

Work with CRP https://bit.ly/TRB-crp

Update your information www.mytrb.org

#TRBwebinar

Getting involved is free!

#TRB100
