

Intelligent Autonomy for Reducing Operator Workload

Mark Rothgeb
Intelligent Control Systems Department

Autonomous Control and Intelligent Systems Division

April 10, 2007


Overview

Applied Research Laboratory background

Autonomous unmanned vehicles (ARL / DoD)

Issues in automation and levels of autonomy

Two examples of reducing operator workload via increasing levels of system autonomy


Applied Research Laboratory
Navy UARC Background

Navy UARCs were established in the mid-1940s to continue the university-centered R&D that proved effective during WWII

We offer a diverse portfolio of systems expertise and technologies applicable to Distributed Systems

UARC universities maintain a long-term strategic relationship with the Navy

Characteristics:
– Can address evolving needs with enabling technologies
– Understanding of operational problems and environment
– Objectivity and independence
– Corporate knowledge and memory
– From concept to prototype (integration and test facilities)
– Freedom from conflict of interest


Core Technologies

• Fluid Dynamics
• Hydro Acoustics
• Computational Mechanics
• Composite Materials
• Information Fusion and Visualization
• Energy and Power Systems
• System Simulation
• Autonomous Control and Intelligent Systems


Characteristics and Size

Systems Engineering Orientation
Basic Research through Demonstration to Full-Scale Implementation
Project Management of Cross-disciplinary, Multi-performer Teams

[Charts: ARL Full-Time Equivalent Years; ARL Part of Penn State Research]

ARL Locations

APPLIED RESEARCH LABORATORY BUILDING
APPLIED SCIENCE BUILDING
NAVIGATION RESEARCH & DEVELOPMENT CENTER
ARL CATO PARK
GARFIELD THOMAS WATER TUNNEL
ELECTRO-OPTICS SCIENCE & TECHNOLOGY CENTER

Keyport Naval Facility – Keyport, Wa.
Distributed Engineering Center – Penn State Fayette Campus
Washington Office – Washington, DC
ARL Hawaii – Pearl Harbor, Hi.
Electro-Optics Center – Kittanning, Pa.
ARL Penn State – State College, Pa.
Navigation Research & Development Center – Warminster, Pa.


PSU/ARL Experimental UGV

• Embedded Health Monitoring
• Autonomous Navigation & Control
• Intelligent Self-Situational Awareness
• COTS OCU Development
• JAUS Development & Testing


UAV

PSU Aero/ARL UAV Base Aircraft

Specifications (TRAINER)

Radio: 4 channels, 5 servos
Engine: .40-.46 2-stroke or .91 4-stroke
Weight: 6 to 6 1/2 pounds
Length: 64 3/4 inches
Wing Area: 1,180 sq. inches
Wingspan: 80 inches


PSU/ARL Autonomous Undersea Vehicle
OCEANOGRAPHIC DATA GATHERING

OBJECTIVE: Develop a rapid-prototype AUV for use in collecting oceanographic environmental data

VEHICLE FEATURES
– Long-range capabilities (>300 nm @ 4 kts)
– Fully autonomous vehicle operations
– Launch/recovery from TAGS 60 platform
– Sensors: sidescan sonar, Acoustic Doppler Current Profilers (ADCP), Conductivity, Temperature, and Depth (CTD)
– Simple maintenance & turnaround at sea

Diameter: 38 in.
Length: 27 ft. 10 in.
Weight: 9,900 lbs.
Endurance: 300 nautical miles
Payload: dual side-scan sonars; other oceanographic instruments
Navigational Accuracy: better than 150 meters


DoD Autonomous Vehicles

• Predator
• Firescout

• Battlespace Preparation AUV


NASA DART Autonomous Operation

• Even basic automation concepts … not so simple

…On April 15, more than 450 miles above Earth, an experimental NASA spacecraft called DART (Demonstration of Autonomous Rendezvous Technology) fired its thrusters and closed in on a deactivated U.S. military communications satellite—and then gently bumped into it. (Popular Science 2005)

• Rendezvous and Inspection
• Proximity Operations


Automation Perspectives

• Underwater Vehicles
– Communication issues
– Load 'n Go
– Automation Manual

• Ground (UGV), Air (UAV), Surface (SUV) Vehicles
– Remote control / tele-operation (fly-by-wire)
– Human in control with bits of automation (waypoints)
– Manual Automation

• Spacecraft Vehicles
– Ground-control driven
– Back off and "safe" the system (valuable assets)
– Solve the problem on the ground through analysis


Operator Overload Forcing Automation

• Operator overload comes in different ways
– Increase in number of tasks for the same number of people
  • Can't add crew, but now have more sensors
– Reduce head-count for the same number of tasks
  • Littoral Combat Ship (LCS)
– Increased complexity of tasks forces automation
  • Surface vehicle on open ocean vs. surface vehicle in harbor
– Increase in amount of data to process

• Need to react quickly also forces automation
– Systems that automatically respond because of timing requirements
– Advisory systems that call the operator to attention


Acceptance of Automation

What is required for acceptance of automation? It's all about gaining trust.

• Don't do something fundamentally wrong (run into the wall)
• Don't do something non-intuitive (go right around the wall versus left)
• Do tell the operator when the autonomy doesn't know what to do
– Ambiguous circumstances
– Able to solve the 95% case but not the 5%
• Do give insight into decision-making
• Do have automation assist the operator, not vice versa
– Microsoft Word helps you?
– MapQuest fixes, for example (beltway anomaly)
– Employee Reimbursement System (cure worse than the ailment?)
• Do let the operator dynamically alter the level of autonomy
– From full manual to full autonomy (a small sketch of this idea follows below)
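A minimal sketch of this last point, operator-adjustable autonomy. It assumes nothing about ARL's systems; the mode names (which loosely echo the UCAV levels discussed later) and the controller interface are illustrative only.

```python
from enum import IntEnum

class AutonomyMode(IntEnum):
    """Illustrative settings, from full manual to full autonomy."""
    MANUAL = 0     # operator does everything
    CONSENT = 1    # system recommends; operator must approve
    FULL = 2       # system executes on its own

class AdjustableController:
    def __init__(self, mode: AutonomyMode = AutonomyMode.MANUAL):
        self.mode = mode

    def set_mode(self, mode: AutonomyMode) -> None:
        # The operator can move this setting at any time during the mission.
        self.mode = mode

    def handle(self, recommendation: str, operator_approves) -> str:
        """Decide what to do with a recommended action at the current mode."""
        if self.mode == AutonomyMode.MANUAL:
            return "no action (manual mode)"
        if self.mode == AutonomyMode.CONSENT and not operator_approves(recommendation):
            return f"held for operator: {recommendation}"
        return f"executing: {recommendation}"

# The same recommendation is treated differently as the operator raises the level.
ctrl = AdjustableController(AutonomyMode.CONSENT)
print(ctrl.handle("come right 10 degrees", operator_approves=lambda r: False))
ctrl.set_mode(AutonomyMode.FULL)
print(ctrl.handle("come right 10 degrees", operator_approves=lambda r: False))
```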


Levels of Autonomy

• Various groups have defined levels of autonomy
– National Institute of Standards and Technology (NIST)
– Future Combat Systems (FCS)
– Air Force Research Laboratory (AFRL)
– Uninhabited Combat Air Vehicle
– ASTM Committee on Unmanned Undersea Vehicle Systems (UUV)
– NASA FLOAAT (Function-specific Level of Autonomy and Automation Tool)
– Sheridan's Levels of Autonomy


NIST Definitions

Autonomous - Operations of an unmanned system (UMS) wherein the UMS receives its mission from the human <1> and accomplishes that mission with or without further human-robot interaction (HRI). The level of HRI, along with other factors such as mission complexity, and environmental difficulty, determine the level of autonomy for the UMS [2]. Finer-grained autonomy level designations can also be applied to the tasks, lower in scope than mission.

Autonomy - The condition or quality of being self-governing

[NIST Special Publication 1011 - Autonomy Levels for Unmanned Systems (ALFUS) Framework]


Sheridan’s Scale for Degrees of Automation

1. The computer offers no assistance, human must do it all
2. The computer offers a complete set of action alternatives, and
3. narrows the selection down to a few, or
4. suggests one, and
5. executes that suggestion if the human approves, or
6. allows the human a restricted time to veto before automatic execution, or
7. executes automatically, then necessarily informs the human, or
8. informs him after execution only if he asks, or
9. informs him after execution if it, the computer, decides to.
10. The computer decides everything and acts autonomously, ignoring the human.

R. Parasuraman, T. B. Sheridan, and C. D. Wickens, "A Model for Types and Levels of Human Interaction with Automation," IEEE Transactions on Systems, Man, and Cybernetics, Part A, vol. 30, pp. 286-297, 2000.
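As a small illustration (my own encoding, not part of the presentation or the cited paper), the scale can be captured as data so that software can reason about what a given level implies for operator involvement; the approval cutoff at level 5 is simply one reading of the scale above.

```python
# Sheridan's degrees of automation encoded as a lookup table (illustrative only).
SHERIDAN_SCALE = {
    1: "computer offers no assistance; human does it all",
    2: "computer offers a complete set of action alternatives",
    3: "computer narrows the selection down to a few",
    4: "computer suggests one alternative",
    5: "computer executes the suggestion if the human approves",
    6: "computer allows a restricted time to veto before automatic execution",
    7: "computer executes automatically, then necessarily informs the human",
    8: "computer informs the human after execution only if asked",
    9: "computer informs the human after execution if it decides to",
    10: "computer decides everything and acts autonomously, ignoring the human",
}

def requires_prior_human_approval(level: int) -> bool:
    """Levels 1-5 keep the human in the decision before anything executes;
    level 6 offers only a veto window; levels 7-10 act without prior approval."""
    return level <= 5
```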


Future Combat Systems Levels of Autonomy

1. Remote control / teleoperation
2. Remote control with vehicle state knowledge
3. External preplanned mission
4. Knowledge of local and planned path
5. Hazard avoidance or negotiation
6. Object detection, recognition, avoidance or negotiation
7. Fusion of local sensors and data
8. Cooperative operations
9. Collaborative operations
10. Full autonomy

SOURCE: LTC Warren O'Donell, USA, Office of the Assistant Secretary of the Navy (Acquisition, Logistics, and Technology), "Future Combat Systems Review," April 25, 2003.


Levels of Autonomy as Defined by the Uninhabited Combat Air Vehicle Program

• Level 1 (Manual Operation)
– The human operator directs and controls all mission functions.
– The vehicle still flies autonomously.

• Level 2 (Management by Consent)
– The system automatically recommends actions for selected functions.
– The system prompts the operator at key points for information or decisions.
– Today's autonomous vehicles operate at this level.

• Level 3 (Management by Exception)
– The system automatically executes mission-related functions when response times are too short for operator intervention.
– The operator is alerted to function progress.
– The operator may override or alter parameters and cancel or redirect actions within defined time lines.
– Exceptions are brought to the operator's attention for decisions.

• Level 4 (Fully Autonomous)
– The system automatically executes mission-related functions when response times are too short for operator intervention.
– The operator is alerted to function progress.


NIST Levels of Autonomy

We make a distinction between the terms "degrees of autonomy" and "levels of autonomy." Total autonomy in low-level creatures does not correspond to high levels of autonomy. Examples include the movements of earthworms and bacteria, which are 100% autonomous but considered low.

[NIST Special Publication 1011 - Autonomy Levels for Unmanned Systems (ALFUS) Framework]


Air Force Research Laboratory (AFRL) Levels of Autonomy (Clough)

[Metrics, Schmetrics! How The Heck Do You Determine A UAV's Autonomy Anyway?, Bruce T. Clough, Air Force Research Laboratory]


NASA FLOAAT (Function-specific Level of Autonomy and Automation Tool)

[FLOAAT, A Tool for Determining Levels of Autonomy and Automation, Applied to Human-Rated Space Systems, Ryan W. Proud and Jeremy J. Hart]


Automation Approach in the “Real” World

• Don't over-commit on the capability of automation
• Begin by automating the mundane
– Bid and proposal database (Excel…)
– Periscope key-ins
– Surface ship heading recommendations
• Extend by making some mildly intelligent inferences regarding decision-making
– Go the right way around the wall
– Not always simple: cul-de-sac
• Extend to more complex "intelligent" systems…
– Neural Nets
– Fuzzy Systems
– Rule-based Systems
– Other techniques
– Cognition?

• But… What is intelligence?


Intelligent Systems

• The AIAA Intelligent Systems Technical Committee (JACIC, Dec. 2004) stated:

"The question of what is an intelligent system (IS) has been the subject of much discussion and debate. Regardless of how one defines intelligence, characteristics of intelligent systems commonly agreed on include:

1) Learning - capability to acquire new behaviors based on past experience; 2) Adaptability - capability to adjust responses to changing environmental or internal conditions; 3) Robustness - consistency and effectiveness of responses across a broad set of circumstances; 4) Information Compression - capability to turn data into information and then into actionable knowledge; and 5) Extrapolation - capability to act reasonably when faced with a set of new (not previously experienced) circumstances."

[courtesy: Lyle Long]


Some System Architectures

• Many options
– NASA: CLARAty
– MIT: MOOS (Framework for Modeling)
– MIT: CSAIL (Robotic Reactive Planning)
– CMU: SOAR (Cognitive Architecture)
– CMU: CORAL (Cooperative Robots)
  • Has won RoboCup several times
  • RoboCup goal: "By the year 2050, develop a team of fully autonomous humanoid robots that can win against the human world soccer champion team."
– USC: STEAM (Agent Teamwork Model)
– PSU/ARL: PIC (Behavior-based Framework)
– PSU/IST: R-CAST (RPD Model for Agent Teamwork)
– …

INTELLIGENT CONTROL ARCHITECTURE

Data inputs: Sensor 1 … Sensor N, Messages

Intelligent Controller
• Perception
– Sensor Data Fusion
– Information Integration
– Inferencing and Interpretation
– Situational Awareness
• Response
– Operational Assessment
– Dynamic Planning and Replanning
– Plan Execution

Outputs: messages to Conventional Control Systems, the Human Collaborator, and Other Autonomous Controllers

• Human-in-the-loop Operations (Collaborates / Commands)
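A minimal sketch of the perceive-then-respond cycle implied by this architecture; it is not ARL's implementation, and every name in it (SituationalPicture, perceive, respond) is an illustrative assumption.

```python
from dataclasses import dataclass

@dataclass
class SituationalPicture:
    """Fused view of the world produced by the perception side."""
    contacts: list

def perceive(sensor_reports: list, messages: list) -> SituationalPicture:
    # Sensor data fusion / information integration (illustrative: just merge the inputs).
    fused = sensor_reports + messages
    # Inferencing, interpretation, and situational awareness would refine this picture here.
    return SituationalPicture(contacts=fused)

def respond(picture: SituationalPicture, mission_plan: list) -> list:
    # Operational assessment: does the current plan still fit the situation?
    if picture.contacts:
        # Dynamic planning and replanning (illustrative: prepend an avoidance step).
        mission_plan = ["avoid nearest contact"] + mission_plan
    # Plan execution: emit commands for the conventional control systems.
    return [f"command: {step}" for step in mission_plan]

# One pass through the loop; a real controller would run this continuously,
# exchanging messages with the human collaborator and other autonomous controllers.
picture = perceive(sensor_reports=["sonar contact bearing 045"], messages=[])
for command in respond(picture, mission_plan=["proceed to waypoint 3"]):
    print(command)
```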


System for Operator Workload Reduction

• We have talked mostly about unmanned systems
• The approach applies across a wide range of operational systems
– Let the operator have ultimate control (allow him to control the levels of autonomy)
– Gain his confidence by …
  • Helping him make better decisions
  • Not misleading him into bad decisions
– Allow him to understand what the system is doing
– Don't make the system more of a burden to operate

• An example of a simple system…


Target Anesthesia/Analgesia Example

• Advisory System
• Human-In-The-Loop
• Information Overload
• Subtle Combinatorial Changes
• Reduce 5-to-1


An example of a more complex system…

Contact Awareness Example

• Reduces workload when making tactical maneuvering decisions
– Reduce manual integration of information
• Reduce time to make the maneuvering decision
– Improve situational awareness (holistic view)
• Improve quality of the tactical decision
– Better situation understanding leads to a better decision
– Traceability to "truth" data
• Provide help for the less experienced operator
– Cue the operator to predicted loss of tactical control
– Incorporate SME expertise in automated recommendations, with the ability to interrogate the recommendation

Contact Collision Threat Level

• CPA (Closest Point of Approach) Concept

• Collision Threats
– Orange to Red
– Level 0-1
• Violation Threats
– Yellow to Orange
– Level 0-1
• No Threat Level
– Green

[Diagram: ranges of 500 yd, 2 kyd, and 5 kyd, shown with speed in the line of sight (range rate) and speed of advance (SOA)]
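As a rough illustration of the CPA concept (not the fielded system's algorithm), the sketch below computes the time and range at the closest point of approach from a contact's position and velocity relative to own ship, then maps the CPA range onto the color bands above. The 500 yd / 2 kyd / 5 kyd values come from the slide, but how the real system combines them with range rate and speed of advance is not specified here, so the mapping is an assumption.

```python
import math

def cpa(rel_pos, rel_vel):
    """Closest point of approach for a contact, given its position and velocity
    relative to own ship (yards, yards/second).  Returns (time_to_cpa_s, cpa_range_yd)."""
    vx, vy = rel_vel
    speed_sq = vx * vx + vy * vy
    if speed_sq == 0.0:                  # no relative motion: range never changes
        return 0.0, math.hypot(*rel_pos)
    # Time at which relative range is smallest, clamped so an opening contact reports now.
    t = max(0.0, -(rel_pos[0] * vx + rel_pos[1] * vy) / speed_sq)
    dx = rel_pos[0] + vx * t
    dy = rel_pos[1] + vy * t
    return t, math.hypot(dx, dy)

def threat_color(cpa_range_yd):
    """Illustrative mapping of CPA range to the slide's color bands; these
    threshold-to-color assignments are assumptions, not the ARL system's rules."""
    if cpa_range_yd < 500:       # inside 500 yd: collision threat
        return "red"
    if cpa_range_yd < 2000:      # inside 2 kyd
        return "orange"
    if cpa_range_yd < 5000:      # inside 5 kyd: violation threat
        return "yellow"
    return "green"               # no threat

# Contact 4,000 yd off the bow, closing at 10 yd/s along the line of sight.
t, r = cpa(rel_pos=(0.0, 4000.0), rel_vel=(0.0, -10.0))
print(f"CPA in {t:.0f} s at {r:.0f} yd -> {threat_color(r)}")
```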
