
ADVANCED DECISION ARCHITECTURES FOR THE WARFIGHTER: FOUNDATIONS AND TECHNOLOGY

EDITED BY PATRICIA MCDERMOTT AND LAUREL ALLENDER

This book, produced by the Partners of the Army Research Laboratory Advanced Decision Architectures Collaborative Technology Alliance, also serves as the Final Report for DAAD19-01-2-0009.


SECTION II

PRESENTING BATTLEFIELD INFORMATION TO WARFIGHTERS


Chapter 8

COMMUNICATION VIA THE SKIN: THE CHALLENGE OF TACTILE DISPLAYS

LYNETTE A. JONES, PH.D.
Department of Mechanical Engineering, Massachusetts Institute of Technology, Cambridge, MA

INTRODUCTION

Tactile communication systems represent a promising technology that can be used to present information to soldiers in a variety of contexts by utilizing a relatively underused sensory channel to convey information that is both private and discreet. Applications of this technology include delivering vibrotactile cues to soldiers to assist in navigation or threat location in the battlefield, providing tactile feedback to increase situational awareness in virtual environments used for training, and employing tactile cues to enhance the representation of information in multi-sensory displays used for planning and decision making. The opportunity to use the sense of touch in these diverse application domains arose as a result of tactile display technologies becoming more sophisticated due to the reduced power requirements of actuators used in the displays, and the option of wireless communication for mobile users (Jones & Sarter, 2008). This in turn made the displays less intrusive and thus more acceptable to users.

Tactile displays have achieved success as communication systems in a variety of contexts: vibrotactile cues delivered to the skin can provide spatial information about the environment to pilots (Rupert, 2000) or alert the driver of a vehicle to an impending collision (Spence & Ho, 2008). The displays are typically composed of a matrix of electromechanical actuators (known as tactors) that are mounted in a vest or waistband and are sequentially activated to provide information about a person's spatial orientation or the position or movement of a vehicle (McGrath et al., 2004). In most of these applications, the tactors are activated at a fixed frequency and amplitude, and the number and location of the tactors simultaneously active is used to convey information. Tactile displays have also been employed in environments in which the visual and auditory communication channels are heavily taxed or in which visual displays are inappropriate because the operator is involved in other activities that require attention. In these latter situations, the sense of touch provides a communication channel that is direct, salient, and private.

The goal of the research conducted for the Advanced Decision Architectures Collaborative Technology Alliance (CTA) was to design and build wireless-controlled wearable tactile displays that could be used to evaluate the contexts and conditions in which tactile communication facilitated soldier performance in the battlefield, in vehicle or robot control, and in information exchange during planning for future combat missions. The domains in which it was envisaged that the displays would be used were those that required the hands to be free for other activities (e.g. holding a weapon, driving a vehicle). The display was therefore designed so that it could be worn on the arms, legs, or torso. The research at MIT focused on five issues: (1) Can tactile signals be used to provide spatial cues about the environment that are accurately localized? (2) How do the location and configuration of the tactile display influence the ability of the user to identify tactile patterns? (3) What is the maximum size of a tactile vocabulary that could be used for communication? (4) Which characteristics of vibrotactile signals are optimal for generating a tactile vocabulary? (5) Can a set of Army Hand and Arm Signals be translated into tactile signals that are accurately identified when the user is involved in concurrent tasks? Within this framework, a series of laboratory and field studies was conducted.

The tactile displays fabricated at MIT were made available to other partners in the CTA, and assistance was provided to help partners conduct experiments. Studies conducted at ARL that have used the MIT tactile display include those that have done the following: investigated the efficacy of tactile and multimodal alerts on decision making by Army Platoon Leaders (Krausman et al., 2005, 2007); analyzed the effectiveness of tactile cues in target search and localization tasks and when controlling robotic swarms (Haas, 2009); evaluated Soldiers' abilities to interpret and respond to tactile cues while they navigated an Individual Movement Techniques (IMT) course (Redden et al., 2006); and measured the effects of tactile cues on target acquisition and workload of Commanders and Gunners and determined the detectability of vibrotactile cues while combat assault maneuvers were being performed (Krausman & White, 2006; White et al., in press). The MIT tactile displays have also been incorporated into multi-modal platforms developed by the University of Michigan, ArtisTech in the CTA test bed, and Alion MA&D for a robotics control environment. Finally, a comprehensive review of tactile displays was written in collaboration with Nadine Sarter at the University of Michigan (Jones & Sarter, 2008).

The organization of this chapter is as follows. The first section provides a brief overview of the basic design features of tactile displays. The next three sections summarize the major findings of a series of experiments that evaluated the use of tactile displays for communicating spatial information, simple instructional cues, and arm and hand signals. The chapter concludes with a discussion of the implications of tactile communication systems for the military.

TACTILE DISPLAY

Actuators and Wireless Tactile Control Unit

The development of tactile displays entails research in a number of areas, each of which impacts the decisions made in related domains. The initial task is the selection and characterization of the actuators used in the display. Different types of actuators have been used to create tactile displays, with small electro-mechanical motors (Figure 8.1) being the most common due to their size, availability, and low power requirements (Jones et al., 2006). The latter is an important issue for applications involving mobile users of tactile communication systems, such as soldiers in the battlefield. Other factors considered when choosing an actuator are its durability and safety.

Pancake motors (see Figure 8.1) were selected for the tactile displays based on the results from experiments in which the performance of several types of small actuators was compared (Jones et al., 2006). Pancake motors vibrate by rotating a mass in a plane parallel to the surface on which the motor is mounted. The motors are encased in plastic (Smooth-On, Inc.) to make them more robust and increase the contact area between the skin and tactor. A wireless tactile control unit (WTCU) was designed and fabricated to control the motors (see Figure 8.1). The WTCU has two main components, a wireless transceiver module for communication with a notebook computer and a microcontroller that receives commands from the wireless module (Bluetooth) and translates them into sequences of motor actuation. The transceiver was required to ensure that the remote unit actually received commands and to determine whether the command was recognized as valid. This feedback is essential because the communication range for Bluetooth devices varies with the type of operating environment: dense urban environments tend to degrade the radio signal more than open terrain. Two motor driver integrated circuits were also required in the WTCU to drive the 16 motors in the display. The circuit was designed to make the most efficient use of the energy available to it. The power consumption of the WTCU with the Bluetooth connected and three motors vibrating is 848 mW (Lockyer & Jones, 2006).

Figure 8.1. (Left) Actuators used in tactile displays, clockwise from top of figure: C2 tactor (Engineering Acoustics Inc.), encased pancake motor, Tactaid tactor (Audiological Engineering Corp.), rototactor (Steadfast Technologies). From Jones & Sarter (2008), reproduced with permission of the Human Factors and Ergonomics Society. (Right) MIT wireless tactile control unit.

The software that operates the tactile display includes a graphical user interface (GUI) for the notebook computer as well as an assembly program for the microcontroller. GUIs were written in Microsoft Visual Basic .NET to run on a notebook computer. They provide the user with a list of commands to transmit to the motors and display the data returning from the display. The software interfaces with the computer’s COM (serial) port, which communicates using the RS-232 protocol.
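To make the command-and-acknowledgment flow concrete, the following Python sketch shows host-side control over a serial link. It is an illustration only: the three-byte frame layout and the acknowledgment byte are hypothetical assumptions, not the actual WTCU protocol, which is documented in Lockyer & Jones (2006).

import serial

ACK = b"\x06"  # hypothetical acknowledgment byte from the remote unit

def activate_tactor(port: serial.Serial, tactor_id: int, duration_ms: int) -> bool:
    """Send one activation command and wait for the remote unit to acknowledge."""
    # Hypothetical three-byte frame: command code, tactor index, duration in 10-ms ticks.
    frame = bytes([0x01, tactor_id, duration_ms // 10])
    port.write(frame)
    reply = port.read(1)      # returns b"" if the timeout elapses
    return reply == ACK       # False: command lost or not recognized as valid

if __name__ == "__main__":
    # A Bluetooth serial-port-profile link appears as an ordinary COM/tty port.
    with serial.Serial("COM3", baudrate=9600, timeout=0.5) as wtcu:
        ok = activate_tactor(wtcu, tactor_id=5, duration_ms=500)
        print("acknowledged" if ok else "no acknowledgment; retry or move closer")

The explicit acknowledgment check mirrors the rationale given above: with a variable Bluetooth range, the host cannot assume a transmitted command was received.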

Characterization of Actuators

The properties of the pancake motors were characterized in a series of experiments in which the forces and accelerations of the motors were measured while they were activated. These studies are essential to determining which stimulus variables (e.g. amplitude, frequency, waveform) can be controlled in tactors being considered for a specific tactile display. It was surprising to find in these experiments that the peak frequency of vibration varied considerably across tactors, ranging from 90 to 174 Hz, as illustrated in Figure 8.2, but that the frequency of vibration of an individual motor was consistent (Jones & Held, 2008).
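The peak-frequency measurement can be illustrated with a short spectral analysis, sketched below under the assumption of a single accelerometer channel; the sample rate and synthetic signal are illustrative, not the apparatus of Jones & Held (2008).

import numpy as np

def peak_frequency(accel: np.ndarray, fs: float) -> float:
    """Return the frequency (Hz) with the largest spectral magnitude."""
    accel = accel - accel.mean()                  # remove the DC offset
    spectrum = np.abs(np.fft.rfft(accel))
    freqs = np.fft.rfftfreq(accel.size, d=1.0 / fs)
    return float(freqs[np.argmax(spectrum)])

# Synthetic check: a motor vibrating at about 130 Hz, sampled at 5 kHz.
fs = 5000.0
t = np.arange(0, 1.0, 1.0 / fs)
recording = np.sin(2 * np.pi * 130.0 * t) + 0.1 * np.random.randn(t.size)
print(f"peak frequency: {peak_frequency(recording, fs):.1f} Hz")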

Figure 8.2. Oscillation frequencies measured for each encased pancake motor. Each line on the x-axis represents a different motor and each shape on the y-axis shows the frequency of oscillation on a single trial. The motors are rank ordered by frequency. From Jones & Held (2008), reproduced with permission of the ASME.

This range of frequencies means that, with judicious selection of tactors, it would be possible to create a display of pancake motors in which frequency encodes information (e.g. urgency of a message, proximity to a target). Measurements of the traveling waves caused by vibrating motors on the skin provide information about the optimal spacing of tactors in a display used for spatial cueing. For these motors, the surface wave is markedly attenuated at 40 mm from the point of activation, and by 60 mm there is very little motion on the skin. A spacing of at least 50 mm would therefore be optimal for this class of tactor if the display is designed to provide precise spatial cues.

COMMUNICATION OF SPATIAL INFORMATION

Tactile displays hold particular promise for presenting spatial information about the environment, such as the location of a threat or the intended direction of navigation. The spatial coordinates of a vibrotactile stimulus delivered to the skin are accurately represented in the central nervous system, and so tactile cues on the body can be used to represent the location of an external event. Van Erp (2005) has shown that a one-dimensional array of tactors worn around the waist is very effective for representing spatial information about the environment, and that it is quite intuitive to perceive an external direction emanating from a single point of stimulation. The ability to localize a point of stimulation varies across the body and is best when tactile stimuli are presented near anatomical landmarks such as the navel or spine (Cholewiak & Collins, 2003; Cholewiak et al., 2004).
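The directness of this mapping can be shown with a small sketch: an external bearing is rendered by activating the nearest tactor on an eight-tactor belt. The conventions that tactor 0 sits over the navel and that indices increase clockwise are assumptions for illustration.

def bearing_to_tactor(bearing_deg: float, n_tactors: int = 8) -> int:
    """Map an external bearing (0 deg = straight ahead) to the nearest tactor index."""
    sector = 360.0 / n_tactors                       # 45 degrees per tactor
    return int(round((bearing_deg % 360.0) / sector)) % n_tactors

assert bearing_to_tactor(0) == 0      # dead ahead: navel tactor
assert bearing_to_tactor(90) == 2     # to the right: tactor over the right hip
assert bearing_to_tactor(350) == 0    # within half a sector of straight ahead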

Experimental Studies

A series of experiments was conducted to determine the accuracy of vibrotactile localization around the waist and on the back, using the WTCU and tactors configured for the two sites tested (Jones & Ray, 2008). Ten healthy participants (five women, five men) aged 19 to 24 years took part in each experiment. None of these individuals participated in both experiments.

Apparatus. Two tactile displays were created using the encased tactors. The waist display comprised eight tactors mounted on a waist band that had a strip of Velcro sewn along its length. The positions of the tactors were adjusted so that tactors were located over the navel, the spine, the right and left sides above the hips, and the mid-points between these locations, as shown in Figure 8.3.

The spacing between tactors ranged from 80 to 100 mm across participants. The display used on the lower back comprised a four-by-four array of tactors mounted on a spandex waist band (see Figure 8.3). The spacing between the center points of the tactors was 40 mm in the vertical direction and 60 mm horizontally. This inter-tactor spacing is considerably greater than the threshold of 11 mm for distinguishing two sites of vibrotactile stimulation on the back (Eskildsen et al., 1969).

Procedure. Participants were first familiarized with the sensations associated with activating each tactor and during the experiment they indicated verbally which tactor had been activated, using a visual representation of the display. Each tactor was randomly activated for 500 ms on each trial and was presented five times, giving a total of 40 trials with the waist display and 80 trials with the back display.
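The trial structure can be sketched directly from these counts; a minimal illustration, assuming nothing about the control software beyond the counts given above:

import random

def trial_schedule(n_tactors: int, repeats: int = 5) -> list[int]:
    """Each tactor appears `repeats` times, in shuffled presentation order."""
    trials = [t for t in range(1, n_tactors + 1) for _ in range(repeats)]
    random.shuffle(trials)
    return trials

print(len(trial_schedule(8)), len(trial_schedule(16)))   # 40 80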

Figure 8.3. Tactile display around the waist (left) and on the back (right). From Jones & Ray (2008), reproduced with permission of IEEE Press.

Results. Participants experienced little difficulty in locating a point of stimulation around the waist, with an overall response rate of 98% correct. This performance is similar to the 92% correct reported by Cholewiak et al. (2004), who also used an eight-tactor belt but with a different tactor (C2 tactor, see Figure 8.1) which vibrates at a higher frequency (250 Hz) and amplitude than the pancake motors.

Localization of a point of stimulation was more difficult with 16 tactors in the array, and on this task, participants were able to identify the correct location of stimulation on only 59% of the trials (range: 40-82%). The locations that were most accurately identified were the two corners in the upper row and those least accurately localized were in the middle of the second row as shown in Figure 8.4.

An analysis of variance indicated that there was a significant main effect of tactor (F(15, 135)=2.12, p=0.01) and that activation of tactors in the first row (most superior position) resulted in more accurate localization than activation of any other row (F(3,27)=4.56, p=0.01). Further analysis of the results revealed that most errors involved mislocalization by a single tactor. When responses were coded in terms of localizing the site of stimulation to within one tactor location, the overall response rate increased to 95% correct. Participants were also more likely to identify the correct column of tactor activation (87% correct) than the correct row (68%), which presumably resulted from the smaller difference in inter-tactor spacing in the vertical (40 mm) as compared to the horizontal (60 mm) direction.

Figure 8.4. The percent of correct responses in localizing which tactor was activated in a 16-tactor array. Tactors were numbered sequentially from left to right, beginning with 1 in the upper left-hand corner of the display (see Figure 8.3). Adapted from Jones & Ray (2008).
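The lenient scoring reported above (localization to within one tactor) can be expressed compactly. In the sketch below, the grid numbering follows Figure 8.4 and the trial data are invented for illustration.

def neighbors(tactor: int, cols: int = 4, rows: int = 4) -> set[int]:
    """The tactor itself plus its horizontal and vertical neighbors in the grid."""
    r, c = divmod(tactor - 1, cols)
    cells = {(r, c), (r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)}
    return {rr * cols + cc + 1 for rr, cc in cells if 0 <= rr < rows and 0 <= cc < cols}

# Invented (stimulated, response) pairs for illustration.
trials = [(1, 1), (6, 7), (11, 15), (16, 12)]
exact = sum(stim == resp for stim, resp in trials) / len(trials)
lenient = sum(resp in neighbors(stim) for stim, resp in trials) / len(trials)
print(f"exact: {exact:.0%}, within one tactor: {lenient:.0%}")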

Discussion. The results from these two experiments indicate that the ability to localize a point of stimulation on the body is affected by the number of tactors in the array and the inter-tactor spacing. When the number of tactors increases and the distance between the tactors decreases, spatial localization becomes more difficult. Several authors (e.g. Ho et al., 2005; Tan et al., 2003) have proposed that tactile cues delivered to different spatial locations on the back could be used to direct visual attention to the location of potential hazards when driving or to highlight on-screen information in safety critical tasks. The results from the present experiments suggest that a 16-tactor array on the back is unable to support precise spatial mapping between a tactile stimulus on the skin and the location of an external visual target. However, it is clear that an array with fewer tactors mounted over the same surface area would function effectively in providing spatial cues.

TACTILE COMMUNICATION – CREATING TACTONS


Tactile displays can also be used to communicate more abstract information, such as instructions (e.g. stop, attention, rally), and for this application it is important to evaluate the most effective way of presenting tactile cues to the user. Tactile warning signals have been implemented in a number of devices and vehicles to alert the user to an impending danger, but more complex information requires a tactile vocabulary. There are five basic dimensions of vibrotactile signals available for creating tactile patterns: frequency, intensity, duration, waveform, and location (Jones & Sarter, 2008). Among these, frequency and duration have most often been used to generate tactile stimuli (MacLean, 2008). For example, by grouping vibrotactile pulses of varying duration together, rhythms can be created that encode cues such as the urgency of a message or the proximity of a vehicle. The development of a tactile vocabulary that can be used to communicate involves identifying what types of information can be recognized with minimal training and determining how messages that are usually conveyed visually or aurally can be converted into tactile patterns. Tactile signals that represent abstract messages are often called tactile icons or tactons, by analogy to visual icons and auditory earcons.
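As a data structure, a tacton can be sketched as a timed sequence of frames, each naming the tactors that fire; the representation and the example rhythm below are illustrative assumptions, not the encoding used by the WTCU.

from dataclasses import dataclass, field

@dataclass
class Frame:
    tactors: tuple[int, ...]     # tactor indices active in this step (WTCU maximum: four)
    duration_ms: int             # how long this step lasts

@dataclass
class Tacton:
    name: str
    frames: list[Frame] = field(default_factory=list)

    def total_duration_ms(self) -> int:
        return sum(f.duration_ms for f in self.frames)

# Invented rhythm for illustration: two short bursts, pauses, then a long burst.
rally = Tacton("rally", [Frame((1, 2), 100), Frame((), 100),
                         Frame((1, 2), 100), Frame((), 100),
                         Frame((1, 2), 400)])
print(rally.name, rally.total_duration_ms(), "ms")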

Tactile Patterns – Factors Influencing Recognition

A series of eleven experiments has been conducted using the WTCU and MIT tactile displays to determine the characteristics of tactile patterns that can be readily learned and identified (Jones et al., 2007, 2009; Lam, 2006; Margossian, 2007; Piateski, 2005; Piateski & Jones, 2005). In this research, the tactile displays were mounted on the arms, around the waist, and on the back, as these sites are readily accessible and the displays do not impede hand or body movements. The site on the body on which a display is mounted depends on the application domain (i.e. mobile users in a battlefield, mobile or sedentary users interacting with computer-based simulations) and factors such as privacy of communication, non-intrusiveness, and display robustness (Jones & Sarter, 2008). An advantage of using the torso is that the displays are discreet, and the communication is private.

One initial concern in selecting the torso as a site to mount displays was whether the added mass associated with carrying backpacks and wearing body armor would negatively affect the ability to detect tactile signals. An experiment was therefore conducted to measure tactile pattern recognition while stationary participants wore the Interceptor Body Armor (IBA) vest. A set of eight tactile patterns was programmed into the WTCU, and participants were required to identify which pattern was presented by the tactile display on the back (Figure 8.3) using the visual template of the patterns (see Figure 8.5). The results from eight participants indicated that the IBA vest had no effect on performance. For the 320 stimuli that participants identified, not a single error in identification was made (Lam, 2006). For this type of display and tactor, the additional mass (6.85-7.79 kg) did not affect performance. It appears that the gap between the IBA vest and the tactile display was sufficient to allow users to perceive the tactile patterns.

Figure 8.5. Visual representation of the patterns used in several experiments with the four-by-four tactor array. The numbers, colors, and arrows all represent the sequence of tactor activation. All patterns are of the same duration, but vary in the number, location, and temporal sequence of activation.

Other experiments revealed that these tactile patterns, created by varying the spatial location and temporal sequence of tactor activation and the number of tactors concurrently active, can be accurately identified and used as cues for navigation and to provide simple instructions. For example, in one field study (Jones et al., 2006), participants had no difficulty in learning that pattern A in Figure 8.5 meant "go straight ahead" and that pattern G signaled "stop at the next waypoint." These studies were illustrative in demonstrating the importance of evaluating tactile pattern recognition in the context of the other patterns within a set that could be presented. The ability to identify a particular pattern varied dramatically (e.g. from 34% to 76% correct), depending on what other patterns were presented concurrently (Piateski, 2005).

Other experiments have been conducted using tactile displays mounted on the forearm, which may be a preferred site for mounting displays that are used when interacting with virtual environments and when interpreting information presented on a computer screen. For tactile displays mounted on the arm, Jones et al. (2009) found that tactile patterns involving tactor activation in two directions (for example, across and up the arm) were misidentified more frequently than patterns in which the course of tactor activation was limited to a single direction (e.g. left-right or up-down). These findings indicate that tactile patterns that limit the course of tactor activation within a pattern to a single direction will be more accurately identified.

In addition to the spatial sequence of tactor activation, changes in the intensity of a vibrotactile signal can be used to create tactile messages. Such cues could convey information about the proximity of a hazard or obstacle when navigating, or the presence of a restricted area during flight. The intensity of a vibratory stimulus can be modulated by increasing the amplitude of a single vibratory stimulus or by varying the number of tactors activated simultaneously. Cholewiak (1979) has shown that the perceived intensity of a vibratory stimulus increases as the number of tactors activated increases from 1 to 64. This suggests that by varying the number of tactors concurrently active in the display, the perceived intensity of the stimulus will vary.

Amplitude discrimination. An experiment was conducted with ten participants in which their ability to discriminate between two vibratory stimuli of varying intensity was measured. The intensity of the vibrotactile stimulus was controlled by varying the number of tactors simultaneously activated. The 16-tactor display was mounted on the back (see Figure 8.3) and participants were required to indicate which of two stimuli presented sequentially was stronger. There were eleven combinations of tactors and each was presented six times. The maximum number of tactors that can be activated simultaneously with the WTCU is four, and so the greatest intensity difference between two patterns was the one in which one and then four tactors were activated.

The results indicated that when there was a difference of at least two activated tactors between two signals (1 vs 3 or 2 vs 4), participants almost always chose the signal with more tactors activated as the stronger stimulus. When the difference in signal intensity was due to a single tactor, the ability to discriminate between two stimuli depended on the number of tactors activated in each stimulus, as can be seen in Figure 8.6. When comparing 1 versus 2 tactors, participants were correct on 88% of the trials, but their performance fell to 73% correct when comparing 2 versus 3 tactors. These findings provide guidance as to how intensity can be encoded reliably in a tactile display, and indicate that for this type of tactor a difference of at least two tactors activated should be used as an intensity cue.

Figure 8.6. The percent of correct responses when choosing the more intense stimulus in a pair of vibrotactile stimuli. The number of concurrently activated tactors is shown above each bar.
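A sketch of how this guideline might be applied to a proximity cue follows. The distance threshold is invented, and the two counts are chosen so that the levels differ by two activated tactors and stay within the WTCU's limit of four concurrent tactors.

def proximity_to_tactor_count(distance_m: float) -> int:
    """Nearer hazards recruit more tactors and therefore feel more intense."""
    return 3 if distance_m <= 25.0 else 1   # 3 vs 1: a difference of two tactors

for d in (10, 60):
    print(f"{d} m -> {proximity_to_tactor_count(d)} tactor(s)")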

TACTILE COMMUNICATION – ARM AND HAND SIGNALS

The Army Hand and Arm Signals represent a code that is presented visually to convey simple commands (e.g. take cover, advance to left) for which tactile analogs could readily be developed. A set of eight arm-and-hand signals was converted into tactile patterns that retained some of the iconic information of the hand signal. For example, the hand signal for “take cover” involves lowering the outstretched arm from above the head to the side and so the tactile analog involved activating the four rows of tactors sequentially down the back as illustrated in Figure 8.7.

Figure 8.7. Schematic representation of tactile patterns, associated arm-and-hand signals, and commands. Adapted from Jones et al. (2009).
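The "take cover" analog can be sketched as a row-by-row sweep of the four-by-four array; the tactor numbering follows Figure 8.4 and the step timing is an assumption for illustration.

def row_sweep(cols: int = 4, rows: int = 4, step_ms: int = 250):
    """Yield (tactors, duration_ms) frames sweeping the array row by row, top to bottom."""
    for r in range(rows):
        yield tuple(r * cols + c + 1 for c in range(cols)), step_ms

for tactors, ms in row_sweep():
    print(f"activate {tactors} for {ms} ms")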

Experimental Studies


In the contexts in which these displays will be used for communication (i.e., mobile soldiers in the battlefield), users must be able to recognize and respond to the meaning of the tactile signal as it is presented while they are engaged in other activities. A series of experiments was therefore conducted to examine tactile pattern recognition while the user was involved in a concurrent task; two of the tasks were physical and one was cognitive. If participants can identify these tactile analogs of the arm-and-hand signals accurately, then this indicates that more abstract information, such as instructions or warnings, can be presented tactually and used in situations in which visual and auditory communication is restricted due to safety or privacy concerns, or to ambient levels of light or noise.

The eight tactile patterns (Figure 8.7) were presented on the four-by-four tactor display mounted on the back (see Figure 8.3). Each pattern was presented three times. Ten adults (five men and five women) aged between 18 and 38 years participated in the experiments.

Procedure. Participants were first trained to associate the tactile pattern with its meaning. During this phase, which lasted approximately five minutes, participants viewed the schematic of the patterns and associated hand signals as illustrated in Figure 8.7. During the experiments, participants did not view the visual template and had to respond to the tactile signal by verbally identifying the pattern. In one set of experiments, participants were either walking or jogging while the tactile stimuli were delivered, and in the other they were engaged in a word-generation task on a computer.

Results. The results from the three experiments are shown in Figure 8.8. The overall mean correct response rates were 91% (walking), 91% (jogging) and 93% (computer task), which demonstrates that participants were able to identify tactile patterns despite the performance of a concurrent task and that the body movements associated with walking and jogging did not impair perception. Moreover, in these experiments participants were not alerted about the upcoming tactile signal, and were still able to process the tactile cues accurately. Most participants reported that they remembered the tactile cues by visualizing the pattern and attended to the location at which the signal began.

Tactile communication systems are often evaluated in terms of their information transmission capabilities. Information transfer (IT) measures the increase in information about a signal transmitted that results from knowledge of the received signal (Tan et al., 1999). For tactile communication systems involving the hand, the static IT varies with the dimensionality of the tactile stimuli and the number of fingers involved in perceiving the stimuli (Tan et al. 1999). The IT measured on the back during this experiment ranged from 2 to 3 bits for stimuli that are considered to be two-dimensional. These values are comparable to those measured for other tactile communication systems presenting stimuli with similar dimensionality.
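For reference, the static IT estimate of Tan et al. (1999) is computed from a stimulus-response confusion matrix. The sketch below uses an invented perfect-identification matrix to show that eight equiprobable, perfectly identified signals yield log2(8) = 3 bits.

import numpy as np

def information_transfer(confusion: np.ndarray) -> float:
    """Static IT estimate (bits) from a stimulus-response confusion matrix."""
    n = confusion.sum()
    row = confusion.sum(axis=1, keepdims=True)    # stimulus totals
    col = confusion.sum(axis=0, keepdims=True)    # response totals
    p = confusion / n
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = p * np.log2(confusion * n / (row * col))
    return float(np.nansum(terms))                # empty cells contribute zero

# Perfect identification of 8 equiprobable signals yields log2(8) = 3 bits.
print(f"{information_transfer(np.eye(8) * 30):.2f} bits")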

Figure 8.8. Group mean percentage of correct responses in identifying tactile patterns representing arm-and-hand signals while walking (white bars), jogging (gray bars), and performing a cognitive task (black bars).

CONCLUSIONS

The results from this research on tactile displays clearly demonstrate that simple navigational and instructional commands presented tactually on the arm or back can be identified quickly and accurately. Much of the previous research on tactile communication systems has focused on the more sensitive skin on the palmar surface of the hand and fingers and so the present results are important in demonstrating the capabilities of other areas on the body for communication. These sites are non-intrusive, do not impede movements, and have the added advantage that the displays can be worn under clothing and so are protected and concealed.


The configuration of the tactile displays used in this research (i.e., belt, sleeve, vest) varied as a function of the site on the body on which the display was mounted. This in turn influenced the ability to identify tactile patterns. For tactile pattern recognition, the forearm was determined to be inferior to the back. This probably reflects the smaller inter-tactor distance on the arm (24 mm as compared to 40-60 mm on the back), the reduced intensity of the tactile stimulus with three rather than four tactors activated for most patterns, and the shorter duration of most of the patterns presented on the arm (Jones et al., 2009). Nevertheless, with appropriate selection of patterns, participants were able to identify accurately up to eight patterns displayed on the arm.

An important question in the development of tactile communication systems concerns the potential size of the vocabulary that could be used for communication. A number of experiments were conducted using the four-by-four tactile display mounted on the back in which the accuracy of pattern recognition was evaluated as a function of the number of patterns presented. To date, the maximum vocabulary tested had 15 unique elements, and participants were able to identify these with an accuracy of 96% correct (Jones et al., 2007). Further experiments with larger vocabularies will need to be undertaken to determine the upper limit of a tactile communication system.

It was also important to establish in this research that participants did not need extensive training to become familiar with the tactile patterns presented. Clearly, for this technology to be useful for soldiers, learning a tactile vocabulary should not require a prolonged period of training. With the exception of a tactile alert, tactile patterns do not have intrinsic meaning, and so users will always need to learn the instruction or cue associated with a particular pattern. In all of the experiments conducted to date, the time required to become familiar with the patterns was less than five minutes, and participants had no difficulty in relating a tactile pattern to a simple instruction or command.

The tactile displays used in these experiments stimulated the skin at a fixed frequency and amplitude, and the location and number of tactors concurrently active were varied to convey information. These two variables were used to create tactile patterns as they represent dimensions that are reliably perceived. Other dimensions of vibrotactile signals, such as frequency and the temporal pattern of activation, hold promise as additional cues that could be used to create tactile vocabularies in future research.

This research has demonstrated the potential of tactile communication systems for the Army, has highlighted the application areas that hold most promise, namely spatial cueing and navigation, and has shown that a simple tactile vocabulary can be understood and responded to as people are engaged in other activities.


REFERENCES

Cholewiak, R.W. (1979). Spatial factors in the perceived intensity of vibrotactile patterns. Sensory Processes, 3, 141‐156.

Cholewiak, R.W., & Collins, A.A. (2003). Vibrotactile localization on the arm: Effects of place, space, and age. Perception & Psychophysics, 65, 1058‐1077.

Cholewiak, R.W., Brill, J.C., & Schwab, A. (2004). Vibrotactile localization on the abdomen: Effects of place and space. Perception & Psychophysics, 66, 970‐987.

Eskildsen, P., Morris, A., Collins, C.C., & Bach‐Y‐Rita, P. (1969). Simultaneous and successive cutaneous two‐point threshold for vibration. Psychonomic Science, 14, 146‐147. 

Haas, E. (2009, February). Review of multimodal (audio, tactile and visual) display research. Paper presented at the Tactile Workshop, Human Research and Engineering Directorate, U.S. Army Research Laboratory, Aberdeen, MD.

Ho, C., Tan, H.Z., & Spence, C. (2005). Using spatial vibrotactile cues to direct visual attention in driving scenes. Transportation Research Part F: Traffic Psychology and Behaviour, 8, 397‐412. 

Jones, L.A., & Held, D.A. (2008). Characterization of tactors used in vibrotactile displays. Journal of Computing and Information Sciences in Engineering, 8, 044501‐1‐5.

Jones, L.A., & Ray, K. (2008). Localization and pattern recognition with tactile displays. In Proceedings of the Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, (pp. 33‐39). Los Alamitos, CA: IEEE Computer Society.

Jones, L.A., & Sarter, N. (2008). Tactile displays: Guidance for their design and application. Human Factors, 50, 90‐111. 

Jones, L.A., Kunkel, J. & Piateski, E. (2009). Vibrotactile pattern recognition on the arm and back. Perception, 38, 52‐68.

Jones, L.A., Kunkel, J., & Torres, E. (2007). Tactile vocabulary for tactile displays. In Proceedings of the second joint Eurohaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, (pp. 574‐575). Los Alamitos, CA: IEEE Computer Society.

Jones, L.A., Lockyer, B., & Piateski, E. (2006). Tactile display and vibrotactile pattern recognition on the torso. Advanced Robotics, 20, 1359‐1374.

Krausman, A.S., & White, T.L. (2006). Tactile displays and detectability of vibrotactile patterns as combat assault maneuvers are being performed (ARL‐TR‐3998). Aberdeen Proving Ground, MD: U.S. Army Research Laboratory. 

Krausman, A.S., Elliott, L.R., & Pettitt, R.A. (2005). Effects of visual, auditory, and tactile alerts on platoon leader performance and decision making (ARL-TR-3633). Aberdeen Proving Ground, MD: U.S. Army Research Laboratory.

Krausman, A.S., Pettitt, R.A., & Elliott, L.R. (2007). Effects of redundant alerts on platoon leader performance and decision making (ARL-TR-3999). Aberdeen Proving Ground, MD: U.S. Army Research Laboratory.

Lam, A. (2006). Vibrotactile pattern recognition on the torso with one and two dimensional displays. Unpublished bachelor’s thesis, Massachusetts Institute of Technology, Cambridge, MA.

Lockyer, B., & Jones, L.A. (2006). Operation manual for the MIT Wireless Tactile Control Unit (WTCU). Cambridge, MA: Massachusetts Institute of Technology.

MacLean, K.E. (2008). Foundations of transparency in tactile information design. IEEE Transactions on Haptics, 1, 84‐95.


Margossian, C. (2007). Vibrotactile pattern recognition on the torso: Effects of concurrent activities. Unpublished bachelor’s thesis, Massachusetts Institute of Technology, Cambridge, MA.

McGrath, B.J., Estrada, A., Braithwaite, M.G., Raj, A.K., & Rupert, A.H. (2004). Tactile situation awareness system: Flight demonstration final report (USAARL Report No. 2004-10). Fort Rucker, AL: U.S. Army Aeromedical Research Laboratory.

Piateski, E. (2005). A tactile communication system for navigation. Unpublished master’s thesis, Massachusetts Institute of Technology, Cambridge, MA.

Piateski, E., & Jones, L. (2005). Vibrotactile pattern recognition on the arm and torso. In Proceedings of the first joint Eurohaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, (pp. 90‐95). Los Alamitos, CA: IEEE Computer Society.

Redden, E.S., Carstens, C.B., Turner, D.D., & Elliott, L.R. (2006). Localization of tactile signals as a function of tactor operating characteristics (ARL‐TR‐3971). Aberdeen Proving Ground, MD: U.S. Army Research Laboratory.

Rupert, A.H. (2000). An instrumentation solution for reducing spatial disorientation mishaps. IEEE Engineering in Medicine and Biology Magazine, March/April, 71‐80.

Spence, C., & Ho, C. (2008). Tactile and multisensory spatial warning signals for drivers. IEEE Transactions on Haptics, 1, 121‐129.

Tan, H.Z., Durlach, N.I., Reed, C.M., & Rabinowitz, W.M. (1999). Information transmission with a multifinger tactual display. Perception & Psychophysics, 61, 993-1008.

Tan, H.Z., Gray, R., Young, J.J., & Traylor, R. (2003). A haptic back display for attentional and directional cueing. Haptics-e, 3, Article 1. Retrieved November 2, 2007, from http://haptics-e.org/Vol_3/he-v3n1.pdf

Van Erp, J.B.F. (2005). Presenting directions with a vibrotactile torso display. Ergonomics, 48, 302‐313.

White, T.L., Kehring, K.L., & Glumm, M.M. (in press). Effects of unimodal and multimodal cues about threat locations on target acquisition and workload. Military Psychology.

 
