Conveying Robot Navigation Intention to Humans

Alyxander David May
MAY11213081

Computer Science, MComp
The University of Lincoln

April 2015

Acknowledgments

First and foremost I would like to thank my supervisor, Dr Marc Hanheide; even before the onset of this project he showed the faith in me that I needed to push through. Without his invaluable support in robotics, and more so in the correct research methodologies to follow, this project could not have been achieved to this level.

Without Christian Dondrup's support and in-depth knowledge of ROS, Python and much more, I would not have been able to meet my deadlines. Always approachable and able to offer help when needed, his expertise was of vital importance to me.

Matthew Ashton, for providing me with all I needed to create a functional workspace and to continue my work through to the small hours of the morning.

Thanks to all the members of the STRANDS team for all the hard work they have done in developing the Scitos G5 to its current state, allowing this project to be run using one of their robots.

The Lincoln Centre for Autonomous Systems and Professor Tom Duckett, for allowing me to use their office and workspaces for the lifecycle of this project.

Finally, thanks to all my fellow peers for pushing me through this project; special thanks to all those who participated in the experiment.


Abstract

The general aim is to gauge and measure, in terms of comfort felt and ease of understanding, how human-robot interaction can be enhanced by implementing and empirically evaluating methods of expressing navigational intent on a mobile robot. Three behaviours were developed for displaying navigational intent: No Signal, Indicators and Move Head. These behaviours were then evaluated by designing and running an experiment. The robot used was a Scitos G5 with a Human Machine Interface and indicators attached.

The experiment ran over the course of three days with ten participants, who each filled out a survey attempting to ascertain: how easy it was to understand the intent of the robot, how quickly they could understand the robot, and how comfortable they felt around the robot. Of the behaviours, Indicators was the most preferred in each aspect, with a statistically significant mean difference compared to No Signal and a higher average score on each aspect than Move Head. The minimum distance kept from the robot was also analysed: participants kept a statistically significantly greater distance when the robot was using Indicators compared to No Signal, and a greater average distance compared to Move Head.

Participants were also asked to choose which of the behaviours they would most like to see implemented as a standard for social robotics. Four selected Move Head, as they believed it felt most natural; all had self-confessed little or no knowledge of robotics. Six selected Indicators, as they believed it was the most obvious; most had some or more robotics knowledge.


Table of Contents

Acknowledgments
Abstract
List of Tables & Figures
1 Introduction
1.1 Background
1.2 Aims & Objectives
1.2.1 Aim
1.2.2 Objectives
1.3 Rationale
2 Literature Review
2.1 Human-Robot Interaction
2.2 Human-Robot Spatial Interaction
2.3 Robot Navigational Signals
3 Design
3.1 Project Management
3.1.1 Traditional
3.1.2 Supervision & Deliverables
3.1.3 GitHub
3.2 Research Methodology
4 Software Development
4.1 Tools & Machine Environments
4.1.1 Robot Operating System
4.1.2 Ubuntu
4.1.3 Python
4.1.4 Robots
4.1.5 SCITOS G5
4.2 Development Methodology
4.2.1 Incremental Development
4.2.2 Action Lib
4.2.3 Testing
5 Experiment
5.1 Design
5.1.1 Physical
5.1.2 Robot Setup
5.1.3 Participants
5.2 Results
5.3 Evaluation
5.3.1 Questionnaire Sections 1, 2 & 3
5.3.2 Questionnaire Section 4
5.3.3 Robot Data
5.3.4 Summary
5.4 Limitations
5.5 Future Work
5.6 Conclusion
6 Personal Reflection
7 Bibliography
8 Appendices


List of Tables & Figures

Figure 1 - Traditional Project Management (Project Lifecycle Services Ltd, 2014)
Figure 2 - Empirical Research Cycle (Explorable, 2015)
Figure 3 - Rovio (Robotdom, 2015)
Figure 4 - Turtlebot (Turtlebot, 2015)
Figure 5 - inMoov (3diot, 2014)
Figure 6 - Pioneer 3-AT (Unmanned Vehicle Centre, 2015)
Figure 7 - Scitos G5 (Lincoln Centre for Autonomous Systems, 2015)
Figure 8 - Robot Comparison Matrix
Figure 9 - Waterfall vs Incremental (Bittner, 2006)
Figure 10 - Client-Server Interaction (ROS, 2015)
Figure 11 - ROS Topic Graph
Figure 12 - Move Head Behaviour: Left, Straight, Right
Figure 13 - Indicate Behaviour: Left, Straight, Right
Figure 14 - Available Robot Paths
Figure 15 - Survey Results
Figure 16 - Robot Area of Interest for Minimum Distances
Figure 17 - Minimum Distance from Robot (m)
Figure 18 - Results Question 1
Figure 19 - Score Differences for Question 1
Figure 20 - Results Question 2
Figure 21 - Score Differences for Question 2
Figure 22 - Results Question 3
Figure 23 - Score Differences for Question 3
Figure 24 - Average Minimum Distances
Figure 25 - Behaviour Distribution


1 Introduction

1.1 Background

Robots are becoming increasingly common in industrial environments: "…up to 2008 about 63,500 service robots for professional use were sold during a period of more than 12 years. However, during the past five years some 100,000 service robots for professional use were sold according to the results of these statistics." (International Federation of Robotics, 2014). Alongside this, it is believed that robotics for the elderly and disabled will increase, with a forecast of 12,400 units to be sold in 2014-2017: "This market is expected to increase substantially within the next 20 years." (VDMA, 2013). With this increase in robots, it becomes more important to look at Human-Robot Interaction (HRI).

A key principle of mobile robotics is achieving safe operation in the presence of humans (Steinfeld, et al., 2006). Robots are currently able to navigate successfully through environments, including circumnavigating obstacles. However, humans must be treated differently to ensure they feel safe and comfortable around robots. "To improve the robot behaviour, we conducted a human-human experiment to find a socially plausible strategy to behave in such situations." (Kruse, et al., 2012). This study by Kruse et al. looked at how people perceived each other's movements in a path-crossing scenario and what they preferred. This is where the need for Human-Robot Spatial Interaction (HRSI) comes in: not only do robots need to be aware of humans, they need to navigate around them in a way humans are comfortable with. By implementing a direct navigational intention signal on a robot, humans will have a better understanding of what the robot intends, which leads to better perception of and comfort with robots. A study carried out by Breazeal et al. identified that humans overall had a more positive interaction with a mobile robot which gave them non-verbal signals, i.e. a head nod or an eye blink, compared to no signal (Breazeal, et al., 2005).

With the advance in robotics, we increasingly use robots to fulfil tasks we as humans consider too dangerous, fatuous or inherently tedious. One area where this can be seen is the care industry, where HRSI is paramount for the safety of people who are ill, disabled or elderly. The STRANDS project, an EU commissioned research project, is currently delving into this environment: "STRANDS will produce intelligent mobile robots that are able to run for months in dynamic human environments." (Automation and Control Institute, 2015). One of the partners for this project is Haus der Barmherzigkeit (which translates to 'House of Mercy'), an Austrian elderly care home in Vienna. Clients of Haus der Barmherzigkeit may suffer from various cognitive and/or motoric disabilities, which may mean they have impaired movement and/or judgement (Haus der Barmherzigkeit, 2014). Therefore, a smooth, easy-to-understand action from a mobile robot in their environment is imperative to their comfort; more so as it will be their own home. Another part of the STRANDS project is looking into mobile security, working alongside security guards: "This is the first time that an autonomous robot has been deployed in a working office environment to do a real job." (G4S, 2014). Again, in this environment it is imperative that the robot is aware of humans and how to interact and navigate around them, with other humans also completing tasks alongside it.

1.2 Aims & Objectives

1.2.1 Aim

The general aim is to gauge and measure, in terms of comfort felt and ease of understanding, how human-robot interaction can be enhanced by implementing and empirically evaluating methods of expressing navigational intent on a mobile robot. Three different methods of expressing navigational intention will be implemented and empirically reviewed. With these implementations it is hoped improvements can be made in how humans perceive the navigational intent of mobile robots through more legible signalling, as well as giving humans more confidence working around or with mobile robots as they become part of everyday life. By implementing various techniques of non-verbal communication, hypotheses surrounding each of these implementations will be formed.

An experiment will be used to ascertain the research data. By applying quantitative and qualitative research methods, hypotheses will be tested and evaluated to see if there appear to be any significant differences between the implementations.

Some of the key aspects to look at are: 'Ease of understanding the behaviour', 'Speed of understanding the behaviour', 'Was the subject comfortable around the behaviour', etc. As can be seen from these questions, the main aim is to ascertain participants' views on the various behaviours; this is a very person-centred approach, with the participant being the key factor rather than how the robot responds.

1.2.2 Objectives

1) Continue the literature review to identify three methods of expressing navigational intent that are legible, safe and effective, and form hypotheses for later testing
2) Implement the chosen methods using ROS in Python and test for reliability using either a robot or a simulation environment
3) Create and run an experiment to test human perception of each implementation, with an appropriate post-study survey for data gathering
4) Use appropriate quantitative and qualitative methods to review the data acquired from the experiment and test it against the hypotheses

1.3 Rationale

The study is inherently a research project, in that its key principle involves delving into an area which is currently strongly researched: HRI, and more specifically HRSI. As a research project, it is important to understand that the key components are the results attained during the experiment and the reflections they cast on the implemented system, rather than the system as a whole.

The hope of the project is to answer questions about how signals can improve the comfort felt by humans around mobile robots. The work will look to answer questions that have been posed by Dondrup et al. by looking at the psychology behind using different signals to achieve more comfortable navigational intent (Dondrup, et al., 2015). Principles will be used including signals previously implemented and discussed by Peters et al., who have left scope for their research to be continued alongside Dondrup et al.'s, with signals being implemented from both (Peters, et al., 2011). This gap in research can be seen in a recent survey of human-aware navigation by Kruse et al.: the survey finds much in the way of robots navigating to and around humans and the comfort attached to this, but navigational intent signals are not addressed (Kruse, et al., 2013).


2 Literature Review

One of the biggest issues to remember whilst conducting any research with robots is that they have limitations, and you have to work within these limitations. As described by Dudek and Jenkin, "It is interesting to note that fictional robots usually do not suffer from the computational, sensing, power, or locomotive problems that plague real robots." (Dudek & Jenkin, 2010). Although this work should not be resource hungry, it is still an important aspect to remember in any robot-related research: results might not be perfect due to various limitations.

On the other hand, it is also important to understand the scope of current robotics. Robots range from replacing the aging workforce in the automobile manufacturing industry (Byrant, 2014), to helping with interpretation and the transport of food to people in West Africa during the Ebola outbreak (Gaudin, 2014), to helping with bomb disarmament and disposal in war zones (Defence, 2014), and many more in between. For the purpose of this project a mobile personal robot will be used.

2.1 Human-Robot Interaction

Within social robotics, some people are more experienced in interacting with robots than others; this can lead to possible positive and negative connotations for research, but also bias. A study conducted by Hall et al. concluded that "Participants who self-reported greater robotics knowledge reported higher overall engagement and greater success at developing a relationship with the robot." (Hall, et al., 2014). This shows the importance of ascertaining people's robotics experience in social robotics research. This work wishes to look at the comfort felt by participants (amongst other aspects), thus it is important to ascertain the robotics experience of participants during the experiment.

Human perception of robots is constantly changing, and this can lead to a more natural and comfortable interaction for humans. Tranberg Hansen et al. show this change in perception through research into gesture recognition with mobile robots; specifically, they attempt to create a robot system that analyses a participant's posture and interacts with them accordingly (Tranberg Hansen, et al., 2009). The system shows clear machine learning, with the ability to adapt to different human behaviours over time. A similar study by Svenstrup et al. uses a comparable approach to gauge human interest in an interaction, using a three-variable Case Based Reasoning system similar to that used by Tranberg Hansen et al. (Svenstrup, et al., 2009). Both of these studies show examples of a robot learning to perceive humans in different scenarios to understand their needs. This work intends to look at the inverse: how humans perceive the robot and understand the needs and intention of the robot, in the form of navigational intent.

Robot signalling is an increasingly researched area. This can be seen in the work of Mutlu et al., who investigated the psychological effects of a robot using gaze cues for turn-taking conversation in a group (Mutlu, et al., 2009). The study concludes that those in the conversational groups "…also felt more acknowledged, welcomed, and valued by their group, and that they belonged more to the group…". Similar work has been done by Bennewitz et al., who implemented an informative humanoid robot to give information to patrons at a museum (Bennewitz, et al., 2005). The experiments found that "Almost all people found the eye-gazes, gestures, and the facial expression human-like and felt that Alpha was aware of them." (Alpha was the humanoid robot used in the study). Both of these show that humans feel more involved in a conversational situation when receiving signals or gazes from a robot. It is the hope of this work to see if these concepts can be transposed to navigational signals, increasing the comfort felt by humans receiving them. Another point raised by these studies is that eye gaze feels natural; as such, eye gaze will be one of the signals used in this study to display navigational intent.

A study at the Massachusetts Institute of Technology by Breazeal et al. conducted an experiment in which participants were tasked to teach a robot the location of buttons and to press the corresponding button (Breazeal, et al., 2005). The control group received no feedback from the robot, whereas the other test group received a visual facial expression: a nod for understanding and a confused face for non-understanding. "H1: Subjects are better able to understand the robot's current state and abilities in the IMP+EXP case"; this hypothesis states the authors' belief that participants will better understand the robot when it expresses understanding using facial expression. From the results it was concluded: "There was a significant difference between the two manipulators on answers to questions about subject's ability to understand the robot's current state and abilities. Thus Hypothesis 1 is confirmed…" The study shows that humans are better able to understand the intention of a robot when it displays signals to them. This work intends to follow on from these principles to see if they apply to other aspects of HRI, specifically spatial interaction, and whether signals improve the understanding of navigational intent.

2.2 Human-Robot Spatial Interaction

Torta et al. conducted an experiment in which a human sat with a humanoid robot approaching from various angles and distances to initiate communication (Torta, et al., 2011). The study showed that the optimal approach angle for participant comfort was directly in front rather than from the side. A similar study by Walters et al. used standing participants and a mobile robot (Walters, et al., 2011). The effects looked at here were long-term comfort rather than short-term; results showed that participants felt more comfortable approaching the robot in a confined space than being approached by the robot in the same confined space. This work will continue the themes shown in these studies: the principle of a head-on encounter, similar to a corridor passing behaviour, will be taken from Torta et al., whilst also allowing the participants to move in and around the robot in whatever way feels comfortable to them. The work attempts to build on what has been previously shown by investigating in which scenarios humans feel more comfortable around mobile robots.

The distance, speed and direction a robot moves in relation to a human are important in achieving the most comfortable HRSI. Work by Pacchierotti et al. looks into the feelings of humans at varying distances and speeds in a hallway setting (Pacchierotti, et al., 2005). The results show that "The best behaviour was, according to all the subjects, behaviour 8 (see Table III), i.e. the one with highest speed and largest signaling and lateral distances". Although the scope of this work doesn't include varying speeds and distances, it is an important point to note: participants' feelings may be greatly affected by the speed and/or distance the robot takes in its avoidance path. For this reason, the same speed, angle and path will be taken by each different behaviour, to ensure consistency.


What kind of behaviour do humans expect from moving robots? One aspect of this question, a path-crossing scenario, was investigated by Lichtenthäler et al.: they looked at how humans wanted a robot to react in a direct path-crossing situation and created an experiment to test various implementations (Lichtenthäler, et al., 2013). The authors concluded that the ideal situation is where the robot heads directly towards the goal and only changes its path, by halting, if it would invade the personal space of a crossing human. This was also the scenario in which the participants felt most comfortable. A similar study by Dondrup et al. looked at whether hesitation signals could be seen in a path-crossing scenario (Dondrup, et al., 2014). The results show "that hesitation signals can be found in head-on encounters during pass-by scenarios"; these could lead to increased stress and less comfortable feelings around robots. In both of these experiments, the concept of navigational intent signals could have made a difference to how comfortable and confident participants felt around the robots. This work will attempt to look at the comfort felt during the pass-by scenario and evaluate any differences between various forms of navigational intent.

One way being investigated of improving the comfort felt by humans around social robots is to look at robots exhibiting more "natural" behaviour. Saulnier et al. look at using robot body language to catch people's attention. In their study a robot approached a group of two people and exhibited one of three behaviours: ignore and pass by, approach with quick and erratic behaviour, or approach slowly. The participants confirmed they felt the navigational behaviour conveyed messages from the robot (Saulnier, et al., 2011). Althaus et al. conducted a study in which a robot approached the middle of a group of people with its body facing toward the group; the subjects felt the behaviour exhibited by the robot was natural (Althaus, et al., 2004). Finally, Satake and Hayashi conducted experiments with robots approaching humans in a shopping centre (Satake, et al., 2009) & (Hayashi, et al., 2011). They found people felt it was most comfortable for the robot to approach from the front. It is important to remember the principle of "The Uncanny Valley", in which Mori describes the phenomenon that robots becoming too human-like can cause discomfort (Mori, et al., 2012). These studies show that humans do feel more comfortable around robots conforming to natural human behaviour, to an extent. One of these natural behaviours is to look where we are going; this can be implemented on a mobile robot by having it "look" in the direction it intends to move, as a form of navigational intent.

2.3 Robot Navigational Signals

Peters et al. performed a study based around a pass-by scenario in a hallway; participants were asked how they felt about the manner in which a mobile robot should express navigational intent (Peters, et al., 2011). The navigational signals used in the experiment were: sideways motion, forward or backward motion, stopping, screen signals or camera motion. The results showed that forward, backward and sideways motion were favourable, with a combined 49% of people preferring them, compared to 22% for screen signals and less than 10% each for the rest. A question in the survey asked users if there was another way they would like to see navigational intent expressed; 60% of people responded with indicators. Indicators are a simple but effective and well-known medium for expressing navigational intent that most people will be aware of from transport, so they will be implemented on a robot for this study.

The second experiment in a study conducted by Dondrup et al. looks at the way in which humans interact with a robot in a corridor-passing scenario (Dondrup, et al., 2015). Two participants were involved, both human; however, one was dressed in a robot costume with their eyes and face hidden. The "robot" received instructions on movement and collision avoidance via a set of headphones. The other participant was naïve to the end goal of the experiment; they were just instructed to cross the corridor without colliding with the "robot" and with as little movement as possible. The "robot" had a tablet positioned on its chest which, after the start of each trial, could display either eyes looking right or left, indicators (again right or left) or no signal. At no point during the study did they attempt to deceive the participant, i.e. indicate left and then move right. The "robot" would also vary when it signalled: either 1 s, 1.5 s or 2 s into the trial. The purpose of the study was to investigate different aspects of HRSI using Qualitative Trajectory Calculus state chains; however, the psychological aspects of the differences in human interaction during the second experiment weren't looked at. The authors stated: "Some of the more interesting phenomena in the experiments, especially the "Bristol Experiment", like if the indicators had an effect on the interaction between the two agents or if the timing was important for the use of the indicators, will be investigated in more psychology focused work." ("Bristol Experiment" refers to the second experiment of the work, described in this paragraph.) The experiment for this work will attempt to follow the same strategy as that second experiment, keeping as many factors the same as possible; however, for the purpose of this work, only the various signals will be examined, not the timings as well, thus keeping to a single variable. Two of the behaviours present in that study will be used: the indicators and the eyes. However, they will be physically implemented rather than displayed on a screen.

3 Design

3.1 Project Management

3.1.1 Traditional

This project will use the traditional project management approach. As the project is a research project with coding, experiment and evaluation phases, it seems appropriate to use a rigid methodology. As the project is neither client nor user driven, an agile methodology isn't needed; the requirements are unlikely to change in any significant manner until the entire process is completed. The stages can be broken down into: Aims & Objectives (Business Requirements), System Requirements, Design, Development (Build), Experimental Analysis (Test) and Evaluation (Deploy). The cycle, as seen below in Figure 1, is waterfall in style, with one aspect being completed before the next is started.

Figure 1 - Traditional Project Management (Project Lifecycle Services Ltd, 2014)


Aims and Objectives have already been completed in the project proposal and are listed earlier in this report.

System Requirements follows, in which the literature review is a major aspect. The review shows what research has been conducted and the gaps in it, thus hinting at research that needs to be continued. The system requirements for this project are based around an extension of the aims and objectives and the gaps in the literature review. The system requirements are as follows:

1) Implement a way of using indicators to signal navigational intent
2) Implement a way of using the head to signal navigational intent
3) Create a randomized testing strategy capable of handling an unspecified number of subjects

Design is the next aspect. This will not just include software design, but also the design of the experiment and of the methodologies used in the later evaluation of the project. The software will be designed around the system requirements, using established frameworks. The software for this project is, however, minimal; there is greater interest in the experiment than the actual software, so the design of the experiment is pivotal. The experiment design will attempt to remove as much bias as possible, as well as running in an ethical, professional and appropriate manner; this should allow greater reliability of the results attained. Finally, the appropriate paperwork for the subjects in the experiment must be designed, including a consent form, a demographic form and a post-experiment survey. The forms must include all relevant information as well as being informative and ethically correct by informing participants of their rights.

Building, again, is not entirely software based, as the experiment is such a vital part. The first aspect of the build will be the software development; this allows a working prototype in either a simulation or a real-world situation, giving the basis for the experiment setup. In addition to the software, extra hardware will need to be added and mounted to the robot: 5V indicators will be added to allow the robot to indicate, as well as a rotating head. Further changes to the software will be made after the addition of the indicators and head. Following on from the software and hardware, the experiment will be designed, along with minor changes to the software to incorporate the location of the study. The experiment will be in a single self-contained and controlled environment, using the same robot throughout to allow for greater accuracy of results. The final stage will be to record data from the robot of the position of the subject in the experiment.

Experimental Analysis is one of the most important aspects of the research. All attempts will be made to remove bias from the experiment to ensure as high-quality a data set as possible. The experiment will occur in the Lincoln Centre for Autonomous Systems research laboratory over the course of a few days. Participants will be handled on an individual basis, to stop participants who have experienced the experiment sharing their opinions with those who haven't. This stage also includes collating the data from the experiments into a format that can later be reviewed; this includes the survey questions and the positions of the participants relative to the robot. All the data will be tabulated, and appropriate graphs, paired t-tests etc. will be used to create the final results.

Evaluation is the last element, and starts by discussing and concluding on the results attained. A key part of the evaluation is to look at the limitations of the project and ways it could be improved if undertaken again. Most importantly, the evaluation should point towards possible extensions that could be explored with the knowledge attained from this iteration. Finally comes a critical reflection, in which the project management methodology will be critiqued and reflected upon as to whether it was appropriate in the context of this project, as well as what could have been done differently from a project management perspective.

3.1.2 Supervision & Deliverables

To ensure that the project continues as expected and time frames are met, regular supervision with the project supervisor will take place. Meetings will happen on at least a fortnightly basis, giving time for decent strides to be taken, but not long enough to fall majorly behind the goals. As well as regular meetings, informal contact will be used if any problems arise within the project that don't require face-to-face time; this will be done predominantly by email and could cover any small matters. Should a matter arise that does require extra contact time, additional meetings can be planned on a pro re nata basis.


At various times during the project, deliverables will be needed rather than just communication; a summary of goals and time frames can be found, along with a Gantt chart, in Appendix items D and E respectively. The main deliverables will be: software, evidence of functional software, the experiment plan and the results. The software will be hosted on a code revision site to which the supervisor will have public access, so it can be reviewed at any time. The software will be run in a simulator environment that can either be viewed at a meeting or recorded as a video sent to the supervisor; evidence of functionality will be needed before testing on a real robot. The experiment plan will be delivered with a physical showcase, including an explanation of why it has been prepared in the manner shown, and a copy of all the relevant paperwork that will be used in the course of the experiment. Results will be shown after the experiment in the form of graphs and charts, which are easier to understand and digest than raw data. Finally, all documents will be available for the supervisor to view on a file hosting service, meaning easy feedback can be given in between meetings. In addition, a risk assessment and contingency plan list has been created and can be seen in Appendix item F.

3.1.3 GitHub

GitHub is a Git repository hosting service. GitHub allows easy source code management (SCM), code revision, collaboration and code access worldwide (GitHub, 2015). From a project management perspective, a code revision system is invaluable: it ensures that the code is always kept in the cloud and accessible anywhere. It also allows easy access for code to be tested on different machines and, most importantly, on the robot itself. Finally, it allows easy support from the project supervisor, who can quickly look at the code and what work has been done in between meetings; comments can be left and the code can be trialled before use on a real robot.

3.2 Research Methodology

During the course of this project the empirical research approach will be used; empirical research, with its distinctive stages, ties in well with traditional project management. "Empirical research is based on observed and measured phenomena and derives knowledge from actual experience rather than from theory or belief." (Cahoy, 2015). Empirical research stems from the philosophical school of Empiricism, which is described as gaining knowledge through experience (Duignan, 2015). Empirical research has been shown to work very well with social robotics and with HRSI; examples of this can be seen in the work of Tranberg Hansen et al., Peters et al., Saulnier et al. and more mentioned in the literature review (Tranberg Hansen, et al., 2009) & (Peters, et al., 2011) & (Saulnier, et al., 2011).

One of the main elements of the empirical research methodology, specifically in HRSI, is a direct interaction experiment; for the purpose of this work it will be in a laboratory environment. This methodology has been proven in many situations, such as the work of Young et al., in which they evaluated a dog-lead system for robots, with the robot either walking in front of or following the participant (Young, et al., 2011). Other examples already discussed in the literature review include Dondrup et al. and the robot signalling experiment (Dondrup, et al., 2015), and Pacchierotti et al. and the robot approach experiment (Pacchierotti, et al., 2005).

Empirical research contains the following steps:

1) Observation - observe by collecting and analysing empirical facts
2) Induction - formulate hypotheses via induction
3) Deduction - deduce consequences from the attained empirical data
4) Testing - test the hypotheses with new empirical data
5) Evaluation - evaluate the outcome of the testing

Figure 2 - Empirical Research Cycle (Explorable, 2015)


Observation for this research has been covered by an in-depth literature review of projects surrounding the area of navigational intent and the use of signals by mobile robots. Various projects have been studied and analysed, each with a slightly different viewpoint on the research goal, giving a deep yet broad underpinning to the aims of this project. The literature review has shown that humans do much prefer a signal of navigational intent from a robot in various scenarios, and that there is a maximum and minimum "natural" distance and speed a robot should use when moving around humans. What hasn't been addressed in much detail, however, is the specifics of how different signals alter the perception, understanding and comfort of humans regarding robot navigational intent. This is the main area of concern for the project: to attempt to distinguish between various signals.

Induction means using what has been seen in the literature review to form specific hypotheses from the general view. To form the hypotheses, general principles seen in the review must be taken. The generalized principle from the review is that humans prefer a signal of navigational intent; thus, hypotheses can be drawn from this principle.

The hypotheses drawn from the literature review are:

1) Humans feel more comfortable and are able to quickly and correctly understand the intention of a robot when it uses indicators to express navigational intention, compared to no signal
2) Humans feel more comfortable and are able to quickly and correctly understand the intention of a robot when it uses its head to express navigational intention, compared to no signal
3) Humans will move further away from the robot when it is indicating navigational intent than when using no signal
4) Humans will move further away from the robot when it is using its head to express navigational intent than when using no signal

Using this general principle and induction: it has previously been shown that humans prefer navigational signals from a mobile robot; therefore, humans would probably prefer the robot to use indicators or a head movement over no signal, as both of these are signals of navigational intent.


Deduction is where aspects relevant to the study are conceptualized and the formation of the experiment begins. Humans are exposed to indicators on a regular basis, whether on the road in a vehicle or walking. Humans understand that when a vehicle indicates left, it intends soon to turn left. From this we can take the same principle and apply it to the robot: by fitting indicators, we can assume that when the robot indicates left it will be understood to intend to turn left, and vice versa. The same can be deduced regarding human navigation: humans generally look in the direction they are walking and intend to move, and this can also be applied to robotics. We can apply a head movement so the robot moves its head and eyes to "look" in the direction it intends to travel; this should be interpreted by the human as the robot looking where it intends to go.

Testing will be completed by an experiment. Participants will complete a set of trials, interacting with the robot in a controlled environment. All efforts will be made to reduce the bias in the experiment to as low as possible. It will also be attempted to get a diverse test data set; specific aspects to aim for are a spread of ages, genders and levels of robotics knowledge. Furthermore, to lower any bias, the order of the tests will be randomized for each participant, with only the controller knowing what will happen during the experiments. The data obtained in the experiments will be quantitatively and qualitatively examined, then tested against the hypotheses formulated in the induction. Data will be attained as a mixture of software capture from the robot and the opinions of the participants in the form of a survey.
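As an illustration of how the randomized trial ordering could be produced, below is a minimal Python sketch that writes one pre-shuffled behaviour list per participant; the file naming, behaviour labels and the seven-trial count are assumptions for illustration rather than the exact code used.

# Sketch: pre-generate a randomized order of behaviour trials for each
# participant, so the controller only has to run them in sequence.
import csv
import random

BEHAVIOURS = ['no_signal', 'indicators', 'move_head']

def write_trial_order(participant, n_trials=7):
    # Build a trial list cycling through the behaviours, then shuffle it
    # so each participant sees the behaviours in a different order.
    trials = [BEHAVIOURS[i % len(BEHAVIOURS)] for i in range(n_trials)]
    random.shuffle(trials)
    with open('participant_%d.csv' % participant, 'w') as f:
        csv.writer(f).writerows([t] for t in trials)

for participant in range(1, 11):   # e.g. ten participants
    write_trial_order(participant)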

Evaluation starts by looking at the graphs and charts formulated from the data collected in the experiment; these will be created using means, standard error of the mean and mode, among other statistically accepted measures. As participants will complete all seven tests, a paired t-test will be used to see if there is a statistically significant difference between the three behaviours. The hypotheses will be critiqued, as well as the testing strategy. The evaluation should reference back to the literature review and discuss whether the findings corroborate the research of the previous authors, as well as highlight the differences between the works. There will be a discussion of the possible connotations of the findings (if any), as well as a look at possible improvements were a second iteration to take place; finally, the possible extensions of the project and further research that could be undertaken as a direct extension will be considered.
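To make the planned analysis concrete, the following is a minimal sketch of how such a paired t-test could be computed in Python, the language used elsewhere in this project. The scores below are invented placeholders, and the use of the scipy library is an assumption for illustration, not the exact analysis code.

# Minimal sketch of a paired t-test on survey scores, assuming the scores
# are stored as parallel lists (one entry per participant, per behaviour).
# The numbers below are placeholders, not real experiment data.
from scipy import stats

no_signal = [3, 2, 4, 3, 2, 3, 4, 2, 3, 3]    # hypothetical comfort scores
indicators = [5, 4, 4, 5, 4, 5, 4, 4, 5, 4]   # hypothetical comfort scores

# A paired (related-samples) t-test is appropriate because each participant
# experienced every behaviour, so the two samples are not independent.
t_statistic, p_value = stats.ttest_rel(no_signal, indicators)
print("t = %.3f, p = %.3f" % (t_statistic, p_value))

# A p-value below 0.05 would suggest a statistically significant
# difference in mean score between the two behaviours.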

4 Software Development

4.1 Tools & Machine Environments

4.1.1 Robot Operating System

ROS is used for a variety of tasks: HERE maps (Delaney, 2014), aerospace research (Foote, 2013) and industrial manipulators (Intermodalics, 2015), amongst others. When carrying out a research project with real-world connotations, it is imperative to use established mediums to ensure reliability; it is clear that ROS is not only a proven system but an adaptable one, with more than enough functionality for this project.

"ROS (Robot Operating System) provides libraries and tools to help software developers create robot applications. It provides hardware abstraction, device drivers, libraries, visualizers, message-passing, package management, and more." (Willow Garage, 2015). ROS is a peer-to-peer robot middleware package; it allows easier hardware abstraction and code reuse. All major functionality is separated into nodes, which typically each run as a separate process. Communication between nodes is handled by message passing over named topics.
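As a brief illustration of this node-and-topic model, below is a minimal sketch of a rospy node publishing messages to a topic; the node name 'talker' and topic name 'chatter' are hypothetical and purely for illustration.

#!/usr/bin/env python
# Minimal sketch of a ROS node: functionality lives in its own node
# process, and communication with other nodes happens by publishing
# messages to a named topic.
import rospy
from std_msgs.msg import String

def talker():
    pub = rospy.Publisher('chatter', String, queue_size=10)
    rospy.init_node('talker')   # register this process as a ROS node
    rate = rospy.Rate(1)        # publish at 1 Hz
    while not rospy.is_shutdown():
        pub.publish(String(data='hello'))
        rate.sleep()

if __name__ == '__main__':
    try:
        talker()
    except rospy.ROSInterruptException:
        pass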

4.1.2 Ubuntu

The main ROS client libraries are created in accordance with Unix-like systems, predominantly due to the open-source nature of the software dependencies (ROS, 2015). In light of this, the only supported operating systems are various Ubuntu versions, depending on the version of ROS. It is important to note that while ROS is available for Windows and Mac, these are not supported and are instead maintained by the community; this is one of the key reasons that Ubuntu was the chosen operating system for the development of the software. Ubuntu is also the preferred and supported OS for the industrial version of ROS, ROS-Industrial (ROS-I): "Like ROS, ROS-I nominally runs on Ubuntu Linux…" (ROS Industrial, 2015).

4.1.3 Python

The coding language used for the experiment was Python. With ROS, only two languages are readily available: C++ and Python. Due to the nature of this experiment, the coding aspect is of minimal importance and the advanced functionality of C++ isn't required. Although Python will take time to learn, it is an interpreted language and thus doesn't need to be compiled each and every time it is run and tested; this saved time will make up for the added time to learn it. Alongside this, in terms of ease of understanding, Python is far simpler, as it is more similar to plain English. All resources needed for the project are available in Python, as well as a large set of tutorials and how-to help on the ROS wiki and other similar websites.

4.1.4 Robots

The university has various appropriate robot systems available that could be used for this project; those considered are: Rovio, Turtlebot, Pioneer 3-AT, MARC (based on the inMoov robot) and Scitos G5. The robot required will need to be mobile and capable of using indicators and a head movement to display navigational intent. The robot should also pose enough of a physical obstacle that participants are required to circumnavigate it rather than just step over it. Only robots with a camera have been considered, as the robot needs a way to track the position of the subject during the tests.

Rovio is a small mobile webcam with a speaker, a microphone and an articulating head capable of three different positions, and is roughly shin height. While it is mobile, there is no easy way to attach the desired indicators to the robot. A further issue is that although the head can move, it only moves on the vertical axis, which wouldn't be an intuitive way to express navigational intent; this would have to be explained to participants before the experiment, giving away vital information about its nature. Finally, the robot is far too small in stature and could easily be stepped over. For all these reasons it won't be used (WowWee, 2015).

Figure 3 – Rovio (Robotdom, 2015)


Turtlebot is an open-source (software and hardware) robot. Due to its nature, Turtlebot is highly customizable. The university has Turtlebots powered by an Intel i5 NUC, which is sufficient to add indicator capabilities to the robot. Additional hardware in the form of an articulating head would, however, need to be added to make it suitable for the tasks. With Turtlebot being only approximately knee height, it is another robot that could be stepped over instead of circumnavigated. The robot has a Kinect on board that would be able to track people in 3D space. Due to the additional hardware required and its size, this robot wouldn't be appropriate for the experiment.

Figure 4 - Turtlebot (Turtlebot, 2015)

MARC (Multi-Actuated Robotic Companion) is a 3D-printed robot based on the open-source inMoov project by Gael Langevin (Langevin, 2015). MARC has a fully articulated head, fingers and arms, with a speaker, a microphone, a chest-mounted Kinect camera and two cameras in the eyes, and is capable of people tracking in 3D space. Being the size of a small person, MARC is not something that can be stepped over. MARC is capable of moving its head with 120° of freedom and, being open hardware based on Pololu electronics, indicators could easily be added. The limiting factor is that MARC is currently not mobile, as the legs are still being designed by Gael. A possible solution could be to place MARC on wheels; however, this isn't a very elegant design and, being very unnatural looking, could detract from the participants' final impression.

Figure 5 - inMoov (3diot, 2014)

Pioneer 3-AT is a highly versatile four-wheeled robotic platform. Similar to the Turtlebot, it is highly customizable and able to take additional hardware. Out of the box the platform doesn't contain the required hardware: a 3D camera similar to the Kinect, indicators and an articulated head would have to be added. Although this is possible, it is rather a lot of hardware work. Added to this, the robot base is only shin height, meaning substantial fabrication would be needed to mount an articulated head on the system. So although the system is rugged and highly customizable, it requires too much additional work for the purpose of a project like this.

Figure 6 - Pioneer 3-AT (Unmanned Vehicle Centre, 2015)

Scitos G5 is a robot with a full working computer and touch screen. Including the additional hardware, the G5 stands at the height of a small person, meaning participants would have to circumnavigate it. Included in the additional hardware is a head unit in which the eyes are capable of full 360° rotation. Various voltage ports are also available to power indicators. The robot also has two mounted 3D cameras capable of tracking participants in 3D space. It is for these reasons, along with needing little additional hardware work, that the G5 has been chosen.

The key aspects for each robot can be seen in Figure 8.

                         Rovio          Turtlebot   MARC   Pioneer 3-AT   Scitos G5
Mobile                   ✔              ✔           X      ✔              ✔
3D Camera                X              ✔           ✔      X              ✔
Rotating Head            Up/down only   X           ✔      X              ✔
Ports for Indicators     X              ✔           ✔      ✔              ✔
Large Presence           X              X           ✔      X              ✔

Figure 8 - Robot Comparison Matrix

4.1.5 SCITOS G5

The Scitos G5 (aka Linda) is a robot with a Human Machine Interface enclosure, manufactured by MetraLabs, Germany. The Scitos G5 is about 1.5 m (ca. 5 ft) tall and weighs about 75 kg (ca. 165 lbs). It conforms to the European CE guidelines for the public indoor sector and was certified by the German Technical Inspection Agency (TÜV). The Scitos G5 does not have any end-effectors and is therefore only able to interact with humans via spatial movement, eye gaze, speech and a touch screen (MetraLabs, 2015).

Figure 7 - Scitos G5 (Lincoln Centre for Autonomous Systems, 2015)

Due to other software running on the robot created by the STRANDS team, it isn't possible for the robot to actively collide with a human: it will stop, turn around and attempt to find a new path if the human gets too close. For the purpose of this experiment this functionality will more than likely not be triggered, but it does improve the safety aspect of the experiment. The controller of the experiment also has an overriding controller for the robot, so that if unexpected or bizarre behaviour occurs it can be remotely controlled and moved.

4.2 Development Methodology

4.2.1 Incremental Development

The software development has three main aspects: Indicate, Move Head and No Signal. It therefore makes sense to deal with these individually, developing one, testing and refining it before moving on to the next. A good way to deal with this is incremental development.

Figure 9 - Waterfall vs Incremental (Bittner, 2006)

Figure 9 shows an overview of the waterfall and iterative processes. Waterfall is described by Kroll of IBM as 'Taking an extreme waterfall approach means that you complete a number of phases in a strictly ordered sequence: requirements analysis, design, implementation/integration, and then testing.' (Kroll, 2004). The strictly ordered sequence is useful to the needs of this project; however, agile principles of recurring development would also be useful. Using the incremental development style, the waterfall process can be followed for each of the different behaviours, i.e. design, code, integrate and test No Signal before completing the same process for Indicators and then Move Head.

By implementing this methodology, it gives the rigidness of waterfall to

ensure that each behaviours has been implemented correctly and tested with


enough time before the experiment. However, by using incremental principles, it

allows the requirements to change, i.e. if it is found during the development phase

that there may be an issue implementing one of the three current behaviours, a new

behaviour could be designed and implemented in time for the experiment, without the time frames being too drastically altered.

4.2.2 Action Lib

The actionlib stack will be the main way in which the system publishes new goals to move_base using ROS. Two scripts are required, as can be seen in Figure 10: one to act as the client that sends the goals, and the other a server that deals with the publishing of the goals.

Figure 10 - Client-Server Interaction (ROS, 2015)

Before any scripts are created, the action, feedback and result messages for the server must be made; all of these will be PoseStamped-based messages, containing a status, map, position and orientation for the robot, etc.

The server will be responsible for sending goals to and receiving feedback from move_base, and will publish results back to the client. Once the server receives the goal, the goal will be published to move_base; the server will then wait for move_base to be within 70cm of the published goal (using the Euclidean distance d(a, b) = √((a₁ – b₁)² + (a₂ – b₂)²)), or for the goal to have been completely reached. Once the threshold has been met, the client will be informed and the current position of the robot sent back.
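The following is a minimal sketch of this server-side logic, not the project's actual source: it forwards a goal to move_base and reports back once the robot is within 70cm of it or the goal has been reached. The /robot_pose topic and its PoseStamped message type are assumptions for illustration.

#!/usr/bin/env python
# Sketch only: forward a goal to move_base, return once within 70cm of it.
import math

import rospy
import actionlib
from actionlib_msgs.msg import GoalStatus
from geometry_msgs.msg import PoseStamped
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

robot_pose = None  # latest known robot position, updated by the subscriber


def pose_cb(msg):
    global robot_pose
    robot_pose = msg.pose.position


def euclidean(p, q):
    # Planar Euclidean distance d(a, b) = sqrt((a1 - b1)^2 + (a2 - b2)^2)
    return math.hypot(p.x - q.x, p.y - q.y)


def send_and_wait(target, tolerance=0.7):
    """Publish a PoseStamped goal to move_base and block until the robot is
    within `tolerance` meters of it, or the goal has been fully reached."""
    client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
    client.wait_for_server()

    goal = MoveBaseGoal()
    goal.target_pose = target
    client.send_goal(goal)

    rate = rospy.Rate(10)
    while not rospy.is_shutdown():
        reached = client.get_state() == GoalStatus.SUCCEEDED
        close = (robot_pose is not None and
                 euclidean(robot_pose, target.pose.position) <= tolerance)
        if reached or close:
            return robot_pose  # reported back to the client as the result
        rate.sleep()


if __name__ == '__main__':
    rospy.init_node('move_server_sketch')
    rospy.Subscriber('/robot_pose', PoseStamped, pose_cb)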

The client is where most of the functionality takes place. The first aspect is to check which participant number the client last dealt with, found by opening a CSV file containing the number of the last participant. Following this, the relevant CSV


file for the participant is opened, containing the list of behaviours the robot will exhibit in a randomized order. Each behaviour has a related method in the client file containing all of the movement and signal commands, as well as a wait at the end of the behaviour; this allows the participant to be back in place before the next behaviour starts, which is controlled by the experiment controller.
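A minimal sketch of this client-side dispatch, under the assumption that the counter and per-participant files are simple CSVs; the file names and behaviour methods here are illustrative, not the project's real code.

import csv
import time


def last_participant(counter_file='last_participant.csv'):
    # Read the number of the participant the client last dealt with.
    with open(counter_file) as f:
        return int(next(csv.reader(f))[0])


def behaviours_for(participant, pattern='participant_%02d.csv'):
    # Load this participant's randomized list of behaviour names.
    with open(pattern % participant) as f:
        return [row[0] for row in csv.reader(f)]


def run_straight():
    # Movement and signal commands would go here (see the server sketch
    # above), followed by a wait so the participant can get back in place.
    time.sleep(5)


BEHAVIOURS = {'straight': run_straight}  # ... plus the other six behaviours

for name in behaviours_for(last_participant() + 1):
    BEHAVIOURS[name]()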

Figure 11 - ROS Topic Graph

Figure 11 shows the ROS topics used by the implemented code. move_server first receives a goal from move_client and then publishes it to move_base. move_server then waits for a result from move_base before sending the result back to move_client. move_server can also give feedback to move_client as to where the robot is, and move_client can cancel the goal using move_server/cancel. Finally, the orientation is set by move_client using the DWAPlannerROS/parameter_update dynamic reconfigure topic.

The indicators are powered by two of the 5V ports on the robot's PCB; to blink the indicators, the ports need to be reconfigured on the fly to power on and off. This is done using the dynamic reconfigure (ROS, 2015) client, which allows a node's parameters to be changed without having to restart the node. Dynamic reconfigure is also used to stop the robot attempting to set a perfect orientation when it is moving between the goals, by setting the orientation tolerance to 2π (360°). The 2Hz blink rate is controlled on a separate thread, in which a sleep is applied before turning the indicator on and off respectively. The second behaviour, move head, is controlled by sending a joint state message to the head command topic; this allows a pan and tilt value to be sent to reposition the head into its respective position.
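A sketch of these signalling mechanics is shown below. The dynamic reconfigure client API is standard ROS, but the node, topic, parameter and joint names here are assumptions for illustration, not the robot's exact interface.

import threading

import rospy
import dynamic_reconfigure.client
from sensor_msgs.msg import JointState


def blink_indicator(ports, param='port0_enabled', hz=2.0):
    # Toggle an assumed 5V-port parameter on and off via dynamic reconfigure.
    rate = rospy.Rate(hz)
    state = False
    while not rospy.is_shutdown():
        state = not state
        ports.update_configuration({param: state})
        rate.sleep()


def set_head(pub, pan, tilt=0.0):
    # Send a JointState to reposition the head, e.g. pan = 35 degrees.
    msg = JointState()
    msg.name = ['HeadPan', 'HeadTilt']  # assumed joint names
    msg.position = [pan, tilt]
    pub.publish(msg)


rospy.init_node('signal_sketch')
head_pub = rospy.Publisher('/head/commanded_state', JointState, queue_size=1)

# Relax the planner's orientation tolerance to 2*pi so the robot does not try
# to set a perfect orientation between goals.
planner = dynamic_reconfigure.client.Client('/move_base/DWAPlannerROS')
planner.update_configuration({'yaw_goal_tolerance': 6.2832})

# Blink the indicators on a separate thread so movement code is not blocked.
ports = dynamic_reconfigure.client.Client('/scitos_node')  # assumed node name
threading.Thread(target=blink_indicator, args=(ports,)).start()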

4.2.3 Testing

Testing was completed using two different mediums: the 3D simulation environment Blender and the robot itself. Having a simulated testing environment ensured that any small changes were able to be tested quickly and easily, which was also useful for when the robot was unavailable. Some testing took place on the real robot


close to the time of the user study, to ensure everything was working as expected on real hardware as well as in simulation. All the testing completed was regarded as black box testing: the software had nothing in the way of inputs, so the tests were simply to ensure that the robot's functionality was safe and that it got to the correct places.

5 Experiment

The main aspect of the experiment is to explore how various navigational signals impact the way a human responds, both physically and mentally, to an oncoming robot in a corridor passing scenario. The first aspect is the design and implementation of a suitable and feasible experiment. Following this, conclusions will be drawn from the results of the experiment, as well as a discussion about how the experiment was executed.

5.1 Design

5.1.1 Physical

The experiment was set up in the Lincoln Centre for Autonomous Systems (L-CAS), part of the University of Lincoln. The robot was positioned at one end of the room, 8 meters away from the participant, with the two facing one another; for each test the start locations were the same.

Participants were asked to cross the room and arrive at the start location of the robot in a manner they felt comfortable with, however with the caveat that they must start walking directly towards the robot. At the same time the robot would move in the opposing direction; thus the robot and participant were on a head-on collision path. Each participant completed seven different tests, each with a different behaviour. Four of the seven tests included signals of navigational intent (see Figure 12 and Figure 13 for the signals).


5.1.2 Robot Setup

The robot could display three signals of navigational intention: no signal, indicate and move head. No signal involved the robot moving between the points without any visual signals. For indicate, 5V amber LED indicators were installed (Figure 13) on either side of the "body" just below the "head"; when the indicators were active, they would flash at a rate of 2Hz. Finally, move head involved moving the "head" 35° to the left or right depending on the behaviour needed (Figure 12). The robot started each behaviour on a keyboard input. For the straight behaviour, the robot moved 7.5m forward; this was the only test in which the human would actively have to circumnavigate the robot without a "collision". For all other

behaviours, the robot started by moving 1.5m forward before starting to signal (with the exception of no signal), then a further 1m forward, followed by a 1.8m diagonal movement (1m sideways and 1.5m forward); the signal was "reset" after this. Next the robot moved 3m forward before reaching its terminal destination. This means the robot could take three different paths, regardless of signal, with the same start position but different termination positions (Figure 14 shows the available robot paths).

Figure 12 – Move Head Behaviour: Left, Straight, Right

Figure 13 – Indicate Behaviour: Left, Straight, Right

In total there are seven independent behaviours the robot can use and each

will be used on each participant in a randomized order. The behaviours are: move

left, move right, move straight, move left with head, move right with head, move left

with indicator and move right with indicator. It is important to note that when the robot exhibited the various behaviours it was indicating its own intended direction of travel; the signal was not a command to the human. As well as this, the robot did not deceive the human in any of the behaviours, i.e. it never indicated left and then moved right.
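To make the path geometry concrete, the following hypothetical waypoint lists (x forward, y sideways, in meters from the robot's start position) sketch the three paths described above; the coordinates are derived from the stated distances, not taken from the project's code.

# Sketch only: the three robot paths as (x, y) waypoints in meters.
PATHS = {
    'straight': [(7.5, 0.0)],
    'left':     [(1.5, 0.0),    # 1.5m forward, then begin signalling
                 (2.5, 0.0),    # a further 1m forward
                 (4.0, 1.0),    # 1.8m diagonal: 1.5m forward, 1m sideways
                 (7.0, 1.0)],   # final 3m forward to the terminal position
    'right':    [(1.5, 0.0), (2.5, 0.0), (4.0, -1.0), (7.0, -1.0)],
}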

5.1.3 Participants

Ten participants completed the experiment; although none had any experience or idea of the nature of the experiment beforehand, half of the participants had had previous contact that day with the robot by participating in a separate experiment. Each participant was issued with a general demographics form and a consent form to read through and fill out before starting the experiment (items A and B in the appendices). Alongside the forms, the participants were verbally informed that they had the right to withdraw at any point, including after the experiment, in which case all their data would be removed, and that all of their data was given anonymously and couldn't be traced back to them. The participants were also informed they could ask questions before or after the experiment, but not during, so as to keep the integrity of the experiment.

Figure 14 - Available Robot Paths


At the start of the experiment each participant was informed that they were to stand in the starting position and wait for the controller to instruct them to move. Once they were instructed that the test had begun, they were to walk casually towards the robot and navigate to the start point of the robot in as casual and natural a manner as possible. The participants were also told that although the robot would not collide with them, its behaviour was not reactive to how they acted.

After the participant had completed all seven of the behaviours they were

issued with a survey form to fill in about their experience with the robot (item C in

appendices). The experiment was then explained, including what the purpose of the

research was, why it was being done, and what it was hoped the experiment would

show. Finally, any questions they had could be asked and answered before thanking

them for their time and reminding them of their rights.

5.2 Results

With ten participants completing the experiment, seventy unique data sets were attained. The results from the survey are tabulated according to behaviour exhibited and question; these can be seen below in Figure 15. The behaviours are abbreviated: No Signal (NS), Indicator (I) and Move Head (MH). The questions asked were:

1) “I was able to understand the intention of the robot when using this

behaviour.”

2) “I felt comfortable passing the robot while it was exhibiting this behaviour.”

3) “I was quickly able to understand the intention of the robot.”

Questions were answered using a Likert Scale: 1 - Strongly Disagree, 2 -

Disagree, 3 - Neutral, 4 - Agree and 5 - Strongly Agree.

Participant  NS1  NS2  NS3  I1  I2  I3  MH1  MH2  MH3
1            2    3    2    4   4   3   4    4    4
2            2    2    2    4   4   5   4    3    4
3            1    2    4    5   4   5   3    2    4
4            2    3    1    5   4   5   4    4    4
5            2    2    3    4   2   4   4    5    4
6            3    4    2    5   5   5   4    4    3
7            3    4    3    4   4   4   2    4    3
8            4    3    4    5   4   5   2    3    1
9            3    2    4    4   4   4   3    3    3
10           1    1    1    4   5   4   5    5    5

Figure 15 - Survey Results (columns give behaviour and question number, e.g. NS1 is No Signal, question 1)

As well as this, the robot tracked the position of the person in 3D space, using itself as the origin. For the interest of this experiment only the X and Y axes are needed, as the participants remained at the same height. The exact distance from the robot will be used, computed using simple Pythagoras: a² + b² = c². The area of interest will be a field of view from 1¾π to ¼π, with π directly behind the robot and 0 directly in front; the area of interest can be seen in Figure 16.

Figure 16 - Robot Area of Interest for Minimum Distances
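A minimal sketch of this minimum-distance computation, assuming each tracked person position is an (x, y) pair in the robot's frame with 0 rad directly ahead and π directly behind; the function name is illustrative.

import math


def min_distance(track, fov=math.pi / 4):
    # Minimum Euclidean distance over the points whose bearing falls inside
    # the frontal area of interest (between 1.75*pi and 0.25*pi).
    best = float('inf')
    for x, y in track:
        bearing = math.atan2(y, x) % (2 * math.pi)
        if bearing <= fov or bearing >= 2 * math.pi - fov:
            best = min(best, math.hypot(x, y))  # c = sqrt(a^2 + b^2)
    return best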

Figure 17 shows the minimum distance kept in each test for each participant; the unit used is meters.

Participant Straight Left Right Indicate Left Indicate Right Head Left Head Right

P1 0.72647 0.75379 1.29989 1.62642 1.00610 1.16366 1.15255

P2 1.54390 1.50569 1.53723 1.61397 1.59526 1.19771 0.73154

P3 0.67134 0.57304 0.90999 1.68446 1.31759 0.69555 0.52814

P4 0.72808 0.64711 0.85181 1.35578 1.34077 1.02913 1.14908


P5 2.41629 1.60650 1.33606 2.59011 2.44219 0.65425 1.13071

P6 1.11463 1.05780 1.29263 1.31898 2.17899 1.52792 2.16402

P7 1.25423 1.31454 1.42037 1.77986 1.92819 0.65792 1.07275

P8 0.61908 0.48311 0.48145 1.74640 1.04203 1.24667 0.78165

P9 0.74319 0.52994 0.33715 0.85010 0.86872 1.50603 1.70050

P10 0.47898 0.72195 0.99406 1.03909 0.99406 1.65893 1.06612

Mean 1.02962 0.91935 1.04606 1.56052 1.47139 1.13378 1.14771

Figure 17 - Minimum Distance from Robot (m)

5.3 Evaluation

5.3.1 Questionnaire Sections 1, 2 & 3

The first aspect of the evaluation is to look at the results of the surveys. The results for questions 1, 2 and 3 for each behaviour can be seen below in Figures 18, 20 and 22, including standard error of the mean error bars and the results of paired t-tests between No Signal and Indicators.
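As a sketch of how such statistics can be produced, the following uses SciPy's paired t-test and standard error of the mean on placeholder per-participant scores (not the experiment's data):

from scipy import stats

no_signal  = [2, 2, 1, 2, 2, 3, 3, 4, 3, 1]  # placeholder Likert scores
indicators = [4, 4, 5, 5, 4, 5, 4, 5, 4, 4]

t, p = stats.ttest_rel(indicators, no_signal)  # paired t-test, df = n - 1 = 9
sem = stats.sem(indicators)                    # standard error of the mean
print('t(%d) = %.2f, p = %.4f, SEM = %.2f' % (len(indicators) - 1, t, p, sem))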

Figure 18 - Results Question 1

The first question looks at how well the participants could understand what the robot was intending to do in each of the behaviours. Just by looking at the graphs, indicators were the most understandable, with the highest mean and lowest standard error of the mean, the mean being 4.33 ± 0.17 and the mode 4. The second


most understood was the move head behaviour, with a mean of 3.44 ± 0.27 and mode of 4. Finally, the least understood behaviour was no signal, with a mean of 2.22 ± 0.28 and mode of 2.

There was an extremely significant difference in the scores for Indicators (M = 4.33, SD = 0.52) and No Signal (M = 2.22, SD = 0.95) with conditions; t (9) = 6.68, p < 0.0001

There was not quite a statistically significant difference in the scores for Move Head (M = 3.44, SD = 0.97) and No Signal (M = 2.22, SD = 0.95) with conditions; t (9) = 2.17, p = 0.0584

There was a statistically significant difference in the scores for Indicators (M = 4.33, SD = 0.52) and Move Head (M = 3.44, SD = 0.97) with conditions; t (9) = 2.38, p = 0.0414

Figure 19 - Score Differences for Question 1

These results suggest that indicators do affect humans' understanding of the navigational intent of a robot compared to no signal and head movement; this can be seen in Figure 19, which shows the score difference between no signal and indicators for each participant. However, there was no significant effect on understanding between head movement and no signal. Specifically, these results suggest that robots using indicators to display navigational intent are better understood by humans than those using a head movement or no signal, and robots


using a head movement to display navigational intent are not significantly better understood than those using no signal.

Figure 20 - Results Question 2

The second question looks at how comfortable the participants felt when passing the robot in each of the behaviours. Again participants selected indicators as the most comfortable behaviour exhibited, with the lowest standard error of the mean, at a mean of 3.89 ± 0.24 and mode of 4. This was followed closely by the move head behaviour, with a mean of 3.67 ± 0.28 and mode of 4. Finally, no signal was the lowest of the behaviours, with a mean of 2.44 ± 0.35 and mode of 2.

There was a very statistically significant difference in the scores for

Indicators (M = 3.89, SD = 0.82) and No Signal (M = 2.44, SD = 0.97)

with the conditions; t (9) = 3.77, p = 0.0044

There was no statistically significant difference in the scores for Indicators (M = 3.89, SD = 0.82) and Move Head (M = 3.67, SD = 0.95) with the conditions; t (9) = 0.71, p = 0.4961

There was a statistically significant difference in the scores for Move Head (M = 3.67, SD = 0.95) and No Signal (M = 2.44, SD = 0.97) with the conditions; t (9) = 2.54, p = 0.0318


Figure 21 - Score Differences for Question 2

These results suggest that indicators and head movement do affect humans' comfort when passing a robot compared to using no signal; this can be seen in Figure 21, which shows the score difference between no signal and indicators for each participant. However, there was no statistically significant difference in the effect of using indicators compared to head movement. Specifically, these results suggest that when a robot displays navigational intent with either indicators or a head movement, humans feel more comfortable passing the robot than when it displays no signal; however, humans do not feel significantly more comfortable passing a robot using indicators instead of head movement.


Figure 22 - Results Question 3

The third question looks at how quickly people could understand the intention of the robot. Participants again selected indicators as the behaviour whose intention was quickest to understand, with a mean of 4.33 ± 0.23 and mode of 5. Second again was move head, with a mean of 3.56 ± 0.34 and mode of 4, and finally no signal, with a mean of 2.67 ± 0.35 and mode of 2.

There was a very statistically significant difference in the scores between Indicators (M = 4.33, SD = 0.70) and No Signal (M = 2.67, SD = 1.17) with the conditions; t (9) = 4.32, p = 0.0019

There was not quite a statistically significant difference in the scores between Indicators (M = 4.33, SD = 0.70) and Move Head (M = 3.56, SD = 1.08) with the conditions; t (9) = 1.96, p = 0.0811

There was not a statistically significant difference in the scores between Move Head (M = 3.56, SD = 1.08) and No Signal (M = 2.67, SD = 1.17) with the conditions; t (9) = 1.41, p = 0.1934


Figure 23 - Score Differences for Question 3

These results suggest that indicators do affect how quickly humans can understand the navigational intent of a robot compared to using no signal or a head movement signal; this can be seen in Figure 23, which shows the score difference between no signal and indicators for each participant. However, there was no significant difference in how quickly the robot was understood between head movement and no signal, or between indicators and move head. Specifically, these results suggest that humans are able to understand the navigational intent of a robot significantly quicker when it uses indicators rather than no signal. Finally, the results also suggest that humans aren't able to understand navigational intent significantly quicker using head movement instead of no signal, or using indicators instead of head movement.

5.3.2 Questionnaire Section 4

The first question of this section asked which of the behaviours the participants would most like to see implemented as a standardized behaviour for robots displaying navigational intent. Six people answered with indicators and four with head movement. However, all of the participants who listed their experience with robotics as little or none listed head movement, a few stating "as it is the most natural", whereas all who listed their experience with robotics as some or more chose indicators, with statements such as "indicators as it is a well know framework for changing and could have the most public understanding" or


"indicators as they can be seen at a glance". The clear split in opinion dependent on experience with robotics is an interesting point that arose; experience with technology, however, seemed to have no overbearing effect on the responses to the question. This is similar to the work by Hall et al., who found a bias in people's comfort with robots with self-confessed experience (Hall, et al., 2014).

The second question asked if the participants could think of an alternative signal of navigational intent they would prefer to those trialled in the experiment. Two of the

participants stated they would like to have a verbal signal as well as visual, one

stating “… also more accessible to the visually impaired”. One participant stated

“have a standard side, like on the UK roads (robots could) pass on the left”. Lastly

one person stated “(have a) small sideways movement before turning”. The point of

this experiment was to look into non-verbal ways of expressing navigational intent, but a combination of verbal and non-verbal methods could be used, and two of the participants felt this would be beneficial; this is similar to the finding of Peters et al., who found some participants would have liked a verbal signal instead (Peters, et al., 2011). The response about passing on one side is an interesting notion and not one that was thought of for this experiment; it would, however, require a consensus in a large population to be useful, and for the purpose of this experiment it wouldn't have been used, as it would have required explaining more about the experiment to participants. Finally, a small sideways motion again wasn't considered in the experiment, but could be researched; a conversation with the participant afterwards concluded that it wouldn't be an ideal solution, as it isn't "obvious" enough, in that people may not notice it if they were distracted.

The final question asked if any additional actions by the robot were noted in any of the trials. Three participants provided an answer to this question: "sometimes the robot reacted quicker to my presence than in others", "she (Linda the robot) went too far in one direction to avoid me" and "it felt like she followed me for some time on the 6th test". For the first two answers, it was explained to the participants after the experiment that the robot's behaviour was not reactive; the only time it would become reactive was if they got extremely close to the robot, in which case it would attempt to circumnavigate them rather than follow its path. This is what happened for the final answer: the robot went right (using no signal) and the participant did the same; once the robot got too close it tried to plan a new goal and turned 360° on the spot in the


same direction as the participant, giving the perception they were being followed.

After these behaviours were explained, the participants had no additional points to

bring up.

5.3.3 Robot Data

Figure 24 - Average Minimum Distances

For the mean distances, each behaviour has been looked at rather than each test, i.e. Indicators is indicate left and indicate right combined, etc.; this way, any bias from a person circumnavigating further to the left than to the right for whatever reason is lowered. The results show that participants kept furthest away from the robot during the indicate behaviour, with a mean distance of 1.52m ± 0.11, followed by move head at 1.14m ± 0.09 and finally no signal at 0.98m ± 0.09. The straight behaviour was also included in the experiment but wasn't included in any of the behaviour groups; the average distance kept was 1.02m ± 0.13.

T-tests were performed on the corresponding data, with the left and right trials pooled together, e.g. head left & head right, and indicate left & indicate right; these will be listed as move head and indicate.
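A sketch of this pooling, using SciPy's paired t-test as before: the left and right trials for each behaviour are concatenated, giving twenty paired values per behaviour and hence df = 19. The distances below are placeholders, not the Figure 17 data.

from scipy import stats

indicate_left  = [1.6, 1.6, 1.7, 1.4, 2.6]  # one value per participant ...
indicate_right = [1.0, 1.6, 1.3, 1.3, 2.4]
left           = [0.8, 1.5, 0.6, 0.6, 1.6]  # no signal, left trial
right          = [1.3, 1.5, 0.9, 0.9, 1.3]  # no signal, right trial

indicators = indicate_left + indicate_right  # pooled per behaviour
no_signal  = left + right

t, p = stats.ttest_rel(indicators, no_signal)  # df = len(indicators) - 1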


There was an extremely statistically significant difference in the distances between Indicators (M = 1.52, SD = 0.50) and No Signal (M = 0.98, SD = 0.41) with the conditions; t (19) = 5.78, p < 0.0001

There was a statistically significant difference in the distances

between Indicators (M = 1.52, SD = 0.50) and Move Head (M = 1.14,

SD = 0.42) with the condition; t (19) = 2.38, p = 0.0277

There was no statistically significant difference in the distances

between Move Head (M = 1.14, SD = 0.42) and No Signal (M = 0.98,

SD = 0.41) with the conditions; t (19) = 1.11, p = 0.2801

These results suggest that indicators significantly affect how close humans get to a robot while passing, compared to using no signal or head movement; however, for head movement compared to no signal there is no statistical significance. Specifically, these results suggest that humans will pass further away from the robot when it uses indicators to signal navigational intent compared to no signal or head movement, and that the passing distance is not statistically different when comparing head movement and no signal.

Figure 25 - Behaviour Distribution

Figure 25 shows a box and whiskers diagram for the minimum distances for each behaviour. It can be seen that the ranges of Move Head and No Signal are very closely related, with a slightly higher range for move head; the interquartile range (IQR) is much the same, with the IQR for No Signal slightly more contained, and both have a similar median. On the other hand, the plot for Indicators is quite different: the range is similar, but with a significantly higher maximum and minimum compared to move head and no signal, and the same can be said about the IQR, which is again much the same size but shifted to the right. However, Q3 and the maximum are much further from the median for indicators, showing there is a larger spread in the upper 50% of the data compared to the lower 25%.

5.3.4 Summary

The hypotheses previously stated were:

1) Humans feel more comfortable and are able to quickly and correctly

understand the intention of a robot when using indicators to express

navigational intention than no signal

The new empirical data gained supports this hypothesis. There is a statistically significant difference in the comfort humans feel, as well as the speed and ease with which they understand the navigational intention of a mobile robot, when using indicators compared to no signal. It also appears to be a positive difference, with the mean and mode scores for indicators being higher than those of no signal for each question.

2) Humans feel more comfortable and are able to quickly and correctly

understand the intention of a robot when using its head to express

navigational intention than no signal

The data gained neither supports nor disproves this hypothesis. There is a statistically significant difference in the comfort felt by participants between move head and no signal, which appears to be in favour of more comfort felt during the move head behaviour due to the higher mean and mode scores. On the other hand, there is no significant difference in the t-test scores between the move head and no signal behaviours for the two understanding questions; this trend continues into the mean scores: although the no signal mean is lower for both questions, the difference is a negligible amount, unlike the mode, which again remains two points higher for move head.


3) Humans will move further away from the robot when it is indicating

navigational intent than when using no signal

The experiment data supports this hypothesis: the t-test shows a significant difference in the distances kept from the robot between the indicators and no signal behaviours. Specifically, the data shows that humans kept a greater average distance from the robot when it was using indicators to show navigational intent, with a significantly greater mean for indicators. Furthermore, the box and whisker plot shows that the spread is shifted to the right, with a larger upper range of minimum distances kept during the indicate behaviour.

4) Humans will move further away from the robot when it is using its head

to express navigational intent than when using no signal

For the final hypothesis, the data again neither supports nor refutes the hypothesis. There is no significant difference in the minimum distance kept between no signal and move head. The means again are very similar; although the mean for no signal is again lower, the difference is not a significant one. The box and whiskers diagram shows how similar the data really is, with near enough a carbon copy for the ranges, median and interquartile range.

Following on from the experiment, it is suggested that the favoured behaviour for ease of understanding, speed of understanding and comfort felt is indicators, followed by move head and lastly no signal; this appears to conform with the findings of Peters et al., who, although they didn't implement indicators, found that a large amount of their user group suggested them (Peters, et al., 2011). This is based on the mean statistics for the post-experiment survey. The order is also the same for the minimum distances kept, with participants staying furthest away while the robot was indicating, followed by moving its head and lastly no signal.

5.4 Limitations

One limitation of the research is the size of the experiment: with only ten participants, only seventy new sets of data have been acquired. As well as the experiment missing depth of participants, it is also missing breadth, with 9/10 participants being male and again 9/10 being students. Taken as a pilot experiment, however, interesting information has been acquired, and it appears to be a good starting point


for further research. Furthermore, half of the participants had had previous contact with the robot used that same day, which may have influenced how comfortable they felt around the robot. However, that contact was not related to the purpose of this experiment; although half of the participants had previous contact with the robot, they had no additional information on the nature, purpose or requirements of this experiment. That said, following the statement of Hall et al., this experience, more so with the same robot being used, may be enough to cause bias (Hall, et al., 2014).

A major limitation is that some key aspects of comfort in HRSI as defined by Kruse et al., including speed and acceleration, are not considered and were left constant. Comfort is a very specific feeling and has been shown to be greatly affected by various factors; whilst trying to keep as many factors constant as possible, this may have had a negative effect, in that one behaviour may be preferred at the particular speed or acceleration used rather than in general (Kruse, et al., 2012). The same can be said for the robot used in the experiment: using another robot, the various behaviours may have had a different effect on the comfort felt by the participants.

Another limitation stems from the hardware of the actual robot and its positioning. The perception and understanding may be affected by the positioning of the hardware and the way it signals. To remove as much bias as possible, the indicators were made as "standard" as possible; this was achieved by using indicators of a similar size to those of a motorcycle, flashing at a rate of 2Hz using an amber light. The head behaviour only rotates the internals of the head; again, to attempt to remove bias, the indicators were mounted as close to the head as possible, so as not to cause people to only look for indicators.

Before the experiment, participants were instructed to change positions with the robot in as "casual and natural a manner" as possible. This statement, along with the setup of the laboratory, may have led to differences in the way participants interacted with the robot in terms of the naturalness of interaction in a given direction. However, looking at Figure 17, the average distances kept from the robot during the left and right trials for each behaviour aren't significantly different. A similar limitation was noted by Dondrup et al.; however, their conclusion was also that the


message to participants before the start of the study showed a negligible effect on

results (Dondrup, et al., 2015).

One major limitation came from the speed at which some participants completed the first trial in their test set, due to unfamiliarity with the robot. On two occasions the participant had walked with such velocity that they had passed the robot before it had made its navigational movement. On the other hand, there were occasions when participants walked with so little velocity that the robot had completed its path change before the participant was near it. However, the order of the trials was randomized, so this should have no major swing on one particular test set.

Finally, a large part of the data is based on information from the questionnaire, which uses a Likert Scale. When using a Likert scale it is important to remember that the difference between each value may not be equally weighted in the views of the user, e.g. is the difference between 'Strongly Agree' and 'Agree' the same as the difference between 'Neutral' and 'Disagree'? However, the Likert scale is a commonly used medium that many participants would have seen at some point before. The use of a five point scale also reduces the amount of swing people can feel compared to a larger scale.

5.5 Future Work

The first extension to this work would include attempting the experiment again,

with a more refined testing strategy as learnt from this iteration. The experiment

would need to contain a significant increase in participation, thus allowing for a

greater confidence in any results gained. Another requirement would be a more

diverse participant group, with specific interest regarding variance in self-defined

robotics knowledge.

Another possible extension to the project could be the implementation of additional signal(s). This work looks at using three signals; extensions could include looking at using different signals. One could be a screen with images of an arrow pointing in the intended direction. A second, as learnt from the user survey, could be the addition of an audio signal; this has the potential to open many paths, e.g. what is the most preferable audio signal (also suggested in the study by Peters et al.): "I'm going left now", "please go left human" etc., or the possibility of blending it with a


non-verbal method such as adding clicking to the indicators or a simulated loud motor noise to show turning (Peters, et al., 2011). A third option could be a sideways movement before the intended path change, again similar to that implemented by Peters et al.

Similar to the work of Pacchierotti et al., speed, acceleration and passing distances could be experimented with alongside the indicators or head movement, possibly using a stationary participant (Pacchierotti, et al., 2005). Part of this stems from the results of the minimum distances kept: when the robot was using the indicators, humans kept further away from it, and it would be interesting to know whether they also expect the robot to circumnavigate at a further distance to feel most comfortable whilst it is using indicators, as well as at the speed and acceleration used during the experiment.

This work could be mixed with that of Lichtenthäler et al. to look at navigational signals during a crossing scenario rather than a pass-by (Lichtenthäler, et al., 2013). Similarly, if the experiment took place in a real world environment the factors would be incredibly different and more natural. The experiment by Dondrup et al. gave a scenario of the participants and the robot waiting tables and passing each other in corridors (Dondrup, et al., 2014). Although that experiment was implemented in a laboratory, either the principle of setting the participants a task, or running a real world experiment in a restaurant, could be used. These could be options to gain "real world" experience and review the implementations in a more stressful set of circumstances, to see if the results are the same. A mixture of these two experiments could also be used, in terms of a real world path crossing experiment.

Finally, using principles from the second experiment by Dondrup et al., varying distances could be used for the signals (Dondrup, et al., 2015). Although indicators were favourable in terms of comfort in this experiment, it can't be deduced that at varying distances this would still be the preferred behaviour. Looking at varying distances would allow a stronger conclusion to this experiment.


5.6 Conclusion

The objectives for this project were:

1) Continue literature review to identify three methods of expressing

navigational intent that are legible, safe and effective, as well as formation of

hypotheses for later testing

The completion of this objective can be seen in Section 2, as well as the three

navigational signals spoken about during the course of the experiment: No Signal,

Indicate and Move Head.

2) Implement chosen methods using ROS in Python and testing for

reliability using either robot or a simulation environment

The source code for the methods can be found on the CD included with the report, or online at: https://github.com/LCAS/navigation_intention.

3) Create and run an experiment to test human perception for each

implementation, with appropriate post-experiment survey for data gathering

The experiment was designed and run successfully; all aspects of the experiment can be seen in Section 5. The survey forms can be found in Appendix items a, b and c.

4) Use appropriate quantitative and qualitative methods to review data

acquired from the experiment and test against hypotheses

The data gathered was analysed, reviewed and tested against the hypotheses; these results can be seen in Section 5.3.

The aim of this work was to gauge how much HRI can be enhanced by implementing navigational signals. This work implemented three methods of displaying navigational intent, to investigate how signals of navigational intent affect the comfort felt by people around social robots, as well as the speed and ease of understanding these signals. The data from the experiments supplies strong evidence to support navigational signals benefiting HRSI. The results show comprehensive evidence supporting humans' comfort, speed and ease of understanding with indicators as a mode of navigational intent compared to no signal, as well as minor positive differences for head movement over no signal. Thus it would be argued that navigational signals can have a significant effect on the comfort felt by humans around mobile service robots.


The work attempted to look at one of the key challenges in social robotics as

defined by Kruse et al. in their survey on Human-Aware Robot Navigation; Comfort

(Kruse, et al., 2013). The experiment found that of three implemented behaviours,

No Signal, Indicators and Move Head; Indicators were most preferable. As a side

consequence, the study also found data regarding another challenge defined by Kruse et al.: Naturalness. Four of the ten participants that took part in the trials chose head movement as the behaviour they wished to see made a standard for social robotics, due to it feeling the most "natural" to them.

6 Personal Reflection

At the start of this year I knew nothing about ROS and little about Python or

Ethical and Empirical research. I’ve become a more independent, self-directed and

self-motivated learner, however I’ve also learnt to ask for support if I need it. Finally,

although this was an independent project, I've felt part of a team with my supervisor Marc, Christian and all those doing projects as part of STRANDS.

I was unsure what to do as my Masters Project and after speaking to Marc I

was fairly sure I wouldn’t have the skillset required to undertake this project.

However with his and Christian’s drive, enthusiasm, wisdom and support I feel more

confident than ever. The start of the project was the easiest for me: I knew how to write a proposal, and the theory and psychology behind what I was doing and why were clear to me. Thus for this part I was happy to continue fairly independently, checking in with Marc and Christian every two weeks at our meetings. Once the

practical part began, I carried on independently. This could have meant the failure

of this project; while the support and guidance I got during scheduled meetings was

invaluable, instead of asking questions and checking in in-between meetings I

carried on independently; I had questions, I just waited until the meetings to ask them. Thankfully I became more confident in asking questions, and accepted the fact that I wouldn't be able to continue the way I was. A key principle this work has shown me is that it is okay to ask for help when needed, and people are a lot more responsive to that than to waiting for weeks before asking.

Personally, I believe the project went well and I am proud of the work I

produced. I’ve shown to myself the ability to quickly pick up a new programming


language, operating system and middleware; in the future I will be able to point to how flexible I am in project work using new systems thanks to this project. I believe the implementations of the navigation signals are useful, novel and intuitive, and they led to an excellent overall display by the robot.

For myself, the experiment was where I gained the most experience: I've participated in experiments, but never run one before. I got fantastic support from my colleagues on how to set up and run an ethical experiment. In my opinion the setup

was clear, concise, intuitive and interesting. I believe the participants were treated

ethically and respectfully, as well as being fully aware of their rights. Personally, I

have learnt the difficulty, stress, time and pressure it takes to setup and source

participants in an experiment. The experience I have gained from this will help me if I choose to continue my interest in social robotics. I also feel honoured by the level

of trust shown in me, with being shown how to control the robot for the experiment,

and to record the data I needed, then being allowed to continue my experiments

unsupported.

In September, I had zero idea of what I wanted to do after this year of

University, whether I would get a graduate job, attempt to continue in academia or

go back to working with the disabled. This year has shown me a way I could do

aspects of these together. I’ve gained skills I never thought I would have and totally

enjoyed myself whilst doing it. I would like to finish by saying social robotics is fun.


7 Bibliography

Automation and Control Institute, 2015. STRANDS. [Online]

Available at: http://strands.acin.tuwien.ac.at/

[Accessed 10 October 2014].

3diot, 2014. 3diot. [Online]

Available at: http://www.3diot.net/3-d-printed-companion/

[Accessed 7 February 2015].

Althaus, P. et al., 2004. Navigation for human-robot interaction tasks. ICRA, IEEE.

Bennewitz, M. et al., 2005. Towards a humanoid museum guide robot that

interacts with multiple persons. Humanoid Robots, pp. 418-423.

Bittner, K., 2006. IBM developer works. [Online]

Available at: http://www.ibm.com/developerworks/rational/library/4029.html

[Accessed 20 March 2014].

Breazeal, C. et al., 2005. Effects of nonverbal communication on efficiency and

robustness in human-robot teamwork. Intelligent Robots and Systems, 1(1), pp.

708-713.

Bryant, S., 2014. Financial Times. [Online]

Available at: http://www.ft.com/cms/s/0/4337b9a0-4d6b-11e4-bf60-

00144feab7de.html#axzz3GhTVCMyR

[Accessed 20 October 2014].

Cahoy, E. S., 2015. Pennsylvania State University. [Online]

Available at:

https://www.libraries.psu.edu/psul/researchguides/edupsych/empirical.html

[Accessed 15 January 2015].

Defence, M. o., 2014. DRAGON RUNNER bomb disposal robot. [Online]

Available at: http://www.army.mod.uk/equipment/23256.aspx

[Accessed 19 October 2014].


Delaney, I., 2014. HERE 360. [Online]

Available at: http://360.here.com/2014/03/18/open-source-robotics-foundation/

[Accessed 21 January 2015].

Dondrup, C. et al., 2015. A Computational Model of Human-Robot Spatial

Interactions Based on a Qualitative Trajectory Calculus. Robotics, 4(1), pp. 63-

102.

Dondrup, C., Lichtenthäler, C. & Hanheide, M., 2014. Hesitation signals in

human-robot head-on encounters: a pilot study. Proceedings of the 2014

ACM/IEEE international conference on Human-robot interaction, pp. 154-155.

Dudek, G. & Jenkin, M., 2010. Computational Principles of Mobile Robots

Cambridge, Cambridge: Cambridge University Press.

Duignan, B., 2015. Encyclopaedia Britannica. [Online]

Available at: http://www.britannica.com/EBchecked/topic/186146/empiricism

[Accessed 16 January 2015].

Explorable, 2015. Explorable. [Online]

Available at: https://explorable.com/empirical-research

[Accessed 26 January 2015].

Foote, T., 2013. ROS. [Online]

Available at: http://www.ros.org/news/2013/10/university-of-costra-rica-explores-

aerospace-research.html

[Accessed 22 January 2015].

G4S, 2014. G4S takes on a new recruit – Bob, the autonomous robot. [Online]

Available at:

http://www.g4s.com/en/Media%20Centre/News/2014/06/17/Bob%20the%20auton

omous%20robot/

[Accessed 10 October 2014].

Gaudin, S., 2014. Computer World. [Online]

Available at: http://www.computerworld.com/article/2835223/researchers-to-meet-
with-aid-workers-to-build-ebola-fighting-robots.html

[Accessed 11 October 2014].


Git Hub, 2015. Git Hub. [Online]

Available at: https://github.com/

[Accessed 11 November 2014].

Hall, J. et al., 2014. Perception of own and robot engagement in human–robot

interactions and their dependence on robotics knowledge. Robotics and

Autonomous Systems, 62(3), pp. 392-399.

Haus der Barmherzigkeit, 2014. APA-OTS. [Online]

Available at:

http://www.ots.at/presseaussendung/OTS_20140521_OTS0158/haus-der-

barmherzigkeit-entwickelt-roboter-der-aus-erfahrungen-lernt-bild

[Accessed 9 October 2014].

Hayashi, K., Shiomi, M., Kanda, T. & Hagita, N., 2011. Friendly patrolling: A

model of natural encounters. Los Angeles, CA, USA, Proceedings of Robotics:

Science and Systems.

Intermodalics, 2015. Intermodalics. [Online]

Available at: http://www.intermodalics.eu/services

[Accessed 21 January 2015].

International Federation of Robotics, 2014. International Federation of Robotics.

[Online]

Available at: http://www.ifr.org/service-robots/statistics/

[Accessed 28 March 2015].

Kroll, P., 2004. IBM developer Works. [Online]

Available at: http://www.ibm.com/developerworks/rational/library/4243.html

[Accessed 17 March 2014].

Kruse, T., Basili, P., Glasauer, S. & Kirsch, A., 2012. Legible robot navigation in

the proximity of moving humans. Advanced Robotics and its Social Impacts

(ARSO), 1(1), pp. 83-88.

Kruse, T., Pandey, A. K., Alami, R. & Kirsch, A., 2013. Human-aware robot

navigation: A survey. Robotics and Autonomous Systems, 61(12), pp. 1726-1743.


Langevin, G., 2015. inMoov. [Online]

Available at: http://www.inmoov.fr/

[Accessed 8 February 2015].

Lichtenthäler, C., Peters, A., Griffiths, S. & Kirsch, A., 2013. Social navigation-

identifying robot navigation patterns in a path crossing scenario. Social Robotics,

pp. 84-93.

Lincoln Centre for Autonomous Systems, 2015. Lincoln Centre for Autonomous

Systems. [Online]

Available at: http://robots.lincoln.ac.uk/linda/

[Accessed 5 February 2015].

MetraLabs, 2015. MetraLabs. [Online]

Available at:

http://metralabs.com/index.php?option=com_content&view=article&id=70&Itemid=

64

[Accessed 21 March 2015].

Mori, M., MacDorman, K. F. & Kageki, N., 2012. The uncanny valley [from the

field]. Robotics & Automation Magazine, IEEE, 19(2), pp. 98-100.

Mutlu, B. T., Kanda, T., Ishiguro, H. & Hagita, N., 2009. Footing in human-robot

conversations: how robots might shape participant roles using gaze cues. s.l.,

ACM/IEEE.

Pacchierotti, E., Christensen, H. I. & Jensfelt, P., 2005. Human-robot embodied

interaction in hallway settings: a pilot user study. Robot and Human Interactive

Communication, pp. 164-171.

Peters, A., Spexard, T. P., Weiß, P. & Hanheide, M., 2011. Hey robot, get out of

my way. Behaviour Monitoring and Interpretation. Well-Being.

Project Lifecycle Services Ltd, 2014. Project Lifecycle Services Ltd. [Online]

Available at: http://www.projectlifecycleservicesltd.co.uk/project-management-

services/scrum-project-management.php

[Accessed 7 October 2014].


Robotdom, 2015. Robotdom. [Online]

Available at: http://www.robotdom.com/rovio.html

[Accessed 6 February 2015].

ROS Industrial, 2015. ROS Industrial. [Online]

Available at: http://rosindustrial.org/about/faq/

[Accessed 7 February 2015].

ROS, 2015. Wiki ROS. [Online]

Available at: http://wiki.ros.org/indigo

[Accessed 30 September 2014].

Satake, S. et al., 2009. How to approach humans?-strategies for social robots to

initiate interaction. Human-Robot Interaction (HRI), pp. 109-116.

Saulnier, P., Sharlin, E. & Greenberg, a. S., 2011. Exploring minimal nonverbal

interruption in social hri. RO-MAN, IEEE.

Steinfeld, A. et al., 2006. Common metrics for human-robot interaction. Salt Lake City, UT, USA: In Proceedings of the 1st ACM SIGCHI/SIGART Conference on Human-Robot Interaction (HRI).

Svenstrup, M., Tranberg, S., Andersen, H. J. & Bak, T., 2009. Pose estimation

and adaptive robot behaviour for human-robot interaction. Robotics and

Automation, pp. 3571-3576.

Torta, E., Cuijpers, R. H., Juola, J. F. & van der Pol, D., 2011. Design of robust

robotic proxemic behaviour. Social Robotics, pp. 21-30.

Tranberg Hansen, S., Svenstrup, M., Andersen, H. J. & Bak, T., 2009. Adaptive

human aware navigation based on motion pattern analysis. Robot and Human

Interactive Communication, pp. 927-932.

Turtlebot, 2015. Turtlebot. [Online]

Available at: http://turtlebot.com/

[Accessed 7 February 2015].

Unmanned Vehicle Centre, 2015. Department of Mechanics, Royal Military

Academy of Belgium. [Online]


Available at: http://mecatron.rma.ac.be/Robots/Pioneer-AT-Robot.html

[Accessed 8 February 2015].

VDMA, 2013. World Robotics. [Online]

Available at:

http://www.worldrobotics.org/uploads/tx_zeifr/Charts_IFR__30_09_2014_01.pdf

[Accessed 20 March 2014].

Walters, M. L., Oskoei, M. A., Syrdal, D. S. & Dautenhahn, K., 2011. A long-term

human-robot proxemic study. RO-MAN, pp. 137-142.

Willow Garage, 2015. Willow Garage. [Online]

Available at: https://www.willowgarage.com/pages/software/ros-platform

[Accessed 19 January 2015].

WowWee, 2015. WowWee. [Online]

Available at: http://www.wowwee.com/en/products/tech/telepresence/rovio/rovio

[Accessed 10 Febuary 2015].

Young, J. et al., 2011. How to walk a robot: A dog-leash human-robot. RO-MAN,

IEEE.


8 Appendices

a) Participant Consent Form

Date: ___________ Participant's Name: ______________ I consent to participating in Alyxander D May's, Sinjun Strydom's and Piotr Psuty's respective projects and I acknowledge the following:

I understand that if I feel uncomfortable during the tests I can stop at any time

I understand that I can leave at any time during any of the tests and the data that was collected up to that point will be deleted and not used.

I understand that I can withdraw from the test even after my completion of the test as long as I inform the conductor of the test of my withdrawal within 2 weeks of my participation.

I’ve been verbally told about the tests, their purpose and what I am expected to do in them.

I am happy for Alyxander, Sinjun and Piotr to use the data that is collected from my participation in the tests in their project evaluations.

I understand that personal data collected from my participation will be anonymized

I understand that if I wish, I can ask about the data collected from my participation after the tests and that if I want to then see that data I would be allowed to.

I understand that the data collected from my participation will be deleted after a year, so if I want to request this data from the conductor it would have to be within that timeframe

I understand that I can ask questions before and after each test but not during

I understand that I will be recorded while I do the tests but I understand that the recordings will only be used for the project evaluation and will be destroyed once the project has been completed

With me signing this consent form I agree that I have read, understood and agree to the above points. Participant Signature: ……………………………………………………… Alyxander D May’s signature: ………………………………………….. Sinjun Strydom’s signature: …………………………………………….. Piotr Psuty’s signature: …………………………………………………….


b) Participant Demographic Form

c) Survey Response Form

Robot Navigation Intent Survey

First and foremost, thank you for participating in this study. You have just completed a set

of seven tests using the Scitos G5 robot, the study is looking at human responses to various

navigation signals from a robot. Below is a short survey about the test.

Thank you for your time.

The questions in the first three sections relate to the behaviour listed at the top of the

section.

For the first three sections please circle a number as your answer.

1) No signal, just movement

a) I was able to understand the intention of the robot when using this behaviour.

Strongly Disagree    Disagree    Neutral    Agree    Strongly Agree

1 2 3 4 5


b) I felt comfortable circumnavigating the robot while it was exhibiting this behaviour.

Strongly Disagree    Disagree    Neutral    Agree    Strongly Agree

1 2 3 4 5

c) I was quickly able to understand the intention of the robot.

Strongly Disagree    Disagree    Neutral    Agree    Strongly Agree

1 2 3 4 5

2) Indicators

a) I was able to understand the intention of the robot when using this behaviour.

Strongly Disagree    Disagree    Neutral    Agree    Strongly Agree

1 2 3 4 5

b) I felt comfortable circumnavigating the robot while it was exhibiting this behaviour.

Strongly Disagree    Disagree    Neutral    Agree    Strongly Agree

1 2 3 4 5

c) I was quickly able to understand the intention of the robot.

Strongly Disagree    Disagree    Neutral    Agree    Strongly Agree

1 2 3 4 5

3) Head Movement

a) I was able to understand the intention of the robot when using this behaviour.

Strongly Disagree    Disagree    Neutral    Agree    Strongly Agree

1 2 3 4 5

b) I felt comfortable circumnavigating the robot while it was exhibiting this behaviour.

Strongly Disagree    Disagree    Neutral    Agree    Strongly Agree

1 2 3 4 5

c) I was quickly able to understand the intention of the robot.


Strongly Disagree    Disagree    Neutral    Agree    Strongly Agree

1 2 3 4 5

4) General

a) Of the three behaviours, if one were to become a convention today, which would you

pick and why?

b) Is there a way you would PREFER a robot to signal navigational intention other than

those used in the test? If so, please state what the signal would be and give a brief statement as to why you would prefer it.

d) Goals and Time Frames


Tasks Start Date End Date

Literature Review 24/10/2014 14/11/2014

Implementation Choosing 24/10/2014 14/11/2014

Hypotheses Creation 31/10/2014 05/12/2014

Implementation Development 07/11/2014 19/12/2014

Implementation Testing 21/11/2014 09/01/2015

Experiment Creation 19/12/2014 09/01/2015

Consent Form Creation 02/01/2015 09/01/2015

Experiment 09/01/2015 23/01/2015

Quantitative Analysis 23/01/2015 20/02/2015

Qualitative Analysis 06/02/2015 26/02/2015

Hypotheses Testing 20/02/2015 06/03/2015

Report Writing 21/11/2014 20/03/2015


e) Gantt Chart


f) Risk Assessment and Contingency Plans

Risk: The robot being unavailable for development
Severity: Low. Likelihood: Medium/High.
Contingency Plan: A simulation environment for the robot, which works with ROS and Ubuntu, is always available.

Risk: Methods not being finished in time for the experiment
Severity: High. Likelihood: Low.
Contingency Plan: A Gantt chart will be in place to follow. The date of the experiment will be known, and thus it will be known when implementations must be finished; if they are not, they will be used in their current state.

Risk: Not enough participants for the experiment
Severity: Medium. Likelihood: Medium.
Contingency Plan: Ensuring the experiment is properly arranged and organized, and that participants are informed of where and when they are supposed to be, as well as inviting more participants than needed.

Risk: Robot not working correctly for the experiment
Severity: High. Likelihood: Low.
Contingency Plan: There are four robots in the United Kingdom the same as LINDA; thus if she was unavailable, BOB or another robot would be used instead.

Risk: Not following the Gantt chart or not completing milestones
Severity: Medium/High. Likelihood: Medium.
Contingency Plan: By not over-aiming in the Gantt chart and overprovisioning in terms of the end of the project, so that there is time to finish anything before the deadline.

Risk: Underestimation of the scope and feasibility of the project
Severity: Medium. Likelihood: Low.
Contingency Plan: By having regular meetings with the project supervisor and listening to their feedback and thoughts; this should help to stem any issue arising from being over-ambitious.