
Visual Cues for the Instructed Arrangement of Physical Objects Using Spatial Augmented Reality (SAR)

JESSICA A. TSIMERIS

A thesis submitted for the degree of Bachelor of Computer Science (Honours)

School of Computer and Information Science | University of South Australia

Supervisor: Professor Bruce H. Thomas | [email protected]

November 2010


Disclaimer

I declare that:

- this thesis presents work carried out by myself and does not incorporate without acknowledgment any material previously submitted for a degree or diploma in any university, and

- to the best of my knowledge this thesis does not contain any materials previously published or written by another person except where due reference is made in the text; and all substantive contributions by others to the work presented, including jointly authored publications, are clearly acknowledged.

Adelaide, October 2010.

Jessica Anne Tsimeris


Acknowledgements

My honours year has been incredibly challenging, but the fulfilment that I have felt this year has been amazing. I know that I could not have completed this alone.

I would like to thank Bruce Hunter Thomas and the members of the Wearable Computer Lab at the University of South Australia for helping me with new concepts, giving me timely advice, and providing a welcome distraction at times.

I would also like to thank my friends and family for their unrelenting support, which has continued for the duration of my degree.


Abstract

This dissertation presents a study of the appropriate visual cues to instruct a user on how to arrange physical objects using Spatial Augmented Reality. Spatial Augmented Reality (SAR) systems enhance a user's physical environment via the integration of virtual objects into the physical environment using projectors. SAR differs from traditional Augmented Reality (AR) systems, which typically enhance a user's environment using a head mounted device or a mobile display. The SAR visual cues developed overcome a number of the current limitations found in the area of object manipulation using AR and SAR systems; a major limitation is that object manipulation using AR and SAR has largely focused on virtual objects. The research presented in this dissertation addresses this limitation by providing a method for directly arranging physical objects. To instruct a user on how to arrange physical objects using SAR, two instructions are defined for arranging a physical object into a position: translation and rotation. A set of SAR visual cues has been developed that utilises these two instructions or variations of them. A top down arrangement application has also been developed, in which the user arranges representations of physical objects from a top down view; the resulting arrangement can then be performed on the physical objects using the SAR visual cues. This process can be undertaken locally or remotely, in real time by a single user or as a collaborative task, and the arrangement can also be stored to provide instructions on demand at a later date.

A user study was undertaken to determine the effectiveness of these SAR visual cues when compared to a manual arrangement method and when compared to one another. The results of this user study show that SAR visual cues have been developed that are faster and more accurate than a manual object arrangement method. A revised visual cue has been developed by taking into account the results of the user study, and a pilot study of this new visual cue has indicated speed and accuracy improvements. Furthermore, the SAR visual cues developed have been shown to facilitate the arrangement of physical objects of a variety of sizes, including large objects. Using SAR in this way could be useful for the arrangement of large objects, such as furniture or theatre sets, and when multiple objects are being arranged. The SAR visual cues in isolation are not a unique innovation, but in conjunction with the context in which they are used, namely instructing users on how to arrange physical objects using SAR, a unique approach is presented.


Table of Contents

1 Introduction...............................................................................................................................1

1.1 Motivation.......................................................................................................................2

1.2 Research Question...........................................................................................................4

1.3 Scope...............................................................................................................................4

1.4 Contribution.....................................................................................................................5

1.5 Research Structure...........................................................................................................5

1.6 Dissertation Structure......................................................................................................7

2 Background...............................................................................................................................9

2.1 Classification.................................................................................................................10

2.2 Benefits..........................................................................................................................11

2.3 Object Manipulation......................................................................................................13

2.4 User Interaction.............................................................................................................15

2.5 Tracking.........................................................................................................................17

3 SAR Object Arrangement Techniques.................................................................................19

3.1 Interaction......................................................................................................................19

3.2 Object Arrangement Techniques...................................................................................20

3.2.1 Translation Instruction...........................................................................................21

3.2.2 Rotation Instruction...............................................................................................23

3.2.3 SAR Visual Cues...................................................................................................25

3.2.3.1 Translation First Cue.........................................................................................27

3.2.3.2 Rotation First Cue.............................................................................................30

3.2.3.3 Both Instructions Cue........................................................................................30

3.2.3.4 Circle Cue.........................................................................................................32

3.2.3.5 Wireframe Cue..................................................................................................35

3.2.3.6 Square Cue........................................................................................................38

3.2.4 Discussion..............................................................................................................42


3.2.4.1 Projection upon Tracking Markers...................................................................42

3.2.4.2 Distortion..........................................................................................................43

4 User Study...............................................................................................................................46

4.1 Configuration.................................................................................................................46

4.2 Tasks Undertaken for the User Study............................................................................47

4.3 Structure........................................................................................................................50

4.4 Data Gathering...............................................................................................................50

4.4.1 Quantitative Data Gathering..................................................................................50

4.4.2 Qualitative Data Gathering....................................................................................51

4.4.3 Manual Timer and SAR Timer Data Gathering....................................................52

5 Results......................................................................................................................................54

5.1 User Information...........................................................................................................54

5.2 Quantitative Data...........................................................................................................54

5.2.1 Task Completion Times........................................................................................55

5.2.2 Accuracy................................................................................................................56

5.3 Qualitative Data.............................................................................................................59

5.3.1 Intuitiveness Survey..............................................................................................59

5.3.2 User Opinion Survey.............................................................................................64

5.4 Discussion......................................................................................................................69

5.4.1 Notes on the Manual Timer and SAR Timer Results............................................70

6 Revised Visual Cue.................................................................................................................72

6.1 Analysis.........................................................................................................................72

6.2 Methodology and Implementation................................................................................73

6.3 Discussion......................................................................................................................77

7 Pilot Study...............................................................................................................................79

7.1 Structure........................................................................................................................79

7.2 Preliminary Results.......................................................................................................80

7.2.1 User Information...................................................................................80


7.2.2 Quantitative Data...................................................................................................80

7.2.2.1 Task Completion Times....................................................................................81

7.2.2.2 Accuracy...........................................................................................................82

7.2.3 Qualitative Data.....................................................................................................84

7.2.3.1 Intuitiveness Survey..........................................................................................84

7.2.3.2 User Opinion Survey.........................................................................................86

7.2.4 Discussion..............................................................................................................86

8 Large Object Arrangement....................................................................................................87

9 Implementation.......................................................................................................................95

9.1 Overall Structure...........................................................................................................95

9.2 Existing Components.....................................................................................................96

9.2.1 SAR System...........................................................................................................96

9.2.2 Tracking.................................................................................................................97

9.2.2.1 Infrared Tracking..............................................................................................98

9.3 Developed Components.................................................................................................98

9.3.1 SAR Module..........................................................................................................98

9.3.2 Top Down Arrangement Application..................................................................100

10 Conclusion.............................................................................................................................102

10.1 Future Work.................................................................................................................103

11 References..............................................................................................................................105

Appendix A User Study Survey.............................................................................................111

Appendix B Pilot Study Survey.............................................................................................117

Appendix C Data Tables for Included Results.....................................................................120


Table of Figures

Figure 1 Research Structure................................................................................................................6

Figure 2 A Vase (Left), and the Vase Being Augmented by Projectors (Right) (Raskar, R et al. 2001)...............................................................................................................................10

Figure 3 The Reality-Virtuality Continuum (Drascic & Milgram 1996)..........................................11

Figure 4 Virtual Object Manipulation Using Physical Objects (Fjeld 1999)....................................14

Figure 5 The metaDESK System (Ullmer & Ishii 1997)..................................................................16

Figure 6 Object Arrangement Process...............................................................................................21

Figure 7 Translation Instruction: Starting Position...........................................................................22

Figure 8 Translation Instruction: Position After Part of the Translation has been Completed.........22

Figure 9 Translation Instruction: Final Position................................................................................23

Figure 10 Rotation Instruction: Starting Position..............................................................................24

Figure 11 Rotation Instruction: Position After Part of the Rotation has been Completed................24

Figure 12 Rotation Instruction: Final Position..................................................................................25

Figure 13 Translation First Cue: Translation (the Initial State)........................................................27

Figure 14 Translation Cue: Translation Implementation (the Initial State)......................................28

Figure 15 Translation First Cue: Rotation.........................................................................................29

Figure 16 Translation First Cue: Rotation Implementation..............................................................29

Figure 17 Both Instructions Cue: Scenario of Translation Performed First......................................31

Figure 18 Both Instructions Cue: Scenario of Rotation Performed First..........................................31

Figure 19 Both Instructions Cue: Implementation............................................................................32

Figure 20 Circle Cue: Translation (the Initial State).........................................................................33

Figure 21 Circle Cue: Translation Implementation (the Initial State)...............................................33

Figure 22 Circle Cue: Rotation..........................................................................................................34

Figure 23 Circle Cue: Rotation Implementation...............................................................................35

Figure 24 Wireframe Cue: Translation Implementation (the Initial State).......................................36

Figure 25 Wireframe Cue: Translation (the Initial State).................................................................37

Figure 26 Wireframe Cue: Rotation..................................................................................................37

Figure 27 Wireframe Cue: Rotation Implementation........................................................................38

Figure 28 Square Cue: Translation (the Initial State)........................................................................39

Figure 29 Square Cue: Translation Implementation (the Initial State).............................................40

Figure 30 Square Cue: Rotation........................................................................................................41

Figure 31 Square Cue: Rotation Implementation..............................................................................41

Figure 32 SAR Visual Cue Distortion on the Side of the Physical Object.......................................43

Figure 33 SAR Visual Cue Distortion from the Projector Position..................................44

Figure 34 User Study Setup...............................................................................................................47

Figure 35 Projected Grid Utilised in Task 1......................................................................................48

Figure 36 The Grid used for Task 1 (left) and a Participant Undertaking Task 1 (right).................48

Figure 37 Average Completion Time for all User Study Tasks as Timed with the Manual Timer and the SAR Timer.........................................................................................55

Figure 38 Average x and y Axis Variation for all User Study Tasks................................................56

Figure 39 Average Rotation Variation of all User Study Tasks........................................................58

Figure 40 Translation Instruction Intuitiveness Diagram..................................................................59

Figure 41 Results of the Intuitiveness Survey Regarding the Translation Instruction......................60

Figure 42 Rotation Instruction Intuitiveness Diagram......................................................................60

Figure 43 Results of the Intuitiveness Survey Regarding the Rotation Instruction..........................61

Figure 44 Circle Cue Translation Instruction Intuitiveness Diagram...............................................62

Figure 45 Results of the Intuitiveness Survey Regarding the Circle Cue Translation......................62

Figure 46 Circle Cue Rotation Instruction Intuitiveness Diagram....................................................63

Figure 47 Results of the Intuitiveness Survey Regarding the Circle Cue Rotation..........................64

Figure 48 User Preference Between a Manual Object Arrangement and an Object Arrangement Using SAR......................................................................................................65

Figure 49 User Preference Between Translation First, Rotation First and Both Instructions Cue. . .65

Figure 50 User Preference between Translation First and Circle Cue..............................................66

Figure 51 User Preference Between Wireframe Cue and Wireframe Cue with No Destination......67

Figure 52 User Preference Between Square Cue (Task 8) and Square Cue with No Destination (Task 9)...........................................................................................................67

Figure 53 User Preference Between Square Cues (Task 8 and Task 9) and Wireframe Cues (Task 6 and Task 7)......................................................................................................68

Figure 54 User Preference of Visual Cues that Utilise the Translation and Rotation Instructions...69

Figure 55 Revised Visual Cue: Translation.......................................................................................74

Figure 56 Revised Visual Cue: Translation Implementation............................................................74

Figure 57 Revised Visual Cue: Translation.......................................................................................75

Figure 58 Revised Visual Cue: Rotation...........................................................................................76

Figure 59 Revised Visual Cue: Rotation Implementation.................................................................76

Figure 60 Revised Visual Cue: Rotation...........................................................................................77

Figure 61 Average Completion Time of all User Study Tasks and the Revised Visual Cue (Manual Timer and SAR Timer)...................................................................................81

Figure 62 Average x Axis and y Axis Variation of all User Study Tasks and the Revised Visual Cue..................................................................................................................82

Figure 63 Average Rotation Variation of all User Study Tasks and the Revised Visual Cue..........83


Figure 64 Revised Visual Cue Translation Instruction Intuitiveness Diagram.................................85

Figure 65 Revised Visual Cue Rotation Instruction Intuitiveness Diagram.....................................85

Figure 66 SAR Visual Cue Distortion...............................................................................................88

Figure 67 Large Object Arrangement: Initial Position......................................................................90

Figure 68 Large Object Arrangement: After a Translation...............................................................91

Figure 69 Large Object Arrangement: Translation Complete...........................................................92

Figure 70 Large Object Arrangement: Rotation................................................................................93

Figure 71 Large Object Arrangement: Successfully Arranged.........................................................94

Figure 72 Overall Implemented System Structure............................................................................95

Figure 73 Top Down Arrangement Application Screenshot...........................................................100


1 Introduction

Azuma et al. (2001) describe Augmented Reality by defining an AR system as a system that enhances the user's real world with virtual objects that appear to exist in the real world. According to Azuma et al. (2001), AR systems:

- blend real and virtual objects in a real environment,

- are real time interactive, and

- involve three dimensional registration [thus, the real and virtual objects are aligned, which can be achieved through tracking] (Azuma et al. 2001, p.1).

Raskar et al. (2005) differentiate between Virtual Reality (VR) and AR by stating that in AR, the user's real world environment is not suppressed as it traditionally is in VR. Spatial Augmented Reality (SAR) (Bimber & Raskar 2005) systems enhance a user's physical environment via the integration of virtual objects into the physical environment using projectors (Raskar, R et al. 2001). SAR differs from traditional Augmented Reality (AR) (Azuma, RT 1997; Azuma, R et al. 2001) systems, which typically enhance a user's environment by utilising a head mounted device or a mobile display. SAR has been used in the past for object illumination (Raskar, R et al. 2001), and AR can be used for annotating objects (Feiner, Macintyre & Seligmann 1993). The research described in this dissertation utilises a combination of both to produce SAR visual cues that instruct a user on how to arrange tracked physical objects.

To further assist the arrangement process, a top down arrangement application has been developed from which SAR visual cues for physical object manipulation can be generated. The user arranges representations of physical objects, and the resulting arrangement is performed on the physical objects using SAR visual cues.

As a means of illustrating the concept of this research, the following example is provided. A teacher would like to arrange several furniture objects within a classroom: desks, whiteboards and chairs. The SAR visual cues described within this dissertation can aid with this task. The teacher devises an appropriate layout using representations of these objects within a top down arrangement application running on a computer. Upon completion, the teacher, with the help of their students, is instructed to arrange each of the objects to their correct positions according to the arrangement specified in the top down arrangement application. This is completed using SAR visual cues, which instruct the user to arrange an object to its correct position using instructions for translation and rotation. At any time, the teacher may alter the arrangement in the top down arrangement application and the changes will be incorporated into the current SAR object arrangement process. The teacher may choose to store this configuration and access it when a specific class is being undertaken. It is also possible for another staff member to utilise the top down arrangement application from a remote location, and have the teacher undertake the arrangement task according to the SAR visual cues generated.

This dissertation presents both an exploration of appropriate SAR visual cues that guide users in the arrangement of physical objects, and a detailed analysis of the effectiveness of these SAR visual cues. Two instructions are needed to arrange a physical object into a position: translation and rotation. The SAR visual cues developed utilise these instructions or variations of them.

The SAR visual cues in isolation are not a unique innovation, but in conjunction with the context in which they are used, namely instructing users on how to arrange physical objects using SAR, a unique approach is presented. The SAR visual cues described in this dissertation allow for the direct manipulation of physical objects in a manner that is faster and more accurate than a manual method of physical object arrangement.

1.1 Motivation

The motivation in selecting this topic was sparked by limitations in the area of object arrangement using SAR. The SAR visual cues developed overcome a number of the current limitations found in the area of physical object manipulation using AR and SAR systems. A major limitation is that object manipulation using AR and SAR has largely focused on virtual objects. As investigated in Section 2, AR has been used to enable the manipulation of virtual objects in physical environments (Webster et al. 1996; MacIntyre et al. 2004; Wang & Gong 2007) and the manipulation of virtual objects in virtual environments (Kato, H, Billinghurst, M, Poupyrev, K, Imamoto, K & Tachibana, K 2000; Irawati 2006b). SAR systems have been developed that allow virtual objects to be selected, positioned, rotated and fixed according to the user's manipulation of a corresponding physical object (Fjeld 1999). Thus, previous work has largely focused on the manipulation and arrangement of virtual objects rather than physical objects.


There are many limitations with virtual object arrangement when compared with physical object arrangement. There may be an additional task of applying the virtual arrangement to the corresponding physical objects. Achieving a virtual arrangement and then applying this arrangement to physical objects is likely to be more time consuming and error prone than directly arranging the physical objects. For example, consider a user who would like to arrange furniture in a room of their home and does so by utilising an AR implementation that allows for the arrangement of virtual representations of their furniture in a virtual room (Irawati 2006b). A limitation of this is that once the user devises the arrangement with the virtual representations of the furniture, the user will have to arrange the corresponding physical furniture items; this step is avoided entirely if a direct manipulation of the physical objects is undertaken. The process is also likely to be error prone, as the user is offered little guidance in arranging the physical objects in their correct locations: the furniture has to be arranged using some estimation of its position. The user may obtain a printed layout of the locations of the virtual objects that they have arranged, but this printed layout is difficult to change if the user changes their mind about where they would like the furniture. Alternatively, the AR system may provide measurements to instruct the user on how to arrange the furniture pieces that correspond with the virtual objects, but this is also time consuming and not ideal.

The research presented in this dissertation addresses this limitation, as it provides a method for directly arranging physical objects with SAR assistance. When considering the same example and utilising the system described in this dissertation, the user is able to interact with top down representations of the physical objects, which can be arranged to achieve a suitable layout. The user is then instructed on how to arrange each physical object with the use of SAR visual cues. The physical objects can then be placed in their correct positions more accurately, as this approach removes the need for estimation and is therefore less error prone. Using the system described in this dissertation is also less time consuming, as there is no estimation or calculation to be undertaken by the user; the user must simply follow the instructions from the SAR visual cues. The top down arrangement application can be updated if the user changes their mind about where they would like the furniture, and alterations made in the top down arrangement application are immediately reflected by the SAR visual cues, which are updated dynamically. The lack of consideration for these limitations was an important factor in deciding to undertake the research described in this dissertation.
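To make the dynamic update path described above concrete, the following sketch shows one way a top down arrangement application could push changed target poses to the component that projects the SAR visual cues. This is a minimal illustrative outline only, not the dissertation's implementation; the names TopDownArrangement, CueRenderer and TargetPose are hypothetical.

```python
# Illustrative sketch only: a minimal observer-style link between a top down
# arrangement application and a SAR cue renderer. All class and method names
# (TopDownArrangement, CueRenderer, TargetPose) are hypothetical.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class TargetPose:
    x: float      # target position on the ground plane (metres)
    y: float
    theta: float  # target rotation about the vertical (z) axis, in degrees


class TopDownArrangement:
    """Holds the user's top down layout and notifies listeners when it changes."""

    def __init__(self) -> None:
        self._targets: Dict[str, TargetPose] = {}
        self._listeners: List[Callable[[str, TargetPose], None]] = []

    def subscribe(self, listener: Callable[[str, TargetPose], None]) -> None:
        self._listeners.append(listener)

    def move_object(self, object_id: str, pose: TargetPose) -> None:
        # Called whenever the user drags or rotates a representation in the
        # top down view; every change is pushed straight to the SAR side.
        self._targets[object_id] = pose
        for listener in self._listeners:
            listener(object_id, pose)


class CueRenderer:
    """Stand-in for the SAR module that projects the visual cues."""

    def update_target(self, object_id: str, pose: TargetPose) -> None:
        print(f"Re-projecting cue for {object_id}: "
              f"({pose.x:.2f}, {pose.y:.2f}) at {pose.theta:.1f} deg")


if __name__ == "__main__":
    arrangement = TopDownArrangement()
    renderer = CueRenderer()
    arrangement.subscribe(renderer.update_target)
    # Altering the layout is immediately reflected in the projected cues.
    arrangement.move_object("desk_1", TargetPose(x=1.2, y=0.4, theta=90.0))
```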

SAR visual cues allow physical objects to be arranged by giving a user several instructions to arrange a physical object to a specified position. This is useful when:

1. The alignment of a physical object to a predetermined position is required.

2. There is a lack of physical cues to help position a physical object.


In these cases, having the instructions for the physical object arrangement systematically displayed to the user, and having each instruction verified upon successful completion, may assist the user. Utilising SAR in the way presented in this dissertation could be useful for the arrangement of large objects, such as furniture or theatre sets, or in situations where multiple objects are being arranged. There could also be applications for this research within the areas of games and edutainment. It has also been identified that a SAR implementation such as this could be useful for the remote arrangement of real objects.

1.2 Research Question

The question that this research will answer is:

What are the appropriate Spatial Augmented Reality visual cues to instruct a user on how to arrange real objects?

1.3 Scope

This research is focused on the arrangement of physical objects upon a ground plane. Thus, translation is bound to the x axis and y axis. Similarly, rotation around the z axis is addressed. Future work can be undertaken to allow for more arrangement options. For example, a vertical translation along the z axis would allow physical objects to be stacked, and the ability to perform rotations around the x axis and y axis would allow physical objects to be flipped.
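The sketch below illustrates how an arrangement target restricted to the ground plane could be represented: a translation in x and y plus a rotation about z, decomposed into the two instruction types used throughout this dissertation. It is a hedged, illustrative example rather than the thesis software's actual data structures; the names ObjectPose and arrangement_instructions are hypothetical.

```python
# Illustrative sketch of the constrained arrangement described above:
# objects move on the ground plane (x, y) and rotate about the z axis only.
# Names and structure are hypothetical, not taken from the thesis software.
import math
from dataclasses import dataclass


@dataclass
class ObjectPose:
    x: float       # position on the ground plane (metres)
    y: float
    theta: float   # orientation about the z axis (degrees)


def arrangement_instructions(current: ObjectPose, target: ObjectPose):
    """Decompose an arrangement into the two instruction types used
    throughout the dissertation: one translation and one rotation."""
    dx, dy = target.x - current.x, target.y - current.y
    translation = {
        "distance": math.hypot(dx, dy),          # how far to move the object
        "direction": math.degrees(math.atan2(dy, dx)),
    }
    # Signed rotation about z, wrapped into [-180, 180) degrees.
    rotation = (target.theta - current.theta + 180.0) % 360.0 - 180.0
    return translation, rotation


if __name__ == "__main__":
    chair_now = ObjectPose(x=0.0, y=0.0, theta=10.0)
    chair_goal = ObjectPose(x=1.5, y=0.5, theta=100.0)
    print(arrangement_instructions(chair_now, chair_goal))
```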

The implementation produced allows for multiple physical objects to be arranged if so desired, but the order in which the physical objects are arranged is not taken into consideration. For example, when arranging multiple physical objects, the user may be presented with a situation where the object to be arranged must be translated to a position that intersects other physical objects, making the instruction difficult to perform. More work can be undertaken to address this. For example, a technique may be implemented which generates a path for each object to be arranged, and discerns the order in which the objects should be arranged.

As AR systems involve 3D registration (Azuma, R et al. 2001), this research employs the use of a tracking framework to provide this. As the focus of this research was on the effectiveness of SAR visual cues for physical object arrangement, an existing tracking framework was utilised: this research makes use of the ARToolKitPlus tracking framework (Wagner & Schmalstieg 2007).
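As an informal illustration of how tracked poses could drive the visual cues, the following sketch checks whether an object has reached its target within a tolerance, at which point an instruction could be verified and the next cue presented. It deliberately avoids any ARToolKitPlus-specific calls; the tracked pose is assumed to arrive as a ground-plane position and z rotation from whatever tracking layer is in use, and the tolerance values are invented for the example.

```python
# Illustrative sketch only: deciding when a translation or rotation instruction
# is satisfied, given a tracked pose reported by the tracking layer (for
# example, a fiducial-marker tracker). Tolerances here are invented values.
import math
from dataclasses import dataclass


@dataclass
class Pose2D:
    x: float      # metres on the ground plane
    y: float
    theta: float  # degrees about the z axis


POSITION_TOLERANCE_M = 0.02   # assumed: 2 cm positional slack
ROTATION_TOLERANCE_DEG = 3.0  # assumed: 3 degrees of rotational slack


def translation_remaining(tracked: Pose2D, target: Pose2D) -> float:
    return math.hypot(target.x - tracked.x, target.y - tracked.y)


def rotation_remaining(tracked: Pose2D, target: Pose2D) -> float:
    diff = (target.theta - tracked.theta + 180.0) % 360.0 - 180.0
    return abs(diff)


def instruction_complete(tracked: Pose2D, target: Pose2D) -> bool:
    """True once the object is within tolerance of its target pose, at which
    point the system could verify the instruction and present the next cue."""
    return (translation_remaining(tracked, target) <= POSITION_TOLERANCE_M
            and rotation_remaining(tracked, target) <= ROTATION_TOLERANCE_DEG)
```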


1.4 Contribution

This dissertation makes several contributions to the area of object arrangement using AR. Contributions described in this dissertation include:

- a survey of existing technologies,

- SAR instructions for rotation and translation that can be used in conjunction to arrange a physical object,

- a set of validated and implemented SAR visual cues that utilise the developed instructions or versions of these instructions,

- the integration of the SAR visual cues to form a system that can instruct a user to arrange physical objects,

- a two-dimensional application that has been incorporated into the working system and aids the physical object arrangement process,

- an evaluation of the effectiveness of the SAR visual cues developed,

- a revised SAR visual cue that considers the results of the user study, and

- an adaptation of a SAR visual cue which can arrange large physical objects.

1.5 Research Structure

The research undertaken resembles, but does not strictly adhere to, the traditional waterfall software development process. Figure 1 depicts the structure of the research undertaken.


Figure 1 Research Structure

The first component of the research undertaken was an analysis phase. As part of the analysis phase, a literature review was undertaken, which provided an overview of previous work related to the research at hand. This literature review is presented in Section 2. The information gathered from the literature review uncovered gaps in research in the area and helped form the research question addressed by this dissertation. The research question formed is presented in Section 1.2. The literature review, the devised research question, a proposed methodology and other analytical components were combined to produce a proposal of the research to be undertaken.

After the analysis was complete, development of the system began and further research was undertaken. This development and further research consisted of the tasks of designing the system and its components, implementing these designs and evaluating the system produced. This cycle of tasks was repeated over four iterations. Upon completion of these iterations, a user study was undertaken, and the iteration cycle was then performed another two times to produce an improved SAR visual cue that incorporated the data gathered from the user study. Once the improved SAR visual cue was developed, the iteration cycle was completed a further two times to produce an adapted SAR visual cue for arranging large objects. Once the system development and research had been undertaken, the results were written up and compiled to form this dissertation.


1.6 Dissertation Structure

This chapter provides an introduction to the research undertaken. This is achieved by a statement of the motivation for the research, a definition of the research question that has been addressed, an outline of the scope of the research and a description of the structure of the research undertaken.

Chapter 2 presents a study of the relevant previous work undertaken in the area. This serves to define the context and provide background information on the research undertaken, which supports the validity of the topic chosen. This is achieved by presenting a survey of research that has been previously undertaken within the areas of AR and SAR. This survey classifies the research and states the benefits of utilising SAR in this manner. The study also provides examples of previous AR and SAR research undertaken in relation to object manipulation, user interaction and tracking methods.

Chapter 3 presents the SAR object arrangement techniques that have been developed. To facilitate the arrangement of physical objects using SAR, techniques for user interaction with the developed system have been devised and are presented. Also presented are a description of the translation and rotation instructions developed and a discussion of how they can be used in conjunction with other augmentations to develop a set of SAR visual cues. Furthermore, the SAR visual cues developed are presented, with descriptions and advantages of each. This is followed by a discussion of the SAR visual cues which serves to compare the visual cues and address implementation issues.

Chapter 4 describes a user study that was undertaken to determine the effectiveness of the SAR visual cues developed. This description includes a definition of the tasks undertaken, the structure in which they were undertaken, and the data that was gathered.

Chapter 5 presents the results of the user study. These results include quantitative data in the form of manual timer results, SAR timer results and accuracy results, and qualitative data in the form of intuitiveness survey results and user opinion survey results. This is followed by a discussion which serves to clarify the results and concludes with a recommendation on the most appropriate SAR visual cues.

Chapter 6 describes a revised SAR visual cue that was developed by taking into account the results of the user study. Presented are the methodology undertaken to develop this new SAR visual cue and a discussion of the new SAR visual cue.

Chapter 7 presents details of the pilot study undertaken to gather preliminary results on the effectiveness of the Revised Visual Cue; the pilot study resembles the user study previously undertaken. The preliminary results gathered are also presented and compared with the user study data, to give an indication of the effectiveness of the Revised Visual Cue.

Chapter 8 provides a discussion of a scenario in which a modified SAR visual cue is utilised for the instructed arrangement of a large object, as this scenario varies from the scenarios previously described in this dissertation.

Chapter 9 describes the implementation of the overall system, providing a context of the developed and existing components utilised to implement the functioning system. Details of the existing components utilised are provided, which include a SAR system and a tracking framework. Details are also provided of the new components developed, which include a module for the existing SAR system and a top down arrangement application that aids the object arrangement process.

Finally, Chapter 10 provides a conclusion which serves as a summary and discussion of the research presented in this dissertation. Suggestions for future work that can be undertaken regarding this research are also presented.


2 Background

This section of the dissertation serves to define the context and provide background information for the research described in this dissertation. This is achieved by presenting a survey of research that has been previously undertaken within the area of the research topic. The survey encompasses both AR and SAR applications, as the roots of AR and SAR are related (Bimber & Raskar 2005).

A background of the areas of AR and SAR is presented, including a classification and the benefits of utilising SAR in this manner. Also presented are studies of specific areas of AR and SAR research: object manipulation, user interaction and tracking methods. These specific areas are presented in Sections 2.3, 2.4 and 2.5 respectively.

The research described within this dissertation is an application of SAR. In order to define the context of the research, related areas are addressed. Augmented Reality systems enhance the real world with virtual objects (Azuma, R et al. 2001). To the user, these virtual objects appear to co-exist with physical objects. AR typically utilises head mounted displays to enable the user to view the virtual enhancements (Voida et al. 2005). Thus, the creation of the first head mounted display (Sutherland 1968) has been considered the birth of Augmented Reality (Bimber & Raskar 2005).

Raskar et al. (1998) introduced the paradigm of Spatial Augmented Reality (SAR) over a decade ago. Since this time, many different implementations of SAR have been realised. SAR is recognised as a branch of AR that departs from AR conventions by integrating virtual objects into the user's real world environment, rather than requiring the use of a head mounted display to view enhancements to the real world. This integration can be undertaken using a projector, as in the research described in this dissertation, or by other methods such as the integration of a flat panel display directly into the environment (Raskar, R, Welch, G & Chen, W 1999).

The Shader Lamps system is an early SAR implementation (Raskar, R, Welch, G & Chen, W 1999). Shader Lamps utilised projectors to illuminate physical models of buildings. These buildings, situated on a tabletop, were scaled models, white in colour, upon which textures could be projected. This implementation allowed for a flexible fast-prototyping method for architects and designers. Figure 2 depicts the Shader Lamps system: the physical object is augmented with textures via the use of projectors. AR can also be used for annotating environments (Feiner, Macintyre & Seligmann 1993). The research presented in this dissertation utilises a combination of both.

Figure 2 A Vase (Left), and the Vase Being Augmented by Projectors (Right) (Raskar, R et al. 2001)

The SAR research described in this dissertation utilises both projectors and cameras. Projectors have been used in conjunction with cameras in previous related work (Bimber et al. 2003; Cotting et al. 2004). Bimber et al. (2003) utilise optical see through displays to allow a user to view physical and virtual objects with consistent illumination, made possible by techniques employing the use of projectors and cameras. Cotting et al. (2004) also utilise projectors and cameras in their method of embedding binary patterns into colour images displayed by a projector; the patterns are visible to cameras synchronised with the projectors.

Implementations of augmented desks have been developed which utilise both projectors and cameras to enhance surfaces and physical objects with virtual information (Wellner 1991, 1993; Ishii & Ullmer 1997). The metaDESK utilised cameras and projectors to allow a user to manipulate a map projected upon a desk (Ishii & Ullmer 1997). The DigitalDesk aimed to add electronic features to ordinary paper: a camera enables a user to point at a portion of the desk and have the system read the document located on the desk, and a projector is utilised to display feedback to the user (Wellner 1991). These are fixed setups where the surface being projected upon is in a fixed position. However, research has been undertaken to allow a projection upon the surface of a tracked physical object to remain aligned whilst the object is moved (Ehnes, Hirota & Hirose 2004).

2.1 Classification

A taxonomy for the area of AR has been produced by distinguishing between classes of visual displays (Milgram & Kishino 1994). This taxonomy classifies AR implementations based on several concepts (Milgram et al. 1994). With regard to these classifications, the research described in this dissertation would be considered partially immersive.

Figure 3 The Reality-Virtuality Continuum (Drascic & Milgram 1996)

The research described in this dissertation would be considered to be located towards the left of the Reality-Virtuality Continuum (Drascic & Milgram 1996) depicted in Figure 3, as this research relies heavily on reality whilst applying some augmentations. AR has also been categorised into three different augmentation approaches: user augmentation, object augmentation and environment augmentation (Mackay 1998). User augmentation requires the use of head mounted devices, object augmentation requires the embedding of devices within objects, and environment augmentation requires the projection of images within the environment. This research falls within the environment augmentation approach. AR research can also be classified in terms of the display technique employed: see-through head-mounted displays, projection-based displays and handheld displays. The research described here uses a projection-based display.

2.2 Benefits

The advantages of utilising SAR in preference to other techniques that fall within the areas of AR and VR are well documented (Raskar, R, Welch, G & Chen, W 1999; Raskar, R & Low 2001). As SAR is a hybrid of the physical and virtual environments, using SAR allows for advantages to be drawn from each (Raskar, R et al. 2001).

The research described in this dissertation has the goal of instructing a user to arrange physical objects, and there are techniques in AR, VR and SAR that can allow this goal to be attained. Utilising a physical model, as in SAR, allows for a model that is higher resolution, easier for the user to view and more responsive than a virtual model such as those used in VR. However, VR allows the user to be shown anything (Raskar, R et al. 2001), including scenarios which may not make sense in the physical world. This may be beneficial to some applications; however, it is not a property that is useful to the research described in this dissertation.

A user's physical environment can be enhanced with virtual objects by utilising AR video or optical see through techniques. However, these techniques require the user to wear a head mounted display, which is cumbersome. The ability to enhance the user's environment with a projector, rather than the user having to wear a head mounted display, is the main advantage of SAR over traditional AR (Raskar, R, Welch & Fuchs 1998). Utilising projectors also negates the need to compensate for the user's head movements (Voida et al. 2005), and levels of user fatigue are lower (Bryson et al. 1996). Multiple users are able to view an enhancement to the physical world using SAR, whereas in AR and VR each user must utilise an additional head mounted device.

Using VR, or AR with video see through techniques, requires the user to view the environment at a limited resolution and frame rate. When utilising SAR, however, the visual fidelity of the environment remains unchanged, and only the enhancements integrated into the environment have any such limitations (Raskar, R, Welch, G & Chen, W 1999).

In addition, when compared to techniques that require the use of head mounted displays, the use of SAR allows for large images to be generated which may span a user's field of view. This provides increased integration of virtual objects into the user's real world. SAR can also allow for increased immersion into the environment and improved user interaction (Raskar, R, Welch & Fuchs 1998). SAR allows for virtual objects to be rendered closer to their intended real world location, which allows for improved accommodation for the human eye (Raskar, R, Welch, G & Chen, W 1999).

Mobile Augmented Reality is another branch of AR, which utilises mobile devices. Mobile AR restricts the resolution of the user's world to the resolution of the device being used (Feiner et al. 1997). Feiner et al. (1997) state that:

“We feel that augmented reality systems will become commonplace only when they truly add to reality, rather than subtract from it” [as the limited resolution of mobile devices does] (Feiner et al. 1997, p.5).

The disadvantage of being restricted to a low resolution device is avoided when using projectors in SAR. As well as allowing for a higher resolution, SAR also negates the need for a user to hold an interaction device. As mentioned previously regarding AR, multiple users are able to view an enhancement to the physical world using SAR, whereas in Mobile AR multiple mobile devices may be needed.


2.3 Object Manipulation

Previous research in the area of object manipulation in AR environments has identified a lack of research in the area of physical object manipulation (Kitamura, Ogata & Kishino 2002). Physical objects have, however, been used to achieve virtual object manipulation (Gorbet, Orth & Ishii 1998; Fjeld 1999; Kitamura, Itoh & Kishino 2001; Rekimoto, Ullmer & Oba 2001; Marner & Thomas 2010).

Previous AR work that does involve both physical and virtual objects focuses on applications that vary from the research described in this dissertation. One such implementation applies limited physics laws to virtual objects so that consolidated manipulation techniques can be applied to both physical and virtual objects (Kitamura & Kishino 1997). Another implementation involves occlusion and collision calculation with both physical and virtual objects (Breen et al. 1996).

Users have been equipped with a see-through HMD, allowing them to have their view of the real world enhanced with virtual objects which can be manipulated (Webster et al. 1996); the objects were visualised on a virtual sphere surrounding the user. A similar implementation was developed whereby users equipped with HMDs were able to arrange virtual objects within such spaces, but were not restricted to a sphere (Wang & Gong 2007). The Designer's Augmented Reality Toolkit (DART) was devised to allow the arrangement of virtual objects upon the user's view of their physical environment (MacIntyre et al. 2004). A distributed AR system was also developed whereby multiple users were able to arrange virtual objects upon a video stream (Ahlers et al. 1995).

A SAR system entitled BUILD-IT was developed (Fjeld 1999). This tabletop system allowed users to interact with real objects to control a virtual world, as depicted in Figure 4. Two views of this world were projected: one on the table and one on the wall. The users were able to manipulate and control the views using several special real objects. A virtual object was selected when a user placed a real object at the position of a virtual object; the virtual object could then be positioned, rotated and fixed by the user manipulating the real object. Subsequent SAR systems have been developed that implement refinements to such a system, including a restructuring such that the user no longer has to turn their head to view a wall projection (Gausemeier, Fruend & Matysczok 2002).


Figure 4 Virtual Object Manipulation Using Physical Objects (Fjeld 1999)

The Multiuser Augmented Reality Environment (MARE) tabletop AR system was developed

(Grasset & Gascuel 2002). It allows multiple users with HMDs to manipulate virtual objects using

real objects in a shared space. As HMDs were used, this implementation allowed for research into a

private space for each user to keep their own private virtual objects, and measures for access

controls. AR systems such as The Studierstube Project (Schmalstieg, D, Fuhrmann, A, Hesina, G,

Szalavari, Z, Encarnacao, L.M, Gervautz, M & Purgathofer, W 2002) allow interaction with

multiple users. Using SAR with projectors does not allow for each user to see a different display.

However, multiuser interaction and viewing can be achieved using SAR with projectors, without

each user requiring a HMD or a mobile device.

Implementations have been developed to allow a user equipped with a HMD to manipulate virtual

furniture in a virtual room (Kato, H, Billinghurst, M, Poupyrev, K, Imamoto, K & Tachibana, K

2000; Irawati 2006b, a). Irawati et al. (2006) utilised a multimodal user interface incorporating

speech and gestures with a physical paddle to arrange the virtual furniture objects. Similar

implementations exist which do not utilise multimodal input (Kato, H, Billinghurst, M, Poupyrev,

K, Imamoto, K & Tachibana, K 2000). The research described in this dissertation addresses a

similar problem; the arrangement of objects. The research proposed in this dissertation performs a

direct manipulation which will integrate the real and virtual worlds further by using real objects at

their typical scale, rather than scaled down virtual representations.


2.4 User Interaction

Shortcomings have been identified in user interface interaction in applications that utilise the

physical environment (Rekimoto & Nagao 1995). Rekimoto et al. (1995) stated that the focus is on

the user’s interaction with the computer device, rather than the human’s interaction with the

physical environment.

Research has been undertaken into the integration of information with the user’s

environment within AR user interface implementations (Höllerer 2001). One of the outcomes of

this research was the establishment of several methods in which an AR user interface can be

managed. An AR user interface can be managed by:

filtering information to show only the relevant information,

formatting the information suitably, and

correctly managing virtual objects [ensuring that virtual objects are laid out appropriately

in the real environment] (Höllerer 2001).

These methods were taken into account for the development of the SAR visual cues presented in

this dissertation. However, the user interfaces developed by Höllerer et al. (2001) resemble the user

interfaces of two dimensional applications and there may be more research that can be undertaken

to make the user interfaces more suited to three dimensional applications. Many early AR

implementations focused on registration with the environment and correct display of information,

and less on how users interact with the system. Thus, these systems also had user interfaces that

were based on the two dimensional desktop metaphor (Azuma, R et al. 2001).

Research into interaction techniques with projector based AR implementations has been undertaken

(Raskar, R & Low 2001). Physical object illumination has been achieved upon moving objects and

this has allowed for the development of a three dimensional painting system which allows a user to

paint textures with a tracked paintbrush upon a tracked moving object that is held by the user. The

paintbrush and object are projected upon with two projectors in real time and are tracked

independently by sensors that are attached to them. The user is able to change the virtual colour and

shape of the brush. The resulting coloured model is able to be stored to work on at a later date.

Physical objects have been used to facilitate user interaction with virtual information (Ishii &

Ullmer 1997; Szalavári & Gervautz 1997). The use of physical objects in this way has been

referred to as Tangible User Interfaces (Ishii & Ullmer 1997). The aforementioned metaDESK

allows interaction with a map by moving several real objects which correspond with landmarks on

the map, and a Luminous Table has been developed which integrates 2D drawings and real objects,


and can be used to complete urban design tasks (Ishii et al. 2002). Figure 5 depicts an overview of

the components that form the metaDESK.

Figure 5 The metaDESK System (Ullmer & Ishii 1997)

The Personal Interaction Panel was introduced and developed as an interaction device that is an

extension of a physical notebook, and it allows for a user to perform AR manipulation tasks

(Szalavári & Gervautz 1997; Schmalstieg, D, Fuhrmann & Hesina 2000). Building upon the notion

of Tangible User Interfaces, occlusion techniques have also been utilised in AR user interfaces.

Conventional interaction methods such as the pushing of a button can be achieved with the use of

occlusion (Lee 2004). When the user physically pressed the button, a marker was covered,
indicating that the button had been pushed.
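The occlusion technique described above can be summarised with a short sketch. The following is a minimal illustration only, assuming the tracking layer reports each frame whether the button’s marker is visible; the class and parameter names are hypothetical and not part of any real toolkit API.

```python
class OcclusionButton:
    """Treat a tracked marker as a button: covering the marker counts as a press."""

    def __init__(self, hold_frames=5):
        self.hold_frames = hold_frames  # consecutive hidden frames required for a press
        self.hidden_count = 0

    def update(self, marker_visible):
        """Call once per camera frame; returns True on the frame a press is detected."""
        if marker_visible:
            self.hidden_count = 0
            return False
        self.hidden_count += 1
        # Requiring several consecutive hidden frames filters out momentary tracking dropouts.
        return self.hidden_count == self.hold_frames
```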

A system has been devised whereby a user is instructed on how to play billiards with the use of

projected visual cues (Suganuma et al. 2008). A camera enables the detection of the location of the

billiard balls and a projector then projects instructions to the user on how the white ball should be

hit and the direction the ball will travel. Further instructions are provided to allow the user to

assume a correct posture. One of these instructions consists of a projected rectangle in which the

user should place one of their hands. Another of these instructions consists of a projected circular

shape, which prompts the user to move their head to a position where the circular shape is

occluded.


2.5 Tracking

The research described in this dissertation requires physical objects to be tracked in order to allow

for their arrangement. The research does not aim to develop a unique tracking method; rather an

existing tracking framework is utilised as a component that provides infrastructure.

The ARToolKit tracking framework (Kato, H & Billinghurst 1999) has been used to allow users to

create and manipulate 3D models by using a keyboard and mouse in a tracked workspace (Do &

Lee 2010). The placement of markers determined the action that would be performed such as

creating, selecting and editing a 3D model. The aforementioned study into interaction with tangible

user interfaces made use of ARToolKit for implementing an occlusion technique for the pressing of

a virtual button (Lee 2004). The accuracy of ARToolKit has been measured over distances of one

to three metres (Malbezin, Piekarski & Thomas 2002). This was undertaken by comparing the

physically measured distance from a camera to a marker with the distance extracted from

ARToolKit. Results show an error in the marker's reported position that increases as the

distance between the camera and the marker increases. The error also varies in the x axis and the y

axis. However, a comparison has been undertaken that determined that, compared with several

other tracking frameworks, ARToolKit provided the best performance for detecting a marker from

a great distance (Zhang, Fronz & Navab 2002). The outcome of their research indicated that the

best existing tracking framework to use depended on the given situation, but ARToolKit performed

well in most situations.
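The accuracy measurement described by Malbezin, Piekarski and Thomas (2002) can be illustrated with a small sketch: the camera-to-marker distance reported by the tracker is compared against a physically measured distance. This is a minimal sketch only; the (x, y, z) translation in millimetres and the function names are assumptions for illustration, not part of the ARToolKit API.

```python
import math

def tracked_distance_mm(marker_translation):
    """Euclidean camera-to-marker distance from a reported (x, y, z) translation in mm."""
    x, y, z = marker_translation
    return math.sqrt(x * x + y * y + z * z)

def distance_error_mm(measured_mm, marker_translation):
    """Signed error between a physically measured distance and the tracked one."""
    return tracked_distance_mm(marker_translation) - measured_mm

# Example: a marker tape-measured at 2000 mm from the camera.
print(distance_error_mm(2000.0, (120.0, -80.0, 1975.0)))
```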

Other tracking methods have been used; the aforementioned BUILD-IT system used bricks covered

in paper that could reflect infrared light, which could then be detected by an infrared camera (Fjeld

1999). It has been claimed that there is an apparent recent move towards utilising model based

tracking in AR (Zhou, Duh & Billinghurst 2008). Model-based tracking systems use

distinguishable features gathered from a model of the object to implement tracking. A method of

model-based tracking has been devised which allows camera pose to be calculated from several

different features of models such as lines, cylinders and spheres. This research focused on

achieving tracking with a low error rate, which was aided by having an increased knowledge of

objects present in the scene.

Research has been undertaken into using natural features of objects for tracking, instead of

introducing artificial markers to the scene (Park, You & Neumann 1999). This method of tracking

requires the camera pose calculation to be undertaken from known features, but then the system is

able to dynamically add features such as textures and corners detected in the scene to aid in

updating the pose calculation.


Tracking has also been achieved using encoded LEDs instead of paper based markers (Naimark &

Foxlin 2005). The use of LED tracking is beneficial in several situations, such as for applications

on a small scale which require high precision tracking, and for tracking in the dark.

In recent years, there has been a trend of research focused on vision-based tracking techniques

rather than sensor-based tracking techniques (Zhou, Duh & Billinghurst 2008). However, vision-based
techniques are more computationally demanding, and often have errors caused by factors

such as occlusion (You, Neumann & Azuma 1999). In one of the only sensor-based, non-camera

based tracking techniques to be developed in recent years, ultrasonic sensors are used for tracking

indoors with the Bat system (Newman 2001). When utilising the Bat system, users are fitted with

small wireless devices called Bats. Receivers are located in known, fixed positions on ceilings and

the Bats emit signals which are detected by these receivers. The distance from the Bats to the

receiver can be calculated from the time taken for the signals to be received.
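The time-of-flight principle behind the Bat system reduces to a one-line calculation; the speed-of-sound constant and the example timing below are illustrative assumptions rather than figures from Newman (2001).

```python
SPEED_OF_SOUND_M_PER_S = 343.0  # approximate speed of sound in air at room temperature

def bat_to_receiver_distance_m(time_of_flight_s):
    """Distance from a Bat to a ceiling receiver, from the ultrasonic time of flight."""
    return SPEED_OF_SOUND_M_PER_S * time_of_flight_s

# Example: a pulse received 5 ms after emission travelled roughly 1.7 m.
print(bat_to_receiver_distance_m(0.005))
```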


3 SAR Object Arrangement Techniques

This section presents the SAR visual cues developed. The majority of the SAR visual cues

implement a Translation Instruction and a Rotation Instruction and follow a developed process for

arranging objects to a desired position, all of which are described.

Translation and rotation have been identified as instructions that are to be performed to arrange a

physical object to a desired position, and thus a Translation Instruction and a Rotation Instruction

have been developed. These two instructions act as building blocks; they are utilised for the

majority of the SAR visual cues, either on their own or incorporated with other augmentations. The

Translation Instruction is described in Section 3.2.1 and the Rotation Instruction is described in

Section 3.2.2. A process has been developed to instruct the arrangement of a physical object to a

desired position by a series of moves utilising the Translation and Rotation instructions; this

process is described in Section 3.2 and the majority of the SAR visual cues developed follow this

process.

A set of SAR visual cues have been developed, each providing various advantages and a different

user experience; they are described in Section 3.2.3. The intention is to develop a variety of SAR

visual cues with different strengths, and then assess their performance with the user study described

in Chapter 4. It is envisaged that these SAR visual cues can be applied to different physical objects

in a number of different application domains, such as manufacturing, warehousing or theatrical

sets. A discussion of the SAR visual cues developed is provided in Section 3.2.4. When

considering the theoretical ideas of the SAR visual cues versus their implementations, various

differences are evident which are also discussed. This section begins with a description, in Section

3.1, of user interaction techniques which enable users to arrange physical objects using SAR.

3.1 Interaction

A top down arrangement application has been developed, from which SAR visual cues for physical

object manipulation can be generated. The application provides a user interface in which the user


arranges representations of physical objects from a top down view. The resulting arrangement can

then be performed on the physical objects using SAR visual cues.

This process can be undertaken in real time by one user or as a collaboration task. This process can

also be stored to provide instructions on demand at a later date. When collaborating on a task, a

remote user can provide arrangement instructions of objects to a user physically present with the

physical objects. For example, the remote user can specify the desired arrangement of the physical

objects in the top down arrangement application, and then have this arrangement carried out by a

second party who follow the resulting SAR visual cues to achieve the arrangement that has been

specified. The instructions can also be stored for use at a later date. For example, a user arranging

theatre sets may want to store a set of specific arrangements for the objects that can be utilised

when the need arises.
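To make the idea of storing an arrangement for later use concrete, the sketch below saves and reloads a list of target poses as JSON. The file format, field names and example values are hypothetical; they are not the representation used by the implemented system.

```python
import json

def save_arrangement(path, targets):
    """targets: a list of dicts such as {"object": "couch", "x": 120, "y": 340, "theta": 45}."""
    with open(path, "w") as f:
        json.dump(targets, f, indent=2)

def load_arrangement(path):
    """Reload a previously stored arrangement so its SAR visual cues can be replayed."""
    with open(path) as f:
        return json.load(f)

# Example: a theatre-set arrangement stored now and reused when the need arises.
save_arrangement("act_one.json", [{"object": "couch", "x": 120, "y": 340, "theta": 45}])
print(load_arrangement("act_one.json"))
```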

3.2 Object Arrangement Techniques

The overall system assumes that the objects to be arranged are constrained to the ground plane.
This assumption restricts the movement of physical objects to a single planar surface, such as a
floor or table-top, but allows for simpler SAR visual cues to be provided to the user. This
design decision was made to support the arrangement of large physical objects such as

furniture or theatre sets. The rotation instructions are performed within the tracked physical

object’s local coordinate system whereas the translation instructions are performed within the

world coordinate system.

The instructions are carried out systematically; the user begins with one instruction, a translation or

a rotation. For example, if the translation is undertaken first, then the Translation Instruction is

displayed to guide the user to move the physical object to a position that corresponds with

the desired final translation. Once the user has moved the physical object so that the translation is

correct, the Translation Instruction ceases to be displayed. The Rotation Instruction is then

displayed to guide the user to rotate the physical object to a position that corresponds with the

desired final rotation. Once the user has moved the physical object to a position that corresponds

with the desired final translation and rotation, the physical object has been successfully arranged.

Once both the rotation and translation instructions have been completed successfully, the SAR

visual cue will no longer be displayed, indicating that the physical object is successfully arranged.

A state machine of the process can be seen in Figure 6. Once the physical object has been arranged,

the next physical object can be arranged if there are multiple objects to consider.


Figure 6 Object Arrangement Process
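The process in Figure 6 can be sketched as a small state machine for a cue that performs translation first. This is a minimal sketch only: the state names are illustrative, and the two booleans are assumed to be supplied each frame by the tracking layer rather than computed here.

```python
TRANSLATE, ROTATE, ARRANGED = "show translation cue", "show rotation cue", "arranged"

def step(state, translation_correct, rotation_correct):
    """Advance the arrangement process by one tracked frame."""
    if state == TRANSLATE and translation_correct:
        return ROTATE        # the Translation Instruction ceases to be displayed
    if state == ROTATE and translation_correct and rotation_correct:
        return ARRANGED      # no cue is displayed: the object is successfully arranged
    return state
```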

Section 3.2.1 describes an instruction for the translation of a tracked physical object. Similarly,

Section 3.2.2 describes an instruction for the rotation of a tracked physical object. The utilisation of

these instructions or variations of these instructions, to arrange an object is referred to as a SAR

visual cue. A discussion of the SAR visual cues developed is provided in Section 3.2.3.

3.2.1 Translation Instruction

Translation is achieved by instructing the user to move a tracked physical object on a ground plane,

along the x and y axes, in a z-axis-up world. The translation of the tracked physical object is
undertaken within the world coordinate system. A full range of translations is possible along the x

and y axes, with a portion of the projection following the tracked physical object. Figure 7, Figure

8 and Figure 9 depict, from a top down view, the process of undertaking a translation using the

translation instruction. The tracked physical object is represented by a solid shape whilst the

desired final position of the object is represented by a dotted shape.


Figure 7 Translation Instruction: Starting Position.

The translation instruction shows a circular projection upon the tracked object to be arranged, and

the same circular shape projection at the desired final position of the object. The two circular

shapes are connected with a line which serves to show the user a path to the circle located at the

physical object’s desired final position. This line is useful for situations when the desired final

position is difficult to locate. For example, the desired final position may be located at a distance

from the object’s current position. It is also useful in situations when the shape of the physical

object can obscure the projection at the desired final position.

Figure 8 Translation Instruction: Position After Part of the Translation has been Completed.

As the user translates the tracked physical object, the projection displayed is updated in accordance with the object’s new position.


Figure 9 Translation Instruction: Final Position.

The user moves the tracked object until the circular shapes are aligned; once the object has been

successfully translated, the instruction will cease to be displayed. This is depicted in Figure 9.
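A minimal sketch of the test that ends the Translation Instruction is given below: the two projected circles are treated as aligned once the tracked object’s centre is within a small distance of the desired final position. The tolerance value is an assumption for illustration, not the threshold used by the implemented system.

```python
import math

def translation_complete(object_xy, target_xy, tolerance_units=10.0):
    """True once the tracked object's centre lies within `tolerance_units` of the target."""
    dx = target_xy[0] - object_xy[0]
    dy = target_xy[1] - object_xy[1]
    return math.hypot(dx, dy) <= tolerance_units
```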

3.2.2 Rotation Instruction

Rotation is achieved by instructing the user to rotate a tracked physical object on a ground plane,

around a centre axis. The rotation of the tracked physical object is in the object’s local coordinate

system. A 360 degree range of rotation can be achieved. Translating the object whilst the rotation

instruction is displayed does not cause a conflict, as the rotation projection follows the tracked

object.

Figure 10, Figure 11 and Figure 12 depict, from a top down view, the process of undertaking a

rotation using the rotation instruction. The tracked physical object is represented by a solid shape

whilst the desired final position of the object is represented by a dotted shape.


Figure 10 Rotation Instruction: Starting Position.

The rotation instruction developed follows a similar structure to that of the translation instruction.

The rotation instruction shows two lines originating from the centre of the tracked physical
object. One of the lines is projected along the x axis of the object, and follows this axis in
accordance with the rotation that the user is performing. The other line is immovable and represents

the destination rotation which can be achieved when the movable line is aligned with the

immovable line.

In Figure 10 and Figure 11, the immovable line is represented by the upper line of the projection in

the diagrams. In Figure 12, the lines are indistinguishable as they overlap. A circular projection is

projected at the end of each of the two lines. The appearance of the lines from the centre of the

object can be likened to the hands of a clock.

Figure 11 Rotation Instruction: Position After Part of the Rotation has been Completed.


As the user rotates the tracked physical object, the projection displayed is updated in accordance with the object’s new position.

Figure 12 Rotation Instruction: Final Position

The user rotates the tracked object until the movable line is aligned with the immovable line, which

is also the case when the circular shapes are aligned; once the object has been successfully rotated,

the instruction will cease to be displayed. This is depicted in Figure 12.
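The corresponding test for the Rotation Instruction can be sketched in the same way: the movable line is treated as aligned with the immovable line once the angular difference, wrapped into the range [-180, 180) degrees, falls within a tolerance. Again, the tolerance is an illustrative assumption.

```python
def rotation_complete(object_theta_deg, target_theta_deg, tolerance_deg=5.0):
    """True once the object's rotation is within `tolerance_deg` of the desired rotation."""
    diff = (target_theta_deg - object_theta_deg + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
    return abs(diff) <= tolerance_deg
```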

3.2.3 SAR Visual Cues

A set of SAR visual cues have been developed which allow for different experiences when

arranging tracked physical objects. The majority of the SAR visual cues utilise the Translation

Instruction as described in Section 3.2.1 and the Rotation Instruction as described in Section 3.2.2,

however the Circle Cue implements variations of the translation and rotation instructions. Many of

the SAR visual cues developed have additional augmentations projected with the goal of aiding the

physical object arrangement process.

The SAR visual cues offer different advantages, which are summarised in Table 1; for

example, the Circle Cue allowed for the most simplistic projection which may be advantageous in

some situations, but this simple projection also conveyed the least information to the user. The

Wireframe Cue allowed for a larger amount of information to be conveyed to the user; wireframe

projections were present which the user could align. The presence of these wireframe projections
allows rotation information to be conveyed at all times through the orientation of the wireframe
projections; however, the use of the wireframe projections may result in the SAR visual cue
seeming cluttered.


SAR Visual Cue and Description | Advantages | Disadvantages

Translation First Cue: Utilises only the Translation and Rotation Instructions. | Simplistic. | No further projections are displayed, so no extra information is conveyed.

Rotation First Cue: Utilises only the Translation Instruction and Rotation Instruction. | Simplistic. | No further projections are displayed, so no extra information is conveyed.

Both Instructions Cue: Utilises only the Translation Instruction and Rotation Instruction, displayed at the same time. | User can perform the instructions in either order. | More cluttered than many of the other SAR visual cues.

Circle Cue: Translation and rotation are achieved with only the use of a circle projection. | Very simplistic. | Conveys less information than the other SAR visual cues.

Wireframe Cue: The Translation and Rotation Instructions are used, along with two wireframe projections that can be aligned. | The wireframe projections convey both translation and rotation information at all times, and this is shown in addition to the Translation and Rotation Instructions. | More cluttered than many of the other SAR visual cues.

Square Cue: The Translation and Rotation Instructions are used, along with two square projections that can be aligned. | The square projections convey both translation information and some rotation information at all times, and this is shown in addition to the Translation and Rotation Instructions. | Does not convey as much rotation information as the Wireframe Cue.

Table 1 SAR Visual Cue Overview of Advantages and Disadvantages

The intention is to develop a variety of SAR visual cues with different strengths, and then assess

their performance with the user study described in Chapter 4; a process which results in a

recommendation of the best SAR visual cue developed. In addition to the visual cues described, a

revised SAR visual cue has been developed, taking into consideration the results of the user study.

Information regarding the revised SAR visual cue is presented in Section 6.


3.2.3.1 Translation First Cue

The Translation Instruction and Rotation Instruction are utilised, with no further augmentations

present. The Translation Instruction is projected in yellow and the Rotation Instruction is projected

in green. The SAR visual cue follows the process described by Figure 6 with the Translation

Instruction occurring first.

When compared to many of the other SAR visual cues developed, such as the Wireframe Cue

described in Section 3.2.3.5, the Translation First Cue achieves a higher level of simplicity as no

further augmentations are provided to the SAR visual cue; only the Translation and Rotation

Instructions are utilised. However, the Translation First Cue is not as simplistic as the Circle Cue

described in Section 3.2.3.4.

Figure 13 Translation First Cue: Translation (the Initial State)

Figure 13 depicts the initial state of the physical object. The implementation of this initial state is

presented in Figure 14. The initial state of the Translation First Cue is the Translation Instruction

presented in Section 3.2.1.


Figure 14 Translation First Cue: Translation Implementation (the Initial State)

The Translation Instruction is displayed to guide the user to move the physical object to a position

that corresponds with the translation coordinates of the desired final position. In the scenario

presented by Figure 14, the user must translate the physical object towards the camera, so that the

circles are aligned. This can be achieved by following the line projection. Once the user has moved

the physical object so that the translation is correct, the Translation Instruction ceases to be

displayed.


Figure 15 Translation First Cue: Rotation

Figure 15 depicts the state of the physical object once the initial translation has been completed and

rotation is now being performed. The implementation of this state is presented in Figure 16. This

state corresponds with the Rotation Instruction presented in Section 3.2.2.

Figure 16 Translation First Cue: Rotation Implementation


The Rotation Instruction is now displayed to guide the user to rotate the physical object to a

position that corresponds with the desired final rotation. In the scenario presented by Figure 16, the

user must rotate the object further clockwise, so that the two projected lines are aligned. Once the

user has moved the physical object to a position that corresponds with the translation and rotation

of the desired final position, the SAR visual cue will no longer be displayed, indicating that the

physical object has been successfully arranged.

3.2.3.2 Rotation First Cue

The Rotation First Cue is similar to the Translation First Cue; however, the deviation is that the

Rotation Instruction is displayed first, followed by the Translation Instruction. This order of the

instructions is continued until the physical object is correctly aligned. The Rotation First Cue has

the advantages specified for the Translation First Cue in Section 3.2.3.1. The Rotation First Cue

and Translation First Cue were both implemented to provide some variety and discover if one order

of the instructions is more effective than the other.

3.2.3.3 Both Instructions Cue

The Both Instructions Cue is similar to the aforementioned SAR visual cues; however, the variation

is that both the Translation Instruction and Rotation Instruction are displayed at the same time.

The Both Instructions Cue is more cluttered than the Translation First Cue described in Section

3.2.3.1 and Rotation First Cue described in Section 3.2.3.2 as it displays both of the instructions at

the same time. However, this allows for more flexibility in the method that the user undertakes

when arranging the object. Displaying both translation and rotation information at the same time

offers the advantage of the user being able to choose which instruction to perform first as in some

situations it may be better to perform the translation first and in other situations it may be more

efficient to perform the rotation first.


Figure 17 Both Instructions Cue: Scenario of Translation Performed First

For example, the user may choose to undertake the translation of the physical object first. In this

case, the user will utilise the Translation Instruction to move the physical object to a position that

corresponds with the desired final translation. Once the user has moved the physical object so that

the translation is correct, the user can then rotate the object according to the Rotation Instruction.

The Translation Instruction continues to be displayed although the translation has been successfully

completed. This allows the user to correct the translation of the object if the translation becomes

incorrect whilst the rotation is being performed. This scenario is depicted in Figure 17.

Figure 18 Both Instructions Cue: Scenario of Rotation Performed First

Alternatively, the user may choose to undertake the rotation of the physical object first. In this case,

the user will utilise the Rotation Instruction to rotate the physical object to a position that

corresponds with the desired final rotation. Once the user has rotated the physical object so that the


rotation is correct, the user can then translate the object according to the Translation Instruction.

The Rotation Instruction continues to be displayed although the rotation has been successfully

completed. This allows the user to correct the rotation of the object if the rotation becomes

incorrect whilst the translation is being performed. This scenario is depicted in Figure 18.

Figure 19 Both Instructions Cue: Implementation

Figure 19 presents the implementation of the Both Instructions Cue. In this scenario the user has

rotated the physical object such that the rotation is almost correct. This is evident by the two lines

of the Rotation Instruction being almost aligned. In order to successfully arrange the physical

object, the user must now translate the object in a south westerly direction and ensure that the

rotation is aligned.

Once the user has moved the physical object to a position that corresponds with the desired final

translation and rotation, the SAR visual cue will no longer be displayed, indicating that the physical

object has been successfully arranged.
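The display logic of the Both Instructions Cue reduces to keeping both instructions visible until the object is fully arranged, which is what allows either component to be corrected if it drifts. The sketch below is illustrative only; the function name and booleans are assumptions.

```python
def visible_cues(translation_correct, rotation_correct):
    """Return the instructions to project this frame for the Both Instructions Cue."""
    if translation_correct and rotation_correct:
        return []  # fully arranged: the SAR visual cue is no longer displayed
    # Both instructions stay visible so the user can fix whichever component drifts.
    return ["Translation Instruction", "Rotation Instruction"]
```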

3.2.3.4 Circle Cue

The Translation Instruction and Rotation Instruction are not utilised for this SAR visual cue. The

SAR visual cue follows the process described by Figure 6 with translation occurring first. This

SAR visual cue allows for simplicity and was designed with the aim that the circle projection will


be clearer when projected upon a variety of different physical objects, as compared with SAR

visual cues that incorporate more complicated projections.

Figure 20 Circle Cue: Translation (the Initial State)

Figure 20 depicts the initial state of the physical object. The implementation of this initial state is

presented in Figure 21.

Figure 21 Circle Cue: Translation Implementation (the Initial State)


The only projection utilised is of a circle. This circle has a radius larger than the circle projection

used for the Translation Instruction and Rotation Instruction. The circle is displayed upon the

tracked physical object and at the desired final position to guide the user to move the physical

object to a position that corresponds with the translation coordinates of the object’s desired final

position. This is achieved in a similar fashion to the Translation Instruction; the user moves the

object so that the two circles are aligned. In the scenario presented by Figure 21, the user must

translate the object in a south westerly direction, so that the two projected circles are aligned. Once

the user has moved the physical object so that the translation is correct, the two circles cease to be

displayed.

Figure 22 Circle Cue: Rotation

Figure 22 depicts the state of the physical object once the initial translation has been completed and

rotation is now being performed. The implementation of this state is presented in Figure 23.


Figure 23 Circle Cue: Rotation Implementation

A variation of the Rotation Instruction is displayed, which consists of one circle in a different
colour, green. This green circle is displayed upon the physical object to guide the user to rotate the

physical object to a position that corresponds with the desired final rotation. This is achieved by the

user rotating the object until the green circle is no longer displayed. Once the user has moved the

physical object to a position that corresponds with the desired final translation and rotation, the

SAR visual cue will no longer be displayed, indicating that the physical object has been

successfully arranged.

The user is not given any more visual cues to aid the rotation other than the circle, and the

indication that the rotation is correct. This results in a simplistic and consistent projection being

displayed when utilising this SAR visual cue, but the reduced information conveyed is the trade-off for this
increased simplicity.

3.2.3.5 Wireframe Cue

The Wireframe Cue is similar to the Translation First Cue; however, the deviation is that two

wireframe projections are present, which are projected in grey. One wireframe projection is aligned

with the physical object and follows the physical object as it is moved. The other wireframe

projection is present at the desired final position. The existence of the wireframe projections may

assist the user in arranging the object. The wireframe projections convey both rotation and

translation information to the user, regardless of which state the arrangement process is in. The user


is aided by being able to align the wireframe projection following the physical object with the static

wireframe projection located at the desired final position, as when the two wireframe projections

are aligned, the physical object is successfully arranged.

Figure 24 Wireframe Cue: Translation (the Initial State)

Figure 24 depicts the initial state of the physical object. The implementation of this initial state is

presented in Figure 25.


Figure 25 Wireframe Cue: Translation Implementation (the Initial State)

As with the Translation First Cue described in Section 3.2.3.1, the Translation Instruction is

displayed and, in addition, the wireframe projections are displayed. In the scenario presented by

Figure 25, the user must translate the object in a south westerly direction. This can be achieved by

using the Translation Instruction, and can also be achieved by aligning the two wireframe

projections.

Figure 26 Wireframe Cue: Rotation


Figure 26 depicts the state of the physical object once the initial translation has been completed and

rotation is now being performed. The implementation of this state is presented in Figure 27.

Figure 27 Wireframe Cue: Rotation Implementation

As with the Translation First Cue described in Section 3.2.3.1, the Rotation Instruction is displayed

and, in addition, the wireframe projections remain present. In the scenario presented by Figure 27,

the user must rotate the object further clockwise. This can be achieved by using the Rotation

Instruction, and can also be achieved by aligning the two wireframe projections.

3.2.3.6 Square Cue

The Square Cue is similar to the Translation First Cue; however, the deviation is that two square

projections are present, which are projected in cyan. Thus, the Square Cue is the same as the

Wireframe Cue, but with square projections instead of wireframe projections. One square

projection is aligned with the centre of the top of the physical object and follows the physical

object as it is moved. The other square projection is present at the desired final position.

The existence of the square projections may assist the user in arranging the object. The square

projections convey translation information and some rotation information to the user, regardless of

which state the arrangement process is in. The user is aided by being able to align the square
projection following the physical object with the static square projection located at the desired final
position. When the user translates the physical object so that the two square projections are
aligned, the physical object is successfully translated; the rotation is then either correct or off by
90, 180 or 270 degrees, which can be corrected by the user when the
Rotation Instruction is displayed. This SAR visual cue allows for a degree of simplicity as the only

projections present at one time are the two square projections and one of the Translation Instruction

or the Rotation Instruction. It does not convey as much information as the Wireframe Cue

described in Section 3.2.3.5 as a more exact rotation can be gathered from the wireframe

projections, however the Square Cue is comparatively less cluttered. In comparison to the Circle

Cue described in Section 3.2.3.4, the Square Cue offers more information but is more cluttered and

less simplistic.
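The rotational ambiguity left by the Square Cue can be made explicit with a short sketch: because a square looks identical every 90 degrees, aligning the square projections constrains the rotation only up to a multiple of 90 degrees. The tolerance below is an illustrative assumption.

```python
def squares_appear_aligned(object_theta_deg, target_theta_deg, tolerance_deg=5.0):
    """True when the square outlines look aligned, even if the true rotation error
    is 90, 180 or 270 degrees (the square's rotational symmetry)."""
    diff = (target_theta_deg - object_theta_deg) % 90.0
    return min(diff, 90.0 - diff) <= tolerance_deg

# Example: an object left 90 degrees off still shows perfectly aligned squares.
print(squares_appear_aligned(0.0, 90.0))  # True
```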

Figure 28 Square Cue: Translation (the Initial State)

Figure 28 depicts the initial state of the physical object. The implementation of this initial state is

presented in Figure 29.


Figure 29 Square Cue: Translation Implementation (the Initial State)

As with the Translation First Cue described in Section 3.2.3.1, the Translation Instruction is

displayed and, in addition, the square projections are displayed. In the scenario presented by Figure

29, the user must translate the object in a south westerly direction, so that the circles are aligned.

This can be achieved by using the Translation Instruction, and can also be achieved by aligning the

two square projections.


Figure 30 Square Cue: Rotation

Figure 30 depicts the state of the physical object once the initial translation has been completed and

rotation is now being performed. The implementation of this state is presented in Figure 31.

Figure 31 Square Cue: Rotation Implementation

As with the Translation First Cue described in Section 3.2.3.1, the Rotation Instruction is displayed

and, in addition, the square projections remain present. In the scenario presented by Figure 31, the


user must rotate the object clockwise. This can be achieved by using the Rotation Instruction, and

may be aided by using the two square projections.

3.2.4 Discussion

Multiple SAR visual cues have been developed to provide variety and different advantages. Many

of the SAR visual cues undertake the object arrangement process outlined in Figure 6 with either

the translation or rotation occurring first. However, the Both Instructions Cue described in Section

3.2.3.3 displays both translation and rotation information at the same time. As stated, this allows

for more flexibility in the method that the user undertakes when arranging the object and may be

advantageous.

For example, a situation in which the Both Instructions Cue may be advantageous is if the user

must navigate a couch through a doorway in order to reach the desired final position. In this

situation, performing the translation first would be the most efficient method, as any rotation

applied to the couch may need to be undone if the couch must be rotated in a different way

to fit through the doorway. In this situation, the user can decide that the translation should be

undertaken first, and can choose to perform the translation followed by the rotation.

If the same example were undertaken with the Rotation First Cue described in Section 3.2.3.2, the

user would first rotate the couch, and then translate it through the doorway. In order to translate the

couch through the doorway, the couch must be rotated to a different position, undoing the rotation

already performed. Once the translation is correct, the Rotation Instruction is displayed once again.

Thus, the object will still be arranged to the desired destination position, but the rotation
undertaken first was not needed and the process was therefore less efficient. While the SAR visual cues

developed may all allow the physical object to be correctly arranged, some have advantages in

different situations.

The design of the SAR visual cues described has led to issues regarding the location of the

projections upon the physical object, in terms of the marker fixed to the physical object and the

distortion caused in several situations when utilising the SAR visual cues described. Projection

upon tracking markers is addressed in Section 3.2.4.1 and distortion is addressed in Section 3.2.4.2.

3.2.4.1 Projection upon Tracking Markers

The SAR visual cues developed have been implemented in several bright colours in the interest of

being easier to see. As the tracking framework utilised required the use of markers attached to the

physical object, the projection overlapped the markers frequently. This made portions of the SAR

visual cues difficult to see when projected upon the black regions of the markers. As this research focuses

on SAR visual cues for the arrangement of physical objects, there was no endeavour to consider


limitations caused by the tracking framework. The SAR visual cues were not designed according to

the limitations of the tracking framework utilised. In future work, a tracking framework could be

utilised that does not require markers to be located in positions where projections are made.

3.2.4.2 Distortion

When considering both the theoretical ideas of these SAR visual cues versus their implementations,

differences are evident. These fundamental differences are clear when viewing Figure 7 which

depicts a top down diagram of the Translation Instruction and Figure 14 which depicts the

implementation of the translation phase of the Translation First Cue (which is also the Translation Instruction
with no further augmentations). In the implementations of these visual cues, occlusion and distortion

are present.

The design of the visual cues has been undertaken with these fundamental SAR issues in mind. A

measure taken to limit the problems of occlusion and distortion is that the Translation Instruction

utilises a line between the two circles. In a situation where the circle projected at the desired final

translation cannot be seen, the user is able to follow the line and arrive at the desired final

translation.

Figure 32 SAR Visual Cue Distortion on the Side of the Physical Object

For example, when considering the translation scenario presented in Figure 32, the projection of

the circle at the desired final translation is distorted. If it is assumed that the user’s view of the


physical object is from the camera position, it is evident that the object must be moved south in

order to complete the Translation Instruction. However, if the user were viewing this SAR visual

cue at the same angle but from the opposite side of the physical object, the user would not see the

circle projection at the desired final translation. This distortion scenario has been addressed by the

utilisation of the line projection between the physical object’s current translation and the desired

final translation. The user is to move the physical object in the direction that the line is indicating.

As the physical object nears the desired final translation, the projection of the circle at the desired

final translation will be seen upon the top of the physical object, enabling the user to align the two

circles.

Another measure taken to limit the problems of occlusion and distortion is that the Rotation

Instruction utilises two lines from the centre point of the physical object. It can be imagined that

there is a vertical line through this point, which represents the axis around which the physical

object is rotated. Two lines are drawn from this centre point, and two circles that must be aligned;

one indicating the current rotation and another indicating the desired final rotation. In a situation

where one or more of the projected circles cannot be seen, the user is able to align the two lines and

still achieve the desired final rotation.

Figure 33 SAR Visual Cue Distortion from the Projector Position

When considering the rotation scenario presented in Figure 33, the projection of the circle

indicating the desired final rotation is distorted, in a manner that differs from the previous example.


The location of the projector causes the projection to be disjointed. This has been addressed by the

utilisation of line projections from the centre of the physical object to the circles that must be

aligned in order to achieve the desired final rotation. As the user can see a portion of these lines as

they stem from the centre point of the object, the user is to rotate the physical object so that the

lines overlap. As the physical object nears the desired final rotation, the projection of the circle at

the desired final rotation will be seen upon the top of the physical object, allowing the user to align

the two circles.


4 User Study

A user study was performed to determine:

the effectiveness, in terms of task time and accuracy, of different SAR visual cues versus a

manual method to instruct the arrangement of a physical object,

the intuitiveness of several different SAR visual cues, and

user opinion of the tasks undertaken.

This data was gathered to determine the effectiveness of the SAR visual cues developed, in

comparison to one another and in comparison to a manual object arrangement method. A system was
utilised in accordance with the configuration specified in Section 4.1. The hypothesis of the user

study is that there is an improvement in speed and accuracy when using the SAR visual cues in

comparison to the manual arrangement method. The participants were invited to utilise this system

to undertake physical object arrangement tasks. These tasks are outlined in Section 4.2 and form

part of the structure outlined in Section 4.3. Section 4.4 describes the data that was gathered from

the user study, as well as the comparisons made with this data. Also discussed is the gathering of

data from both the manual timer and SAR timer, as the use of two timers may cause confusion. The

results of this user study are presented in Section 5.

4.1 Configuration

The system consisted of the components described in Chapter 8; however, the top down

arrangement application was not utilised in the user study. One projector and one camera were

used. Figure 34 depicts the configuration of the components utilised for the user study.


Figure 34 User Study Setup

Projector calibration was undertaken such that a one millimetre distance along a physical model

corresponded with a one unit distance along the corresponding virtual model. Thus, a unit in the

projector coordinate space corresponded with a millimetre in the physical world.

The participants were asked to arrange the same tracked physical object; a foam box with the

dimensions of 220 millimetres (length), 100 millimetres (width) and 120 millimetres (height). The

physical object was tracked using the ARToolKitPlus framework (Wagner & Schmalstieg 2007).

4.2 Tasks Undertaken for the User Study

One of the tasks addressed physical object arrangement using a manual arrangement technique

which did not utilise SAR visual cues, and the remainder of the tasks addressed physical object

arrangement using SAR visual cues developed.


Figure 35 Projected Grid Utilised in Task 1

The desired final positions of the physical objects were randomly generated. For Task 1 (No SAR),

the desired final position was randomly selected from four positions located on a grid, with

each position being one hundred projector units apart. For the tasks that utilised SAR visual cues,

this was a randomly generated position within an area of six hundred by six hundred projector

units.
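A minimal sketch of how such target poses might be generated is given below, under the assumptions stated above: Task 1 draws one of four grid positions spaced one hundred projector units apart plus a rotation told to the participant, while the SAR tasks draw a pose anywhere within the six hundred by six hundred unit area. The specific coordinates and function names are illustrative, not those used in the study.

```python
import random

GRID_POSITIONS = [(100, 100), (200, 100), (100, 200), (200, 200)]  # interior square corners

def task1_target():
    """One of the four grid positions (100 units apart) plus a random rotation."""
    position = random.choice(GRID_POSITIONS)
    theta = random.uniform(0.0, 360.0)
    return position, theta

def sar_task_target():
    """A random pose within the 600 x 600 projector-unit working area."""
    x = random.uniform(0.0, 600.0)
    y = random.uniform(0.0, 600.0)
    theta = random.uniform(0.0, 360.0)
    return (x, y), theta
```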

Figure 36 The Grid used for Task 1 (left) and a Participant Undertaking Task 1 (right)

The tasks undertaken by each participant consisted of the following structure and in the following

order:

1. No SAR: A three by three grid was projected upon a ground plane, upon which the

participant was asked to place the centre point of the tracked physical object at one of four

positions that corresponded to points at the corners of the interior square of the grid. These

positions were located one hundred units apart, which corresponds to approximately one


hundred millimetres according to the calibration method utilised. The projected grid did

not show any numbering; rather the numbers and corresponding locations were explained

to the participant verbally and with a demonstration of the physical object placed at each of

the four possible positions. The position at which the participant was to arrange the
physical object was randomly generated, as was the angle to which the participant was to rotate
the object. For example, a participant may be told to place the physical object at position

2, at 45 degrees. A protractor was available for use if desired. The grid was utilised as the

specified points were known coordinates. This allowed for a comparison between the

achieved position of the tracked physical object and the known coordinates. Figure 35

depicts a diagram of the grid used. Figure 36 presents the implementation of the grid used,

along with a participant arranging a physical object using the grid.

2. Translation First: The participant arranged the physical object to a randomly generated

position using the Translation First Cue specified in Section 3.2.3.1.

3. Rotation First: The participant arranged the physical object to a randomly generated

position using the Rotation First Cue specified in Section 3.2.3.2.

4. Both Instructions Cue: The participant arranged the physical object to a randomly

generated position using the Both Instructions Cue specified in Section 3.2.3.3.

5. Circle Cue: The participant arranged the physical object to a randomly generated position

using the Circle Cue specified in Section 3.2.3.4.

6. Wireframe Cue: The participant arranged the physical object to a randomly generated

position using the Wireframe Cue specified in Section 3.2.3.5.

7. Wireframe Cue with No Destination: The participant arranged the physical object to a

randomly generated position using the Wireframe Cue specified in Section 3.2.3.5, with the

variation that the wireframe upon the desired final position was not shown.

8. Square Cue: The participant arranged the physical object to a randomly generated position

using the Square Cue specified in Section 3.2.3.6.

9. Square Cue with No Destination: The participant arranged the physical object to a

randomly generated position using the Square Cue specified in Section 3.2.3.6, with the

variation that the square upon the desired final position was not shown.

User study tasks were repeated if a problem occurred while the task was being undertaken. This happened on several occasions when participants inadvertently moved or removed the tracking markers.

4.3 Structure

The study was conducted with one participant at a time. Each participant undertook the following

procedure:

1. The participant read the information sheet containing information of the study.

2. Any questions that the participant had regarding the study were addressed.

3. The participant read and signed the consent form.

4. The participant completed a survey designed to gather information about the participant,

such as age and their AR experience level.

5. The participant completed a survey to gather information on the intuitiveness of the SAR

visual cues.

6. The participant completed nine tasks adhering to the structure and order described. The

tasks were not timed during the first run through. The participant was given an explanation

of each task and encouraged to ask questions.

7. The participant completed the nine tasks again, still adhering to the structure and order

described. The tasks were timed during the second run through. The participant activated a

manual timer before undertaking each task, and stopped the timer once the task was

completed. The timer was activated by the participant pushing a key on a keyboard located

on the same surface as the physical object to be arranged. The physical object was

collected by the participant from a consistent location after the timer was activated.

8. The participant completed a survey to gather their opinions of the tasks they had

undertaken.

The participant was able to indicate that they would like a break at any time during the user study.

The study took place within the School of Computer and Information Science at the University of

South Australia.

4.4 Data Gathering

A range of both quantitative and qualitative data was gathered from the user study. This section provides details of the data gathered, as well as the comparisons that were made with that data.

4.4.1 Quantitative Data Gathering

With regards to quantitative data, the user study allowed for the gathering of data pertaining to:

the time that the task took to complete, using a manual timer (for all tasks),

the time that the task took to complete, using a SAR timer (for Task 2 to Task 9),

the randomly generated desired final position of the physical object, for all tasks, which

consisted of rotation and translation values, and

the achieved position of the physical object once the task was complete, for all tasks, which

consisted of rotation and translation values.

The quantitative data gathered allows for comparisons between:

the desired final x position and the achieved final x position of the physical object (x axis

accuracy),

the desired final y position and the achieved final y position of the physical object (y axis

accuracy),

the desired final rotation position and the achieved final rotation position of the physical

object (rotation accuracy),

the time taken to complete each task using a manual timer (task completion time), and

the time taken to complete each SAR visual cue task (Task 2 to Task 9) using a SAR timer

(SAR visual cue task completion time).

4.4.2 Qualitative Data Gathering

With regards to qualitative data, the intuitiveness survey undertaken as part of the user study

allowed for the gathering of data pertaining to:

the participant’s attempt to determine the outcome of a diagram depicting a physical object

with a SAR visual cue instructing a translation, according to the translation instruction

described in Section 3.2.1,

the participant’s attempt to determine the outcome of a diagram depicting a physical object

with a SAR visual cue instructing a 45 degree rotation, according to the rotation instruction

described in Section 3.2.2,

the participant’s attempt to determine the outcome of a diagram depicting a physical object

with a SAR visual cue instructing a translation, according to the translation instruction

described in the Circle Cue description in Section 3.2.3, and

the participant’s attempt to determine the outcome of a diagram depicting a physical object

with a SAR visual cue instructing a 45 degree rotation, according to the rotation instruction

described in the Circle Cue description in Section 3.2.3.

Also with regards to qualitative data, the user opinion survey undertaken as part of the user study

allowed for the gathering of data pertaining to:

the participant’s preference between Task 1 (No SAR) and Task 2 (Translation First Cue),

the participant’s preference between Task 2 (Translation First Cue), Task 3 (Rotation First

Cue) and Task 4 (Both Instructions Cue),

the participant’s preference between Task 2 (Translation First Cue) and Task 5 (Circle

Cue),

the participant’s preference between Task 6 (Wireframe Cue) and Task 7 (Wireframe Cue

with No Destination),

the participant’s preference between Task 8 (Square Cue) and Task 9 (Square Cue with No

Destination),

the participant’s preference between the Wireframe tasks (Task 6 and Task 7) and the

Square tasks (Task 8 and Task 9), and

the participant’s preference between the tasks that utilised the translation instruction as

described in Section 3.2.1 and the rotation instruction as described in Section 3.2.2 (Task 2

to Task 4, and Task 6 to Task 9).

Appendix A presents the surveys given to the participants to gather this qualitative data.

4.4.3 Manual Timer and SAR Timer Data Gathering

Data was recorded for all tasks using a manual timer. Data was also recorded with a SAR timer for

all tasks that utilised SAR visual cues. Thus, Task 1 (No SAR) was not timed with the SAR timer

and all other tasks were timed with the SAR timer. The manual timer was required for the

comparison of the task completion times of the tasks that utilise SAR visual cues with Task 1 (No

SAR) which did not utilise SAR visual cues. The SAR timer was utilised as it allows for more

accurate comparisons between the SAR visual cue tasks as less human error is involved.

The manual timer was implemented within the SAR module developed, which is further discussed

in Section 9.3.1. As stated, the manual timer was activated by the user before each task was

undertaken, and stopped once the task was completed. The manual timer was controlled by the user

pushing keys on a keyboard located on the same surface as the physical object to be arranged. To

activate the manual timer, the user pressed the “1” key. To stop the manual timer, the user pressed

the “2” key. Thus, for all tasks during the second run through, the following process was

undertaken:

1. The user presses “1” to activate the timer.

2. The user retrieves the physical object from a consistent location.

3. The user undertakes the arrangement task, leaving the object in an arranged position.

4. The user presses “2” to stop the timer.

The SAR timer was activated within the SAR system when the user brought the physical object

towards the centre of the table, within the range of the camera, allowing the marker to be

recognised by the SAR system. The SAR timer was stopped within the SAR system when the

object had been successfully arranged. If the physical object was moved again, after being

successfully arranged, the timer would resume timing. Once the user had completed the

arrangement task, the SAR timer value was recorded and reset to zero in preparation for the next

task.
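A minimal sketch of this timing behaviour follows; it uses Python's time module and assumes hypothetical marker_visible and is_arranged flags supplied by the tracking layer, rather than reproducing the actual SAR module implementation.

import time

class SARTaskTimer:
    # Accumulates time while the tracked object is recognised but not yet correctly arranged.
    def __init__(self):
        self.elapsed = 0.0
        self.started_at = None  # None while the timer is paused

    def update(self, marker_visible, is_arranged):
        # marker_visible and is_arranged are hypothetical flags from the tracking layer.
        running = marker_visible and not is_arranged
        if running and self.started_at is None:
            self.started_at = time.time()   # marker recognised: start, or resume after the object is moved again
        elif not running and self.started_at is not None:
            self.elapsed += time.time() - self.started_at
            self.started_at = None          # pause once the object is arranged (or the marker is no longer recognised)

    def record_and_reset(self):
        # Called once the arrangement task is complete; returns the value and resets to zero.
        value = self.elapsed
        self.elapsed, self.started_at = 0.0, None
        return value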

Different users may operate the manual timer at different speeds, and a single user may also operate the timer at varying speeds across the tasks that they perform. Utilising a SAR timer decreased the opportunity for timer values to vary as a result of human error and other human factors.

As the two timers gathered different data in different manners, it is best to view them standalone,

and not draw comparisons between manual timer results and SAR timer results. It is intended that

the SAR timer results give a more accurate indication of the task time difference between the

different SAR visual cues. It is intended that the manual timer results give an indication of the how

the manual arrangement task time varies compared to the task completion times of the SAR visual

cues, when values are gathered in the same manner.

5 Results

This section presents results gathered from the user study described in Section 4. Information

collected about the participants, including the average age of the participants and their previous

experience with AR and SAR systems is presented in Section 5.1. Quantitative data such as the

manual and SAR timer results and accuracy results are presented in Section 5.2, whereas

qualitative data such as the intuitiveness survey and user opinion survey results are presented in

Section 5.3. A discussion of the results is then provided in Section 5.4, including a summation of the

major trends in the results gathered. This discussion also includes clarification of the manual timer

and SAR timer results. Appendix C presents the data gathered to form these results.

5.1 User Information

The study involved twenty participants, consisting of nineteen males and one female. The average

age of the participants was 24.85 years, with a standard deviation of 4.475 years. The participants

were asked to rate their experience level with any type of AR system. Ten participants (or fifty percent) stated they had no experience with AR systems, whilst seven participants

(or thirty-five percent) claimed they had some experience and three participants (or fifteen percent)

claimed they were experienced in using AR systems. Participants were also queried on whether

they had interacted with a SAR system previously. Fourteen participants (or seventy percent) stated they had never interacted with a SAR system, whilst four participants (or twenty

percent) claimed they had interacted with a SAR system and two participants (or ten percent) were

unsure.

5.2 Quantitative Data

One of the aims of the user study undertaken was to determine:

the effectiveness, in terms of task time and accuracy, of different SAR visual cues versus a

manual method to instruct the arrangement of a physical object.

Task time was determined using a manual timer for all tasks and a SAR timer for all tasks that

utilise SAR visual cues. Accuracy was determined by gathering x axis, y axis and rotation values

for the desired final position and the achieved final position of the physical object after each arrangement

task.
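For clarity, the sketch below shows how the variation between a desired pose and an achieved pose can be computed from these values; the wrap-around handling of the angular difference is an assumption about how rotation variation was treated, and the example values are illustrative only.

def pose_variation(desired, achieved):
    # Each pose is a tuple (x, y, rotation_in_degrees).
    dx = abs(desired[0] - achieved[0])          # x axis variation, in projector units
    dy = abs(desired[1] - achieved[1])          # y axis variation, in projector units
    dr = abs(desired[2] - achieved[2]) % 360.0  # rotation variation, in degrees
    if dr > 180.0:
        dr = 360.0 - dr                         # assumed wrap-around to the smaller angular difference
    return dx, dy, dr

# Example: an object placed 5 units away along x and 3 degrees short of the target rotation.
print(pose_variation((200.0, 150.0, 45.0), (205.0, 150.0, 42.0)))  # prints (5.0, 0.0, 3.0)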

5.2.1 Task Completion Times

For all tasks performed, values gathered with a manual timer will be presented. In addition, for the

tasks that utilise SAR visual cues, values gathered with a SAR timer will be presented. As Task 1

(No SAR) did not utilise a SAR visual cue, no SAR timer values were gathered for that task. Figure

37 compares task completion times gathered from the manual timer and the SAR timer.

Figure 37 Average Completion Time for all User Study Tasks as Timed with the Manual Timer and the SAR Timer

When considering the manual timer results, Task 9 (Square Cue with No Destination) took the

shortest time to complete, with an average completion time of 17.776 seconds and a standard deviation

of 8.679 seconds. Task 5 (Circle Cue) took the longest to complete, with an average completion

time of 41.154 seconds and a standard deviation of 25.229 seconds. The average completion time

of Task 1 (No SAR) was 32.348 seconds with a standard deviation of 13.773 seconds, making it

the second longest task to complete.

A paired t-test comparing the manual timer values for Task 1 (No SAR) with those for the best performing SAR task (Task 9: Square Cue with No Destination) was performed. As a result, a significant effect on

completion time (p < 0.05) can be observed, indicating that Task 9 was significantly faster than

Task 1.
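The test can be reproduced in outline with SciPy's paired-samples t-test, as sketched below; the per-participant completion times are placeholder values, not the data gathered in the study.

from scipy import stats

# Placeholder per-participant manual timer completion times, in seconds.
task_1_times = [31.2, 35.4, 28.9, 40.1, 30.5, 33.8]   # Task 1 (No SAR), illustrative values only
task_9_times = [18.4, 16.2, 20.1, 17.5, 15.9, 19.3]   # Task 9 (Square Cue with No Destination)

t_statistic, p_value = stats.ttest_rel(task_1_times, task_9_times)
if p_value < 0.05:
    print("Significant effect on completion time (p = %.4f)" % p_value)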

The notable difference between the SAR timer values and the manual timer values is that, according to the SAR timer, Task 8 (Square Cue) was completed faster on average than Task 3 (Rotation First), Task 6 (Wireframe Cue) and Task 7 (Wireframe Cue with No Destination). Task 9 (Square Cue with No Destination) remains the fastest completed task. A discussion of the manual timer results and the SAR timer results is presented in Section 5.4.1.

5.2.2 Accuracy

For all tasks performed, values gathered regarding the accuracy along each axis (the x axis and the y axis) and the rotation accuracy will be presented. The differences between the desired final translation position and the achieved final translation position along each axis were calculated and are presented in Figure 38.

Figure 38 Average x and y Axis Variation for all User Study Tasks

Task 1 (No SAR) had the least accurate x axis translation: the achieved position diverged from the

desired final position by an average of 16.955 units with a standard deviation of 22.708 units. This

was followed by Task 2 (Translation First) which had an average of 5.929 units of variation with a

standard deviation of 4.251. The most accurate task with regards to x axis translation was Task 4

(Both Instructions Cue) which had an average of 3.628 units of variation with a standard deviation

of 2.195.

The average x axis variation of all SAR visual cue tasks is 4.724 units with a standard deviation of

1.142 units, whereas the average x axis translation variation for Task 1 (No SAR) is 16.955 units

with a standard deviation of 22.708 units, a higher variation.

It can be stated with 95% confidence that the translation variation along the x axis when arranging

a physical object using the developed SAR visual cues will occur between 4.223 units and 5.224

units. It can also be stated with 95% confidence that the translation variation along the x axis for

Task 1 (No SAR) will occur within a higher range; between 7.003 units and 26.907 units.
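Such an interval can be computed in outline as shown below, assuming a t-distribution interval on the mean of the observed variations; the sample values are placeholders rather than the study data.

from scipy import stats

# Placeholder x axis variations, in projector units, for one condition.
variations = [4.1, 5.2, 3.9, 6.0, 4.7, 5.5, 4.3, 4.9]

mean = sum(variations) / len(variations)
sem = stats.sem(variations)  # standard error of the mean
low, high = stats.t.interval(0.95, len(variations) - 1, loc=mean, scale=sem)
print("95%% confidence interval: %.3f to %.3f units" % (low, high))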

As with the x axis translation, Task 1 (No SAR) also had the least accurate y axis

translation: the achieved position diverged from the desired final position by an average of 32.981

units with a standard deviation of 11.899 units. This was followed by Task 9 (Square Cue with No

Destination) which had an average of 5.290 units of variation with a standard deviation of 3.991

units. The most accurate task with regards to y axis translation was Task 3 (Rotation First) which

had an average of 4.260 units of variation with a standard deviation of 2.977 units.

The average translation variation along the y axis when arranging a physical object using the SAR

visual cues is 4.776 units with a standard deviation of 1.167 units, whereas the average y axis

translation variation for Task 1 (No SAR) is 32.981 units with a standard deviation of 11.899 units,

a higher variation.

It can be stated with 95% confidence that the translation variation along the y axis when arranging

a physical object using the developed SAR visual cues will occur between 4.265 units and 5.288

units. It can also be stated with 95% confidence that the translation variation along the y axis for

Task 1 (No SAR) will occur within a higher range; between 27.767 units and 38.196 units.

Rotation accuracy is now presented. The differences between the desired final rotation position and

the achieved final rotation position were calculated. Figure 39 shows the average variation of the

rotation for each task.

Figure 39 Average Rotation Variation of all User Study Tasks

Task 1 (No SAR) also had the least accurate rotation: the achieved final position diverged from the

desired final position by an average of 7.467 degrees with a standard deviation of 9.208 degrees.

This was followed by Task 2 (Translation First) which had an average of 6.903 degrees of variation

with a standard deviation of 3.299 degrees. The most accurate task with regards to rotation was

Task 8 (Square Cue) which had an average of 3.319 degrees of variation with a standard deviation

of 2.368 degrees.

The average rotation variation when arranging a physical object using the developed SAR visual

cues is 5.280 degrees with a standard deviation of 0.899 degrees, whereas the average rotation

variation for Task 1 (No SAR) is 7.467 degrees with a standard deviation of 9.208 degrees, a higher

variation.

It can be stated with 95% confidence that the rotation variation when arranging a physical object

using the developed SAR visual cues will occur between 4.886 degrees and 5.675 degrees. It can

also be stated with 95% confidence that the rotation variation for Task 1 (No SAR) will occur

between 3.432 degrees and 11.502 degrees. The confidence intervals indicate that the variation for Task 1 (No SAR) is likely to occur within a higher range; many occurrences may fall between 5.675 degrees and 11.502 degrees, which is above the range indicated for the SAR visual cues.

A paired t-test comparing the rotation variation for Task 1 (No SAR) with that of the SAR task with the most accurate rotation (Task 8: Square Cue) was performed. As a result, a significant effect on rotation

accuracy (p < 0.05) can be observed, indicating that Task 8 was significantly more accurate than

Task 1.

5.3 Qualitative Data

The remaining aims of the user study undertaken were to determine:

the intuitiveness of several different SAR visual cues, and

user opinion of the tasks undertaken.

The intuitiveness of several different SAR visual cues was determined by a written activity

preceding the arrangement tasks, and the user’s opinion of the tasks undertaken was gathered upon

the participant’s completion of the user study.

5.3.1 Intuitiveness Survey

Results of the intuitiveness of the Translation Instruction described in Section 3.2.1 are presented

first.

Figure 40 Translation Instruction Intuitiveness Diagram

Participants were presented with written instructions and the intuitiveness diagram depicted in

Figure 40 as part of the survey presented in Appendix A. Sixteen participants (or eighty percent)

correctly identified the translation instruction being performed upon the physical object. The

remaining four participants (or twenty percent) were incorrect. Three of the participants who

answered incorrectly stated that the SAR visual cue was instructing both a rotation and a

translation. The other participant who answered incorrectly stated that the SAR visual cue was

instructing a translation, but to an incorrect location.

Figure 41 Results of the Intuitiveness Survey Regarding the Translation Instruction

Figure 41 shows a graphical depiction of the number of participants who were correct and the

number of participants who were incorrect when undertaking this intuitiveness task regarding the

Translation Instruction.

Results of the intuitiveness of the Rotation Instruction described in Section 3.2.2 are now

presented.

Figure 42 Rotation Instruction Intuitiveness Diagram

Participants were presented with written instructions and the intuitiveness diagram depicted in

Figure 42 as part of the survey presented in Appendix A. Fourteen participants (or seventy percent)

correctly identified the rotation instruction being performed upon the physical object. The

remaining six participants (or thirty percent) were incorrect. Three of the participants who

answered incorrectly thought the SAR visual cue was instructing both a rotation and a translation –

one of these participants had the correct rotation but had translated the object also. Two of the

participants who answered incorrectly stated that the SAR visual cue was instructing a rotation of

forty-five degrees, but in the incorrect direction. The other participant who answered incorrectly

stated that the SAR visual cue was instructing a translation.

Figure 43 Results of the Intuitiveness Survey Regarding the Rotation Instruction

Figure 43 shows a graphical depiction of the number of participants who were correct and the

number of participants who were incorrect when undertaking this intuitiveness task regarding the

Rotation Instruction.

In addition to gathering data on the intuitiveness of the translation instruction described in Section

3.2.1 and the rotation instruction described in Section 3.2.2, data regarding the intuitiveness of the

modified translation and rotation instructions utilised for the Circle Cue was also gathered. Results

of the intuitiveness of the Circle Cue translation instruction described in Section 3.2.3 are now

presented.

Figure 44 Circle Cue Translation Instruction Intuitiveness Diagram

Participants were presented with written instructions and the intuitiveness diagram depicted in

Figure 44 as part of the survey presented in Appendix A. Eighteen participants (or ninety percent)

correctly identified the translation instruction being performed upon the physical object. The

remaining two participants (or ten percent) were incorrect; both thought the SAR visual cue was instructing a rotation. These two participants

indicated the same rotation whereby the physical object would be rotated so that one of the ends of

it would point in the direction of the circle not projected upon the object.

Figure 45 Results of the Intuitiveness Survey Regarding the Circle Cue Translation

Figure 45 shows a graphical depiction of the number of participants who were correct and the

number of participants who were incorrect when undertaking this intuitiveness task regarding the

Circle Cue translation instruction.

Results of the intuitiveness of the Circle Cue rotation instruction described in Section 3.2.3 are now

presented.

Figure 46 Circle Cue Rotation Instruction Intuitiveness Diagram

Participants were presented with written instructions and the intuitiveness diagram depicted in

Figure 46 as part of the survey presented in Appendix A. One participant (or five percent) correctly

identified the rotation instruction being performed upon the physical object. The remaining

nineteen participants (or ninety-five percent) were incorrect. Of the nineteen participants who

answered incorrectly, one thought that the SAR visual cue was instructing a rotation, but of 90

degrees, and the remaining participants thought that the SAR visual cue was instructing no

movement, and thus the physical object should remain in an unchanged position.

Figure 47 Results of the Intuitiveness Survey Regarding the Circle Cue Rotation

Figure 47 shows a graphical depiction of the number of participants who were correct and the

number of participants who were incorrect when undertaking this intuitiveness task regarding the

Circle Cue rotation instruction.

5.3.2 User Opinion Survey

Upon completion of the tasks, participants were asked to complete a series of questions gauging

their opinions of the different SAR visual cues in the tasks undertaken; results of this survey are

now presented.

When asked which of the tasks was better, out of Task 1 (No SAR) and Task 2 (Translation First),

nineteen participants (or ninety-five percent) claimed to prefer Task 2. Figure 48 shows a graphical

depiction of the preferences of the participants.

Figure 48 User Preference Between a Manual Object Arrangement and an Object Arrangement Using SAR

When asked which of the tasks was better, out of Task 2 (Translation First), Task 3 (Rotation First)

and Task 4 (Both Instructions Cue), four participants (or twenty percent) claimed to prefer Task 2,

one participant (or five percent) claimed to prefer Task 3, and fourteen participants (or seventy

percent) claimed to prefer Task 4. The remaining participant had no preference, claiming that Task

2 and Task 3 allowed the user to be stepped through the object arrangement process when unfamiliar with it, but that once familiarity with the arrangement process is gained, the tasks can be completed faster when rotation and translation instructions are both visible

as in Task 4. Figure 49 shows a graphical depiction of the preferences of the participants.

Figure 49 User Preference Between Translation First, Rotation First and Both Instructions Cue

When asked which of the tasks was better, out of Task 2 (Translation First) and Task 5 (Circle

Cue), nineteen participants (or ninety-five percent) claimed to prefer Task 2. The remaining

participant had no preference, and claimed that Task 2 and Task 5 were as good as each other.

Figure 50 shows a graphical depiction of the preferences of the participants.

Figure 50 User Preference Between Translation First and Circle Cue

When asked which of the tasks was better, out of Task 6 (Wireframe Cue) and Task 7 (Wireframe

Cue with No Destination), twelve participants (or sixty percent) claimed to prefer Task 6 and five

participants (or twenty-five percent) claimed to prefer Task 7. The remaining three participants had

no preference. One of the participants who claimed that they had no preference stated that the

wireframe was difficult to see in both tasks and this was a factor in the participant having no

preference. Another of the participants who claimed that they had no preference stated that they did

not utilise the wireframe during the arrangement process. Rather, this participant claimed that they

used only the translation instruction as described in Section 3.2.1 and rotation instruction as

described in Section 3.2.2 that were also projected upon the physical object. Figure 51 shows a

graphical depiction of the preferences of the participants.

Figure 51 User Preference Between Wireframe Cue and Wireframe Cue with No Destination

When asked which of the tasks was better, out of Task 8 (Square Cue) and Task 9 (Square Cue

with No Destination), eighteen participants claimed to prefer Task 8, and one participant claimed to

prefer Task 9. The remaining participant had no preference. Figure 52 shows a graphical depiction

of the preferences of the participants.

Figure 52 User Preference Between Square Cue (Task 8) and Square Cue with No Destination (Task 9)

The participants were also asked which of the groups of tasks they preferred out of the tasks that

utilised a wireframe projection (Task 6 and Task 7) and the tasks that utilised a square projection

(Task 8 and Task 9). Fifteen of the participants claimed to prefer the tasks that utilised a square

projection and five of the participants claimed to prefer the tasks that utilised the wireframe

projection. Figure 53 shows a graphical depiction of the preferences of the participants.

Figure 53 User Preference Between Square Cues (Task 8 and Task 9) and Wireframe Cues (Task 6 and Task 7)

Finally, the users were asked for their preference of the tasks that utilised the translation instruction

as described in Section 3.2.1 and the rotation instruction as described in Section 3.2.2. Thus, Task 1

(No SAR) and Task 5 (Circle Cue) were omitted from this question. This question determined that:

one participant (or five percent) claimed to prefer Task 2 (Translation First),

no participants (or zero percent) claimed to prefer Task 3 (Rotation First),

six participants (or thirty percent) claimed to prefer Task 4 (Both Instructions Cue),

two participants (or ten percent) claimed to prefer Task 6 (Wireframe Cue),

no participants (or zero percent) claimed to prefer Task 7 (Wireframe Cue with No

Destination),

seven participants (or thirty-five percent) claimed to prefer Task 8 (Square Cue), and

no participants (or zero percent) claimed to prefer Task 9 (Square Cue with No

Destination).

The remaining four participants indicated that they had no clear preference of the SAR visual cues

encountered in the tasks undertaken. One of the remaining participants claimed that they preferred

both Task 4 (Both Instructions Cue) and Task 6 (Wireframe Cue) as they both allowed for natural

movement. Another of the remaining participants had a similar claim. They claimed that they

would have preferred a combination of Task 4 (Both Instructions Cue) and Task 7 (Wireframe Cue

with No Destination) as they felt it would provide a generic approach that is applicable in a range

of situations.

Figure 54 User Preference of Visual Cues that Utilise the Translation and Rotation Instructions

Another of the remaining participants claimed that they would have preferred a combination of

Task 4 (Both Instructions Cue) and Task 8 (Square Cue). Another of the remaining participants had

a similar claim. They claimed that they preferred Task 4 (Both Instructions Cue) as they felt it

would be applicable to a wider variety of situations, such as the arrangement of round physical

objects; however they also preferred Task 8 (Square Cue) in the tasks undertaken during the user

study. Figure 54 shows a graphical depiction of the preferences of the participants.

5.4 Discussion

When considering the manual timer results, Task 9 (Square Cue with No Destination) was the

quickest task completed. Since Task 9 was also the last task undertaken, it could be suggested that this quick task time is attributable to a learning effect. However, the manual timer results of the remaining tasks do not conform to this trend, and there is no trend in the accuracy values that suggests a learning effect.

When further considering the manual timer results, all of the tasks that utilised SAR visual cues

were quicker than Task 1 (No SAR) except for Task 5 (Circle Cue). Unlike the remainder of the

SAR visual cues, the Circle Cue implements variations of the Translation and Rotation Instructions

described in Section 3.2.1 and Section 3.2.2. Thus, it can be stated that all of the cues that utilised

the Translation and Rotation Instructions as described in Section 3.2.1 and Section 3.2.2 were quicker than Task 1 (No SAR). In addition, a paired t-test determined that Task 9 (Square Cue with No Destination) was significantly faster than Task 1.

In addition, Task 1 (No SAR) also resulted in the least accurate physical object arrangements. This

inaccuracy was present when considering all accuracy data gathered: the x axis accuracy, y axis

accuracy and the rotation accuracy.

When considering the intuitiveness of the Translation Instruction described in Section 3.2.1 and the

Rotation Instruction described in Section 3.2.2, the majority of the participants were able to

correctly identify both of the instructions when presented with a diagram of each. However, when

considering the intuitiveness of the translation and rotation instructions that were utilised for the

Circle Cue, the majority of the participants could identify the translation when presented with a

diagram, but only one participant was able to identify the rotation when presented with a diagram.

When considering the results of the user opinion survey undertaken, Task 2 (Translation First Cue)

was rated favourably when compared with tasks in which the arrangement method varied, such as

the manual arrangement in Task 1 (No SAR) and Task 5 (Circle Cue), as the Circle Cue implemented a different translation and rotation instruction than the rest of the SAR visual cues. However,

Task 4 (Both Instructions Cue) was preferred over Task 2 (Translation First Cue). The SAR visual

cues which utilised square projections were also rated favourably, and out of these, Task 8 (Square

Cue) was preferred.

In summation, the SAR visual cues that implement the Translation Instruction and Rotation

Instruction provide a quicker arrangement than a manual arrangement method, and the SAR visual

cues provide a more accurate arrangement than a manual arrangement method. The Translation

Instruction and Rotation Instruction were more intuitive than variations of these that were utilised.

The Both Instructions Cue and Square Cue were considered favourably. From these results, a

recommendation for either the Both Instructions Cue or the Square Cue can be made. In addition to

receiving a favourable user opinion, they are consistent with the advantages found with the results;

they are both SAR visual cues that implement the Translation and Rotation Instructions and thus

are quicker and more accurate than a manual arrangement, as well as being more intuitive. The

hypothesis of the user study is validated as there is an improvement in speed and accuracy when

comparing the SAR visual cues with the manual arrangement method.

5.4.1 Notes on the Manual Timer and SAR Timer Results

Data was gathered from both a manual timer and a SAR timer. The gathering of this data was

described in Section 4.4.3. The manual timer results were higher than the corresponding SAR timer

results gathered. To explain this, the steps of undertaking a task in the user study are revisited:

1. The user presses “1” to activate the timer.

2. The user retrieves the physical object from a consistent location.

3. The user undertakes the arrangement task, leaving the object in an arranged position.

4. The user presses “2” to stop the timer.

With regards to this process, the manual timer is activated when the user presses “1”, but the SAR

timer is activated later, when the user brings the physical object to the centre of the table. Similarly,

with regards to this process, the manual timer is stopped when the user presses “2”, but the SAR

timer is stopped earlier, when the physical object is successfully arranged. As the manual timer is

activated earlier and stopped later, it is evident that this is a reason why the manual timer values are

larger.

However, the manual timer values are not larger by a consistent amount. For example, when considering the

SAR timer values, Task 8 (Square Cue) is performed quicker than Task 3 (Rotation First Cue),

Task 6 (Wireframe Cue) and Task 7 (Wireframe Cue with No Destination). However, when

considering the manual timer values for these cues, Task 8 (Square Cue) is performed slower than

the other specified tasks. This is attributed to human factors. As stated in Section 4.4.3, a user may

operate the timer at varying speeds for each task that they performed.

To reiterate, as the two timers gathered different data in different manners, it is best to view them

standalone, and not draw comparisons between manual timer results and SAR timer results. It is

intended that the SAR timer results give a more accurate indication of the task time difference

between the different SAR visual cues. It is intended that the manual timer results give an

indication of how the manual arrangement task time varies compared to the task completion

times of the SAR visual cues when values are gathered in the same manner.

6 Revised Visual Cue

A revised visual cue was developed with consideration to the results presented in Section 5.

Analysis of the user study results indicated several trends which were taken into account when

developing the revised visual cue. The analysis performed is detailed in Section 6.1. Details of the

methodology and implementation undertaken to develop the Revised Visual Cue are presented in

Section 6.2. A discussion of the Revised Visual Cue is presented in Section 6.3, including reasons for its

particular implementation. A pilot study was then undertaken to gather preliminary results

regarding the revised SAR visual cue. A description of this pilot study and the results gathered are

presented in Section 7.

6.1 Analysis

A revised SAR visual cue was developed by considering the:

manual timer results and the SAR timer results: Task 9 (Square Cue with No Destination)

was the quickest completed task when assessed with both the manual timer and the SAR

timer. Task 8 (Square Cue) was the second quickest completed task when assessed with the

SAR timer. All SAR visual cues that utilised the Translation Instruction and Rotation

Instruction were completed quicker than Task 1 (No SAR),

accuracy results: Task 4 (Both Instructions Cue) had the most accurate x axis translation.

Task 3 (Rotation First Cue) had the most accurate y axis translation. Task 8 (Square Cue)

was completed with the most accurate rotation,

intuitiveness results: The translation instruction described in Section 3.2.1 was more

intuitive than the Circle Cue translation as described in Section 3.2.3. Similarly, the

rotation instruction described in Section 3.2.2 was more intuitive than the Circle Cue

rotation as described in Section 3.2.3, and

user opinion survey results: SAR visual cues are preferred over a manual arrangement.

Participants preferred Task 4 (Both Instructions Cue) over SAR visual cues that displayed

either a translation instruction or rotation instruction at any one time. Participants preferred

the SAR visual cues which utilised the Translation Instruction as described in Section 3.2.1

and the Rotation Instruction as described in Section 3.2.2, over the Circle Cue which

utilised variations of those instructions as described in Section 3.2.3. There was a

preference for SAR visual cues that displayed a projection upon the physical object and the

same projection at the destination position. There was a seventy five percent preference for

the SAR visual cues that utilised the Square projection over SAR visual cues that utilised

the wireframe projections. Task 4 (Both Instructions Cue) and Task 8 (Square Cue) utilised

the most preferred visual and second most preferred SAR visual cue.

Thus, it was determined that the revised SAR visual cue should build upon elements of the Square

Cue, the Both Instructions Cue, the Translation Instruction as described in Section 3.2.1 and the

Rotation Instruction as described in Section 3.2.2.

6.2 Methodology and Implementation

The Translation Instruction is utilised along with a variation of the Rotation Instruction and two

projections of a revised shape. One shape projection is aligned with the centre of the top of the

physical object and follows the physical object as it is moved. The other shape projection is present

at the desired final position. The SAR visual cue follows the process described by Figure 6 with the

Translation Instruction occurring first.

Figure 55 Revised Visual Cue: Translation

Figure 55 depicts the initial state of the physical object. The implementation of this initial state is

presented in Figure 56.

Figure 56 Revised Visual Cue: Translation Implementation

As with the Translation First Cue described in Section 3.2.3.1, the Translation Instruction is

displayed and, in addition, the revised shape projections are displayed. The Translation Instruction

will guide the user to move the physical object to a position that corresponds with the desired final

translation. In the scenario presented by Figure 56, the user must translate the object in a south

westerly direction, so that the circles are aligned. This can be achieved by following the line

projections. It can also be achieved by aligning the two shape projections. Once the user has moved

the physical object so that the translation is correct, the Translation Instruction ceases to be

displayed.

Figure 57 depicts, from a top down view, the process of undertaking a translation using the Revised

Visual Cue. The tracked physical object is represented by a solid shape whilst the desired final

position of the object is represented by a dotted shape.

Figure 57 Revised Visual Cue: Translation

Upon completion of the scenario depicted in Figure 57, the tracked physical object would be

successfully arranged as the revised shape is aligned and thus both the translation and rotation have

been completed successfully. However, if the user has arranged the tracked physical object at a

position with a correct translation but an incorrect rotation, a rotation instruction is displayed.

Figure 58 Revised Visual Cue: Rotation

Figure 58 depicts the state of the physical object once the initial translation has been completed and

rotation is now being performed. The implementation of this state is presented in Figure 59.

Figure 59 Revised Visual Cue: Rotation Implementation

A variation of the Rotation Instruction is now displayed to prompt the user to rotate the physical

object; an arrow is displayed at a similar distance from the centre of the object as the circles

displayed for the Rotation Instruction described in Section 3.2.2. The arrow indicates the direction

that the user must rotate the physical object in order to achieve a position that corresponds with the

desired final rotation. The current rotation follows the x axis of the tracked physical object’s local

coordinate system. If it is imagined that the destination rotation is at zero, then when the object is

rotated to any degree between 0 and 180, the arrow will point left towards zero. If the object is

rotated past 180, to 270 degrees, for example, the arrow will instead be inverted, in which case it

will point right towards zero.
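The direction of the arrow can be derived from the signed shortest rotation between the current rotation and the desired final rotation, as sketched below; the function name and the sign convention are illustrative rather than the module's actual interface.

def rotation_arrow_offset(current_degrees, target_degrees):
    # Signed shortest rotation, in degrees, from the current rotation to the desired final rotation.
    offset = (target_degrees - current_degrees) % 360.0
    if offset > 180.0:
        offset -= 360.0   # the other way round is shorter, so the arrow is inverted
    return offset         # 0.0 means the desired final rotation has been reached

# With the destination imagined at zero degrees: an object at 45 degrees gives -45.0,
# while an object rotated past 180 degrees, to 270 degrees, gives +90.0 (the inverted arrow).
print(rotation_arrow_offset(45.0, 0.0), rotation_arrow_offset(270.0, 0.0))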

In the scenario presented by Figure 59, the user must rotate the object clockwise, as indicated by

the arrow. Once the user has moved the physical object to a position that corresponds with the

desired final translation and rotation, the SAR visual cue will no longer be displayed, indicating

that the physical object has been successfully arranged.

Figure 60 Revised Visual Cue: Rotation

Figure 60 depicts the process of undertaking a rotation using the Revised Visual Cue. The tracked

physical object is represented by a solid shape whilst the desired final position of the object is

represented by a dotted shape.

6.3 Discussion

A square has four axes of symmetry, and as a result, the Square Cue communicated rotation

information poorly: when the user attempted to align the square shapes, the rotation of the squares was ambiguous. As the revised shape utilised in the revised visual cue has only one axis of symmetry, rotation information can be communicated in a less ambiguous manner.

For the physical object to be successfully arranged, the user is required to align the revised shape projections, and there is

only one position in which this alignment can be achieved correctly, which can be verified visually.

Although the desired final rotation can be gathered from the revised shape, the arrow is displayed if

the user arranges the object in a way such that the translation is complete but the rotation is

incomplete. In this scenario, where the user has moved the object so that the revised shapes are overlapping but not aligned, the display of the arrow aids the user in rotating the object if the

desired final rotation has not been achieved.

7 Pilot Study

This section presents details of the pilot study undertaken to gather preliminary results on the

effectiveness of the Revised Visual Cue developed. Results gathered from the pilot study are then

presented. The pilot study was undertaken in a similar manner to the user study performed. The

pilot study consisted of six participants. All six participants had previously participated in the user

study. The pilot study endeavoured to determine preliminary results regarding:

the effectiveness, in terms of task time and accuracy, of the Revised Visual Cue,

the intuitiveness of the Revised Visual Cue, and

the user opinion of the task undertaken.

Data was gathered using the methods utilised for the user study which are described in Section 4.4.

The pilot study configuration resembles the configuration of the user study which is defined in

Section 4.1. The only task undertaken was a physical object arrangement task utilising the Revised

Visual Cue. The structure of the pilot study is outlined in Section 7.1 and the preliminary results

gathered are presented in Section 7.2.

7.1 Structure

The pilot study was conducted with one participant at a time. Each participant undertook the following

procedure:

1. The participant completed a survey to gather information on the intuitiveness of the

Revised Visual Cue.

2. The participant completed a task resembling the structure of the tasks undertaken during

the user study. The task was to arrange a tracked physical object utilising the Revised

Visual Cue. The task was not timed during the first run through.

3. The participant completed the task again, following the aforementioned structure; however, this time the task was timed. The participant activated a manual timer before undertaking

the task, and stopped the timer once the task was completed. The timer was activated by

the user pushing a key on a keyboard located on the same surface as the physical object to

be arranged. The physical object was collected by the user from a consistent location after

the timer was activated.

4. The participant completed a survey to gather their opinions of the tasks they had

undertaken.

Appendix B presents the surveys given to the participants to gather the qualitative data.

7.2 Preliminary Results

This section presents the results gathered from the pilot study. Information specific to the

participants is presented in Section 7.2.1. Quantitative data such as the manual and SAR timer

results are presented in Section 7.2.2, whereas qualitative data such as the intuitiveness survey and

user opinion survey results are presented in Section 7.2.3. A discussion of the results is then

presented in Section 7.2.4.

The pilot study data is shown on the graphs alongside the data gathered from the user study tasks,

to provide an indication of the effectiveness of the Revised Visual Cue. The Revised Visual Cue

task is represented on graphs as Task 10.

7.2.1 User Information

The pilot study involved six participants, all of them male. The average age of the participants was 24 years, with a standard deviation of 3.521 years. All participants had previously

participated in the user study, thus they all had experience with the SAR system utilised. Appendix

C presents the data gathered to form these preliminary results.

7.2.2 Quantitative Data

One of the aims of the pilot study was to determine:

the effectiveness, in terms of task time and accuracy, of the Revised Visual Cue.

This was achieved in the same manner as the user study undertaken. Task time was determined

using a manual timer and a SAR timer. Accuracy was determined by gathering x axis, y axis and

rotation values for the desired final position and achieved final positions of the physical object after

the task.

7.2.2.1 Task Completion Times

For the task undertaken, values gathered with a manual timer and values gathered with a SAR timer

will be presented. Figure 61 compares the values gathered from the manual timer and the SAR

timer for the Revised Visual Cue with the values gathered from the manual timer and the SAR

timer for the tasks undertaken in the user study. The Revised Visual Cue is represented as Task 10.

Figure 61 Average Completion Time of all User Study Tasks and the Revised Visual Cue (Manual Timer and SAR Timer)

Task 10 (Revised Visual Cue) took the shortest time to complete, with an average completion time of 12.570 seconds and a standard deviation of 4.321 seconds. According to the manual timer, Task

10 (Revised Visual Cue) was completed 5.205 seconds faster than Task 9 (Square Cue with No

Destination), which was the task with the shortest completion time in the user study.

With regards to the values gathered with a manual timer, it can be stated with 95% confidence that

the task completion time when arranging a physical object using the Revised Visual Cue will occur

between 9.112 seconds and 16.027 seconds. From the results gathered from the user study, it can

also be stated with 95% confidence that the task completion time for Task 1 (No SAR) will occur

within a higher range; between 26.312 seconds and 38.384 seconds.

Task 10 (Revised Visual Cue) also took the shortest time to complete when timed with the SAR timer.

An average completion time of 5.597 seconds with a standard deviation of 2.939 seconds was

recorded. According to the SAR timer, Task 10 (Revised Visual Cue) was completed 7.042


seconds faster than Task 9 (Square Cue with No Destination), which was the task with the shortest

completion time in the user study.

7.2.2.2 Accuracy

For the task undertaken, results regarding the translation accuracy along each axis (the x axis and the y axis) and the rotation accuracy are presented.

As with the user study, the difference between the desired final translation and the achieved final translation along each axis was calculated. Figure 62 compares the average variation of the

translation along the x axis and y axis for the Revised Visual Cue with the values gathered for the

tasks undertaken in the user study. The Revised Visual Cue is represented as Task 10.

[Chart omitted: average translation variation for each task; x axis: Task Number, y axis: Average Variation (units); series: x Axis Variation and y Axis Variation.]

Figure 62 Average x Axis and y Axis Variation of all User Study Tasks and the Revised Visual Cue

Task 10 (Revised Visual Cue) had the most accurate x axis translation: the achieved final

translation diverged from the desired final translation by an average of 2.770 units with a standard

deviation of 2.538 units. The most accurate task undertaken in the user study regarding the x axis

variation was Task 4 (Both Instructions Cue), which had an average x axis variation of 3.628 units.

Thus, Task 10 (Revised Visual Cue) obtains an x axis translation that is, on average, 0.858 units

more accurate than Task 4 (Both Instructions Cue).

It can be stated with 95% confidence that the translation variation along the x axis when arranging

a physical object using the Revised Visual Cue will occur between 0.739 units and 4.801 units.

From the results gathered from the user study, it can also be stated with 95% confidence that the


translation variation along the x axis for Task 1 (No SAR) will occur within a higher range;

between 7.003 units and 26.907 units.

Task 10 (Revised Visual Cue) also had the most accurate y axis translation: the achieved final

translation diverged from the desired final translation by an average of 1.839 units with a standard

deviation of 1.651 units. The most accurate task undertaken in the user study regarding the y axis

variation was Task 3 (Rotation First Cue), which had an average y axis variation of 4.260 units.

Thus, Task 10 (Revised Visual Cue) obtains a y axis translation that is, on average, 2.420 units

more accurate than Task 3 (Rotation First Cue).

It can be stated with 95% confidence that the translation variation along the y axis when arranging

a physical object using the Revised Visual Cue will occur between 0.519 units and 3.160 units.

From the results gathered from the user study, it can also be stated with 95% confidence that the

translation variation along the y axis for Task 1 (No SAR) will occur within a higher range;

between 27.767 units and 38.196 units.

Results regarding the rotation accuracy are now presented. As with the user study, the difference between the desired final rotation and the achieved final rotation was calculated. Figure 63 compares the average rotation variation for the Revised Visual Cue with the values gathered for the tasks undertaken in the user study. The standard deviation of this data is 9.076. The Revised Visual Cue is represented as Task 10. The standard deviation of this data is 1.253.

[Chart omitted: average rotation variation for each task; x axis: Task Number, y axis: Average Rotation Variation (degrees).]

Figure 63 Average Rotation Variation of all User Study Tasks and the Revised Visual Cue


Task 10 (Revised Visual Cue) had the fourth most accurate rotation: the achieved final rotation diverged from the desired final rotation by an average of 4.835 degrees with a standard deviation of 1.079 degrees. Thus, Task 10 (Revised Visual Cue) obtains a rotation that is, on average, more accurate than

six of the object arrangement tasks undertaken. The most accurate task undertaken in the user study

regarding the rotation variation was Task 8 (Square Cue), which had an average rotation variation

of 3.319 degrees. Thus, Task 10 (Revised Visual Cue) obtains a rotation that is, on average, 1.516

degrees less accurate than Task 8 (Square Cue).

It can be stated with 95% confidence that the rotation variation when arranging a physical object using the Revised Visual Cue will occur between 3.971 degrees and 5.698 degrees. It can also be stated with 95% confidence that the rotation variation for Task 1 (No SAR) will occur between 3.431 degrees and 11.502 degrees. The confidence intervals indicate that the rotation variation for Task 1 (No SAR) is likely to occur within a higher range: values between 5.698 degrees and 11.502 degrees lie above the interval obtained for the Revised Visual Cue.

7.2.3 Qualitative Data

The remaining aims of the pilot study were to determine:

the intuitiveness of the Revised Visual Cue, and

user opinion of the task undertaken.

This was achieved in the same manner as the user study undertaken.

7.2.3.1 Intuitiveness Survey

An intuitiveness survey was undertaken before the Revised Visual Cue task. The survey utilised is

included in Appendix B.


Figure 64 Revised Visual Cue Translation Instruction Intuitiveness Diagram

When presented with the diagram depicted in Figure 64, six participants (or one hundred percent)

correctly identified the translation instruction being performed upon the physical object utilising

the Revised Visual Cue.

Figure 65 Revised Visual Cue Rotation Instruction Intuitiveness Diagram

Similarly, when presented with the diagram depicted in Figure 65, six participants (or one hundred

percent) correctly identified the rotation instruction being performed upon the physical object

utilising the Revised Visual Cue.


7.2.3.2 User Opinion Survey

Upon completion of the task, participants were asked to complete a series of questions gauging

their opinions of the Revised Visual Cue in the task undertaken. Participants were asked what was

better about the Revised Visual Cue when compared with the SAR visual cues experienced in the

user study. Two participants commented that the rotation was easier to achieve, and another

participant claimed that the arrow used was easier to understand than the Rotation Instruction

utilised previously. Participants were also asked what was worse about the Revised Visual Cue compared with the SAR visual cues experienced in the user study. Three participants claimed that nothing was worse, while one participant reported confusion over which projection represented the current position. The comment was also made that, as the revised shape was more complex than the square used previously, it caused some confusion when both of the revised shape projections were aligned. When asked whether the Revised Visual Cue was preferred over the SAR visual cues encountered in the user study, six participants (or one hundred percent) stated that it was.

7.2.4 Discussion

The preliminary results from the pilot study have been presented alongside the user study results.

This allows for comparison between the two data sets. The preliminary results show promising

indications that the Revised Visual Cue allows for a faster physical object arrangement when

compared to a manual arrangement and the other SAR visual cues developed. In addition, the

preliminary results also show promising indications that the Revised Visual Cue allows for a more

accurate physical object arrangement along the x and y axes when compared to a manual

arrangement and the other SAR visual cues developed. Preliminary results regarding the rotation

accuracy are also promising in comparison to a manual arrangement; however the rotation accuracy

results gathered for the Revised Visual Cue are not as accurate as those gathered for the Square

Cue from the user study. A participant stated on the user opinion survey that the revised shape used

for the Revised Visual Cue was complex and therefore caused confusion when the projection at the

current position was overlapped with the projection at the desired final position; this may have

caused the rotation to be performed better with the Square Cue as a simpler shape is utilised. In

addition, preliminary results regarding the intuitiveness and user opinion of the Revised Visual Cue

are also positive. Thus, the Revised Visual Cue developed appears to be an improvement for SAR object arrangement.


8 Large Object Arrangement

This section provides discussion of a scenario in which a modified SAR visual cue is utilised for

the instructed arrangement of a large physical object. The research described in this dissertation

until this point has addressed the arrangement of objects that can be easily arranged by hand and

that can be effectively projected upon. When considering the arrangement of large physical objects

using SAR visual cues, the scenario differs from that addressed previously in this dissertation. For

example, the physical object utilised for the user study had a maximum dimension of

approximately two hundred and fifty millimetres and was arranged upon a tabletop. Projections

were made upon the top of the object as the user was able to view these projections easily. If a projection is made upon the top of a large physical object, however, the user may be unable to view it, as the top of the object may be hidden by its height.


Figure 66 SAR Visual Cue Distortion

To present another example, in the scenario where a SAR visual cue projection was to exceed the

surface area of the top of a physical object, portions of the projection would instead be projected

upon the surrounding surface. This is depicted in Figure 66, where the physical object utilised in

the user study is being projected upon. This issue is present with the arrangement of smaller physical objects, addressed in Section 3.2.4.2, but is magnified when arranging large objects: the distance to the ground plane may be greater, so the overflowing portion of the projection lands further from the projection upon the physical object, which may make the SAR visual cue considerably less clear. As this problem can also affect the arrangement of smaller physical objects, the adapted SAR visual cue for large object arrangement may also be desirable for use with smaller physical objects.

In order to address these differences in the task of arranging large objects with SAR visual cues,

the Revised Visual Cue described in Section 6 was adapted to develop a new arrangement

technique. The developed SAR visual cue utilises the shape composed of a square and a semi-circle that was utilised for the Revised Visual Cue. Four of these shapes are projected around the current position of the physical object and around the desired final position; the user must align the shapes in order to successfully arrange the large physical object. With the projection at the current position of the physical object, four arrows are projected, one adjacent to each of the shapes, indicating the direction in which the large physical object must be rotated. A projected line presents a path from

the centre point of the current position to the centre point of the desired final position, to aid the

user if the end point is not visible. As the user moves the physical object, the projection updates,

following the physical object.

The shape utilised has only one axis of symmetry; if a square projection were utilised, ambiguity would be present, as the rotation could be incorrect by 90 degrees. The rotation of the desired final position shown in Figure 66 is currently 45 degrees; if a square were used, the projection of shapes shown would appear the same at 45 degrees and 135 degrees, for example, making the projection at the desired final position ambiguous. The presence of the arrows would aid in correcting this issue; however, utilising shapes with only one axis of symmetry removes the ambiguity, as the projection does not look the same when rotated by 90 degrees. To illustrate this, when considering the rotation of the final position in Figure 66, the semi-circle shapes are directed towards the south east, but if the projection were rotated clockwise by 90 degrees, the semi-circle shapes would instead be directed towards the south west.
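To make the role of symmetry concrete, the residual rotation error can be considered modulo the rotational symmetry of the projected shape. The following is a minimal sketch in C++ (a hypothetical helper, not part of the implemented SAR module), assuming angles measured in degrees:

    #include <algorithm>
    #include <cmath>

    // Smallest rotation error between a desired and an achieved angle, given the
    // rotational symmetry of the projected shape (in degrees). A square repeats
    // every 90 degrees, so errors are only meaningful modulo 90; the revised
    // shape (square plus semi-circle) only repeats every 360 degrees.
    double rotationError(double desiredDeg, double achievedDeg, double symmetryDeg)
    {
        double diff = std::fmod(std::fabs(desiredDeg - achievedDeg), symmetryDeg);
        return std::min(diff, symmetryDeg - diff);
    }

    // rotationError(45.0, 135.0, 90.0) returns 0: a square cannot distinguish 45 from 135 degrees.
    // rotationError(45.0, 135.0, 360.0) returns 90: the revised shape exposes the 90 degree error.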

Utilising four shapes at the current position and at the desired final position provides robustness to occlusion; even if one or more of the shapes is occluded or distorted by the large physical object or by others in the vicinity, the SAR visual cue can still be understood. The four shapes also allow the SAR visual cue to be viewed from multiple vantage points, which may be especially useful for the arrangement of large physical objects, as multiple users may partake in the arrangement process.


Figure 67 Large Object Arrangement: Initial Position

In the scenario presented in Figure 67, the user must translate the large physical object in a south

easterly direction. The four arrows are present on the projection at the large physical object’s

current position. These arrows indicate that the large physical object should be rotated in a

clockwise direction. This rotation is also indicated by the rotation that has been applied to the

projection of the shapes at the desired final position.


Figure 68 Large Object Arrangement: After a Translation

Figure 68 presents the large physical object after a translation has been performed. The user has

translated the object in the correct direction, and the projection has updated to show the current

position of the large physical object.


Figure 69 Large Object Arrangement: Translation Complete

Figure 69 presents the large physical object after the translation has been completed. The arrows

remain, indicating to the user that the large physical object must be rotated in a clockwise direction.

Occlusion occurs frequently when utilising SAR visual cues to arrange large physical objects; this

occlusion is clearly evident in Figure 69. This illustrates one of the reasons for utilising four

projected shapes at the current position and the desired final position; the SAR visual cue is still

able to be understood even if one or more of the shapes is occluded or distorted.


Figure 70 Large Object Arrangement: Rotation

Figure 70 presents the large physical object after a partial rotation has been performed. The projection has been updated as the user rotates the large physical object. It is evident in Figure 70 that the object is almost successfully arranged; the user can complete the arrangement by aligning the shapes, which is achieved by rotating the large physical object in the direction indicated by the arrows.


Figure 71 Large Object Arrangement: Successfully Arranged

Figure 71 presents the successful arrangement of the large physical object. The arrows indicating

that a rotation is to be performed are no longer projected, but will return if the physical object is

moved from the correct final position.


9 Implementation

The developed system described in this dissertation utilises SAR to instruct a user on how to

arrange a physical object. To achieve this, existing components have been utilised and new

components have been developed. An overview describing how the existing and developed components are structured to form the system is provided in Section 9.1.

The existing components utilised are a SAR system and a tracking framework. These existing

components are described in Section 9.2. Several methods for tracking were also explored. The

new components that have been developed during this research are a module for the existing SAR

system that implements object arrangement techniques and a top down arrangement application

that aids the object arrangement process. These developed components are described in Section 9.3.

9.1 Overall Structure

Figure 72 Overall Implemented System Structure


The research described in this dissertation utilised several components. An overall structure of the

system incorporating these existing and new components is depicted in Figure 72. The research

utilises an existing SAR system for which a module was created and other existing modules were

utilised. An accompanying top down arrangement application was also implemented to aid the user

in arranging physical objects. A tracking framework and camera were utilised in conjunction with

the SAR system. These components required the use of an existing tracking calibration module. A

projector was also used in conjunction with the SAR system, which required the use of an existing

projector calibration module. As the existing SAR system was implemented and runs in UNIX, the

implemented system was developed in a UNIX environment, on a computer utilising Ubuntu 9.10.

9.2 Existing Components

Existing components were utilised to enable the system to function. These included a SAR system

described in Section 9.2.1, a tracking framework described in Section 9.2.2 and associated

hardware.

9.2.1 SAR System

The existing SAR system is a framework upon which SAR applications can be developed. The

SAR system is developed in C/C++ and OpenGL. A library, libwcl, has also been developed; it provides further resources, including mathematical classes, that can be utilised directly and that are utilised by the SAR system. Modules can be developed and integrated in the SAR

system, allowing for the use of previously developed SAR resources. The module developed to

implement the research described in this dissertation is discussed in Section 9.3.1. The SAR system

utilised and the libwcl library were developed within the Wearable Computer Lab at the University

of South Australia1.

The existing SAR system allows for the use of one or more projectors. The implementation

described in this dissertation utilised one projector. Also present is a projector coordinate space,

which can be calibrated using a projector calibration module present within the existing SAR

system. This projector coordinate space allows for the precise projection of images upon a desired

position. For example, the projector coordinate space could be defined such that the x axis and the

y axis are parallel to a flat surface such as a table, and the z axis is perpendicular, allowing for the

projection of two dimensional images upon the surface and the projection of three dimensional

images upon a physical object located on the surface.

1 Developed by Michael Marner ([email protected]), Markus Broecker

([email protected]) and Benjamin Close ([email protected]).


At least one camera and one projector are to be connected to the computer on which the SAR

system is installed. The camera and projector are to be positioned so that their fields of view overlap, allowing the projector coordinate space and the ARToolKitPlus coordinate space to

overlap.

Projector calibration is performed to define the projector coordinate space. This allows for the x

axis and y axis to be parallel with the ground plane and the z axis to be perpendicular to it. This

also allows for the scale to be defined according to the physical objects to be arranged. In the

system implemented the projector calibration was undertaken such that a one millimetre distance

along a physical model corresponded with a one unit distance along the corresponding virtual

model. Thus, a unit in the projector coordinate space corresponded with a millimetre in the physical

world. Tracking calibration is then performed to provide a transformation between the

ARToolKitPlus coordinate space and the previously defined projector coordinate space.

The existing SAR system provides predefined resources for the input of models and the rendering

and display of these models using the aforementioned projector coordinate space. Also provided is

a structure for implementing and utilising existing tracking frameworks. Support for input devices

was also present, of which the keyboard input was utilised for user study functionality. Support for

hardware such as cameras and projectors is also provided. The existing SAR system is configurable

using XML files. The modules to use can be defined within an XML file, settings for the loaded

models can be specified and the resources to be loaded can be determined. SAR can then be

executed with the appropriate XML configuration file.

9.2.2 Tracking

The physical objects to be arranged must be tracked to allow for the SAR visual cues to function as

described. The developed system utilises the ARToolKitPlus tracking framework (Wagner &

Schmalstieg 2007), which allows a marker's position and orientation to be obtained in real time.

The tracking calibration process generates a transformation matrix that allows the mapping of the

ARToolKitPlus coordinate space to the SAR projector coordinate space. This allows a set of coordinates to be converted from the ARToolKitPlus coordinate space to the SAR projector coordinate space. An example of the use of this conversion is the scenario of projecting upon a tracked physical object. By fixing fiducial markers at a known position upon the physical object, the tracking framework returns the coordinates of this position. By applying the transformation matrix, the corresponding SAR projector coordinates of this position can be calculated, allowing for projection upon this position using the SAR system.


By utilising tracked physical objects according to the described process, the SAR visual cues are

able to be updated as the physical object is moved. For example, if the user is prompted to translate

an object according to the Translation Instruction, the user may move the physical object and have

the circle projection follow the object and the line projection show the updated path to the final

position. As the tracked physical object is moved, the ARToolKitPlus coordinates of the tracked

physical object (obtained from the marker) are periodically converted to coordinates in the SAR

projector coordinate space, allowing the projection to follow the tracked physical object.
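As an illustration of this conversion, the following sketch applies a homogeneous 4 x 4 transformation (as produced by the tracking calibration) to a marker position. The type definitions and the commented helper names are assumptions standing in for the corresponding resources of the existing SAR system and libwcl, not the actual API:

    #include <array>

    using Vec4 = std::array<double, 4>;                  // homogeneous point (x, y, z, w)
    using Mat4 = std::array<std::array<double, 4>, 4>;   // row-major 4 x 4 matrix

    // Map a point from the ARToolKitPlus coordinate space into the SAR projector
    // coordinate space using the calibration transformation matrix.
    Vec4 toProjectorSpace(const Mat4& calibration, const Vec4& markerPos)
    {
        Vec4 result{0.0, 0.0, 0.0, 0.0};
        for (int row = 0; row < 4; ++row)
            for (int col = 0; col < 4; ++col)
                result[row] += calibration[row][col] * markerPos[col];
        return result;
    }

    // Called periodically to keep the projected cue attached to the tracked object:
    //     Vec4 markerPos = getMarkerPose();                 // ARToolKitPlus space (hypothetical call)
    //     Vec4 projPos   = toProjectorSpace(calibration, markerPos);
    //     setCueOrigin(projPos[0], projPos[1]);             // projector space (one unit ~ one millimetre)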

Utilising tracked physical objects according to the described process also enables the system to

verify whether a tracked physical object has been successfully rotated and translated. For example,

if the user is prompted to translate a tracked physical object according to the Translation

Instruction, the user may move the physical object so that the circle projections overlap. The

Translation Instruction will cease to be displayed when the ARToolKitPlus coordinates of the tracked physical object (obtained from the marker), once converted to the projector coordinate space, are equivalent to the coordinates of the physical object’s desired final translation in that space.

9.2.2.1 Infrared Tracking

The possibility of utilising Hewlett Packard’s UV/IR Invisible Ink HP Q7485A was explored. It was foreseen that physical object tracking could be undertaken by utilising this ink to produce ARToolKitPlus markers in conjunction with an infrared-capable camera. This possibility was explored and deemed infeasible at the current time, as the ink and devices utilised did not allow for this functionality; however, it remains a possible direction for future work.

9.3 Developed Components

A SAR module was developed for the existing SAR system described in Section 9.2.1, which

implements SAR visual cues to instruct a user on how to arrange a physical object. A top down

arrangement application was developed to aid the object arrangement process.

9.3.1 SAR Module

A SAR module has been developed in C/C++ and OpenGL to implement the SAR visual cues

described in Section 3.2.3. Building upon the SAR system allowed the developed module to utilise predefined rendering and display resources, the definable projector coordinate space and input devices, and allowed for the straightforward integration of the ARToolKitPlus tracking framework and hardware, as well as configuration using XML.


The module reads in a desired final translation and rotation for each object; these can be obtained from the top down arrangement application or from an external file. A SAR visual cue then instructs the user on how to arrange each tracked physical object to its respective desired final position.

The module utilises the ARToolKitPlus tracking framework provided by the existing SAR system.

The positions of two ARToolKitPlus markers are tracked by the module, one representing the

centre of the current tracked physical object, and the other present to allow for rotation calculations

within the developed module.

The module implements an update method which is called by the SAR system periodically. Within

this method, a check is performed to determine whether the top down arrangement application has

been altered. If the top down arrangement application has been altered, the updated final position

information is gathered and the projection is updated.

The object arrangement process as defined in Section 3.2 is implemented. The update method also

performs a check to determine the state of the current tracked physical object to be arranged. The

current object to be arranged is assigned a state. The possible states of the current object are

translation, rotation, and successfully arranged. Once the object has been assigned the successfully

arranged state and it has been verified that the translation and rotation are correct, the next object

becomes the current object to be arranged. The order of the objects is determined by the order they

are entered in the XML configuration file.
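A minimal sketch of this per-object state handling is given below; the names are illustrative only, and the actual module performs these checks inside the SAR system's update callback:

    #include <cstddef>
    #include <vector>

    enum class ArrangementState { Translation, Rotation, Arranged };

    struct ArrangedObject {
        ArrangementState state = ArrangementState::Translation;
        // desired final pose, tracked pose, etc. would also be stored here
    };

    // Advance the current object's state; once it is verified as arranged, the next
    // object (in the order given by the XML configuration file) becomes current.
    void updateCurrentObject(std::vector<ArrangedObject>& objects, std::size_t& current,
                             bool translationOk, bool rotationOk)
    {
        if (current >= objects.size())
            return;                                    // everything arranged
        ArrangedObject& obj = objects[current];
        if (obj.state == ArrangementState::Translation && translationOk)
            obj.state = ArrangementState::Rotation;
        else if (obj.state == ArrangementState::Rotation && rotationOk)
            obj.state = ArrangementState::Arranged;
        if (obj.state == ArrangementState::Arranged && translationOk && rotationOk)
            ++current;                                 // move on to the next object
    }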

When verifying that a tracked physical object has been correctly translated and rotated, some

leeway is given so that the user does not have to arrange the physical object precisely to the desired

final position. This was implemented as it was observed to be difficult and time consuming for the

user to arrange the physical object precisely to the desired final position, and the leeway given still

allows for the improved accuracy of SAR visual cues over a manual arrangement technique, as

determined from the accuracy information gathered from the user study (Section 5.2.2). When

translating a tracked physical object, the user is required to apply a translation to the object that is

within ten units of the desired final position of the object, along both the x axis and the y axis. In

the system implemented, this ten unit distance corresponds with approximately ten millimetres.

When rotating a tracked physical object, the user is required to apply a rotation to the object that is

within ten degrees of the desired final rotation.
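The verification with leeway can be sketched as follows, assuming one projector unit corresponds to approximately one millimetre (as in the implemented calibration) and using the thresholds of ten units and ten degrees described above; the function name is illustrative:

    #include <algorithm>
    #include <cmath>

    // True when the tracked object is close enough to its desired final pose:
    // within 10 units along both the x and y axes, and within 10 degrees of the
    // desired final rotation.
    bool isArranged(double x, double y, double rotDeg,
                    double targetX, double targetY, double targetRotDeg)
    {
        const double kTranslationTolerance = 10.0;   // projector units (~10 mm)
        const double kRotationTolerance    = 10.0;   // degrees

        double rotError = std::fmod(std::fabs(rotDeg - targetRotDeg), 360.0);
        rotError = std::min(rotError, 360.0 - rotError);

        return std::fabs(x - targetX) <= kTranslationTolerance
            && std::fabs(y - targetY) <= kTranslationTolerance
            && rotError <= kRotationTolerance;
    }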

The module also implements a draw method in which predefined rendering and display resources

provided by the SAR system are used along with OpenGL techniques to implement the SAR visual

cues. A SAR visual cue is displayed instructing the user to arrange the current object, in accordance with the current object’s state.
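For illustration only, a Translation Instruction of the kind described earlier (a circle projected at the object's current position, a circle at the desired final position, and a line joining the two centre points) could be drawn in the projector's x-y plane with immediate-mode OpenGL roughly as follows; the actual module uses the rendering resources provided by the SAR system:

    #include <GL/gl.h>
    #include <cmath>

    // Draw a circle outline in the x-y plane of the projector coordinate space.
    static void drawCircle(float cx, float cy, float radius)
    {
        glBegin(GL_LINE_LOOP);
        for (int i = 0; i < 64; ++i) {
            float a = 2.0f * 3.14159265f * static_cast<float>(i) / 64.0f;
            glVertex2f(cx + radius * std::cos(a), cy + radius * std::sin(a));
        }
        glEnd();
    }

    // Translation Instruction: circle at the current position, circle at the
    // desired final position, and a line showing the path between the two.
    void drawTranslationInstruction(float curX, float curY, float dstX, float dstY, float radius)
    {
        drawCircle(curX, curY, radius);
        drawCircle(dstX, dstY, radius);
        glBegin(GL_LINES);
        glVertex2f(curX, curY);
        glVertex2f(dstX, dstY);
        glEnd();
    }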


Keyboard input is utilised to undertake the user study. This functionality allows for the

specification of which SAR visual cue is utilised.

9.3.2 Top Down Arrangement Application

An application was developed to provide a user interface which allows a user to arrange top down

representations of the physical objects. The resulting arrangement can then be performed on the

physical objects using SAR cues. As described in Section 3.1, this application can be run in real

time for a collaboration task or the arrangement generated can be stored to provide instructions at a

later date.

Figure 73 Top Down Arrangement Application Screenshot

Figure 73 presents a screenshot of three representations of physical objects being arranged within

the application. The user is able to rotate and translate the representations with the arrow keys to

determine a desired arrangement. Multiple objects can be loaded, and the current object to be

arranged is selected from the drop down menu. In the scenario depicted by Figure 73, the user has

added three representations of physical objects and has arranged them. The transformation matrices

for each object are output to a file periodically and information can either be communicated to the

developed module within the SAR system in real time, or can be saved in the file and retrieved

from the file at a later date. When the information is communicated in real time, this allows for the

representations to be manipulated even after the SAR visual cues have been carried out upon the


corresponding physical objects. This is possible as the arrangement of each physical object is

revisited if the physical object is not arranged in the correct position according to the

transformation matrices communicated to the module.
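A minimal sketch of the periodic output of the transformation matrices is shown below; the file layout and names are assumptions for illustration, not the application's actual format:

    #include <fstream>
    #include <string>
    #include <vector>

    struct ObjectPose {
        std::string name;       // identifier of the physical object's representation
        double matrix[16];      // 4 x 4 transformation matrix, row-major
    };

    // Write the current transformation matrix of every representation to a file;
    // the SAR module can read this file in real time or retrieve it at a later date.
    bool writeArrangement(const std::string& path, const std::vector<ObjectPose>& objects)
    {
        std::ofstream out(path);
        if (!out)
            return false;
        for (const ObjectPose& obj : objects) {
            out << obj.name;
            for (double v : obj.matrix)
                out << ' ' << v;
            out << '\n';
        }
        return static_cast<bool>(out);
    }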


10 Conclusion

This dissertation presents several significant contributions which address limitations found in the

related research. A unique approach is presented to instruct a user on how to arrange physical

objects using visual cues implemented with SAR. This approach has been implemented, validated

with a user study and improved upon. Two instructions have been defined to arrange a physical

object into all possible positions along a ground plane. A set of SAR visual cues have been

developed which utilise these instructions or variations of them. A survey of related research was undertaken; this showed that object manipulation using AR and SAR systems is largely focused on the manipulation of virtual objects, which causes limitations. The research presented in

this dissertation addresses these limitations by providing a method for directly arranging physical

objects using SAR. Interaction with the developed system has been addressed, which has resulted in

the development of a top down arrangement application to aid the arrangement of objects using

SAR. SAR object arrangement can be undertaken locally or remotely, in real time by one user or as

a collaboration task. The process can also be stored to provide instructions on demand at a later

date. A user study was undertaken to determine the effectiveness of these SAR visual cues when

compared to a manual arrangement method and when compared to one another.

A research question was initially proposed. This research question is:

What are the appropriate Spatial Augmented Reality visual cues to instruct a user on how

to arrange real objects?

This research question has been answered by the research described in this dissertation. From the

results of the user study, a recommendation for either the Both Instructions Cue or the Square Cue

was made. In addition to receiving a favourable user opinion, both cues implement features consistent with the advantages found in the results: they are SAR visual cues that implement the Translation and Rotation Instructions, and thus are quicker and more accurate than a manual arrangement, as well as being more intuitive.


The research described in this dissertation has made a number of contributions. Contributions

described in this dissertation include:

a survey of existing technologies,

SAR instructions for rotation and translation that can be used in conjunction to arrange a

physical object,

a set of validated and implemented SAR visual cues that utilise the developed instructions

or versions of these instructions,

the integration of the SAR visual cues to form a system that can instruct a user to arrange

physical objects,

a two-dimensional application that has been incorporated into the working system which

aids the physical object arrangement process,

an evaluation of the effectiveness of the SAR visual cues developed,

a revised SAR visual cue that considers the results of the user study, and

an adaptation of a SAR visual cue which can arrange large physical objects.

SAR visual cues have been developed that are quicker and more accurate than a manual

arrangement method, validating the hypothesis presented by the user study. The Translation

Instruction and Rotation Instruction were more intuitive than variations of these that were utilised.

In addition, a revised visual cue has been developed by taking into account the results of the user

study, and a pilot study regarding this new visual cue has indicated improvements in all areas in

which data was gathered: in terms of speed, accuracy, intuitiveness and user opinion. Furthermore,

the SAR visual cues developed have been shown to facilitate the arrangement of physical objects of

a variety of sizes, including large objects. Using SAR in this way could be useful for the

arrangement of large objects, such as furniture or theatre sets, and when multiple objects are being

arranged.

10.1 Future Work

Although the techniques and concepts presented in this dissertation are implemented to form a

system that employs SAR visual cues to arrange physical objects, there is potential for the

improvement and continuation of this research. This section highlights several ideas.


This research is focused on the arrangement of physical objects upon a ground plane. Future work

can be undertaken to allow for more arrangement options. For example, a vertical translation along

the z axis would allow physical objects to be stacked and the ability to perform rotations around the

x axis and y axis would allow physical objects to be flipped. It is envisaged that instructions such

as flip and stack could be implemented and incorporated into the object arrangement process

described in Section 3.2 along with the already implemented Translation and Rotation Instructions.

Additionally, the order in which multiple physical objects are arranged could be taken into account.

The implementation allows for multiple physical objects to be arranged; however, the order in

which they are arranged is not currently considered. This could be useful in the following example:

when arranging multiple physical objects, the user may be presented with a situation where the

object to be arranged must be translated to a position that intersects other physical objects, making

the instruction difficult to perform.

As the focus of this research was on the effectiveness of SAR visual cues for physical object

arrangement, an existing tracking framework was utilised. There are limitations in the existing

tracking framework and thus further study into a tracking method that is better suited to the

application is warranted. In addition, the infrared tracking described in Section 9.2.2.1 could be

revisited with different components utilised.


11 References

Ahlers, KH, Kramer, A, Breen, DE, Chevalier, P-Y, Crampton, C, Rose, E, Tuceryan, M, Whitaker, RT & Greer, D 1995, Distributed Augmented Reality for Collaborative Design Applications, Maastricht.

Azuma, R, Baillot, Y, Behringer, R, Feiner, S, Julier, S & MacIntyre, B 2001, 'Recent Advances in Augmented Reality', IEEE Comput. Graph. Appl., vol. 21, no. 6, pp. 34-47.

Azuma, RT 1997, 'A Survey of Augmented Reality', Presence: Teleoperators and Virtual Environments, vol. 6, pp. 355–385.

Bimber, O, Grundhofer, A, Wetzstein, G & Knodel, S 2003, Consistent Illumination within Optical See-Through Augmented Environments, IEEE Computer Society, p. 198.

Bimber, O & Raskar, R 2005, Spatial Augmented Reality: Merging Real and Virtual Worlds, A K Peters, Wellesley, Massachusetts.

Breen, DE, Whitaker, RT, Rose, E & Tuceryan, M 1996, Interactive Occlusion and Automatic Object Placement for Augmented Reality.

Bryson, S, Zeltzer, D, Bolas, MT, Chapelle, BdL & Bennett, D 1996, The Future of Virtual Reality: Head Mounted Displays Versus Spatially Immersive Displays, ACM, pp. 485-486.

Cotting, D, Naef, M, Gross, M & Fuchs, H 2004, 'Embedding Imperceptible Patterns into Projected Images for Simultaneous Acquisition and Display', paper presented at the Mixed and Augmented Reality, 2004. ISMAR 2004. Third IEEE and ACM International Symposium on.


Do, TV & Lee, J-W 2010, '3DARModeler: a 3D Modeling System in Augmented Reality Environment', International Journal of Electrical, Computer, and Systems Engineering, vol. 4, no. 2, pp. 145-154.

Drascic, D & Milgram, P 1996, Perceptual Issues in Augmented Reality.

Ehnes, J, Hirota, K & Hirose, M 2004, Projected Augmentation - Augmented Reality using Rotatable Video Projectors, IEEE Computer Society, pp. 26-35.

Feiner, S, MacIntyre, B, Hollerer, T & Webster, A 1997, A Touring Machine: Prototyping 3D Mobile Augmented Reality Systems for Exploring the Urban Environment, IEEE Computer Society, p. 74.

Feiner, S, Macintyre, B & Seligmann, D 1993, 'Knowledge-based augmented reality', Communications of the ACM, vol. 36, no. 7, pp. 53-62.

Fjeld, M, Voorhorst, F, Bichsel, M, Lauche, M, Rauterberg, M & Krueger, H 1999, 'Exploring Brick-Based Navigation and Composition in an Augmented Reality', Handheld and Ubiquitous Computing, vol. 1707, pp. 102-116.

Gausemeier, J, Fruend, J & Matysczok, C 2002, AR-Planning Tool: Designing Flexible Manufacturing Systems With Augmented Reality, Eurographics Association, Barcelona, Spain, pp. 19-25.

Gorbet, MG, Orth, M & Ishii, H 1998, Triangles: Tangible Interface for Manipulation and Exploration of Digital Information Topography, ACM Press/Addison-Wesley Publishing Co., Los Angeles, California, United States, pp. 49-56.

Grasset, R & Gascuel, J-D 2002, MARE: Multiuser Augmented Reality Environment on Table Setup, ACM, San Antonio, Texas, pp. 213-213.

Höllerer, T 2001, 'User Interface Management Techniques for Collaborative Mobile Augmented Reality', Computers & Graphics, vol. 25, no. 5, pp. 799-810.


Irawati, S, Green, S, Billinghurst, M, Duenser, A & Ko, H 2006a, An Evaluation of an Augmented Reality Multimodal Interface Using Speech and Paddle Gestures, Hangzhou, China, pp. 272-283.

Irawati, S, Green, S, Billinghurst, M, Duenser, A & Ko, H 2006b, '"Move the Couch Where?": Developing an Augmented Reality Multimodal Interface', Symposium on Mixed and Augmented Reality, Proceedings of the 5th IEEE and ACM International Symposium on Mixed and Augmented Reality, pp. 183-186.

Ishii, H, Ben-Joseph, E, Underkoffler, J, Yeung, L, Chak, D, Kanji, Z & Piper, B 2002, Augmented Urban Planning Workbench: Overlaying Drawings, Physical Models and Digital Simulation, IEEE Computer Society, p. 203.

Ishii, H & Ullmer, B 1997, Tangible Bits: Towards Seamless Interfaces Between People, Bits and Atoms, ACM, Atlanta, Georgia, United States, pp. 234-241.

Kato, H & Billinghurst, M 1999, Marker Tracking and HMD Calibration for a Video-Based Augmented Reality Conferencing System, IEEE Computer Society, p. 85.

Kato, H, Billinghurst, M, Poupyrev, K, Imamoto, K & Tachibana, K 2000, 'Virtual Object Manipulation on a Table-Top AR Environment', Human Interface, pp. 275-278.

Kitamura, Y, Itoh, Y & Kishino, F 2001, Real-time 3D interaction with ActiveCube, ACM, Seattle, Washington, pp. 355-356.

Kitamura, Y & Kishino, F 1997, Consolidated Manipulation of Virtual and Real objects, ACM, Lausanne, Switzerland, pp. 133-138.

Kitamura, Y, Ogata, S & Kishino, F 2002, A Manipulation Environment of Virtual and Real Objects Using a Magnetic Metaphor, ACM, Hong Kong, China, pp. 201-207.

Lee, G, Billinghurst, M & Kim, GJ 2004, Occlusion Based Interaction Methods For Tangible Augmented Reality Environments, Singapore, pp. 419-426.


MacIntyre, B, Gandy, M, Dow, S & Bolter, JD 2004, DART: A Toolkit for Rapid Design Exploration of Augmented Reality Experiences, ACM, Santa Fe, NM, USA, pp. 197-206.

Mackay, WE 1998, Augmented Reality: Linking Real and Virtual Worlds: A New Paradigm for Interacting with Computers, L'Aquila, Italy.

Malbezin, P, Piekarski, W & Thomas, BH 2002, Measuring ARToolKit Accuracy in Long Distance Tracking Experiments, Darmstadt, Germany.

Marner, MR & Thomas, BH 2010, 'Augmented Foam Sculpting for Capturing 3D Models', paper presented at the 3D User Interfaces (3DUI), 2010 IEEE Symposium on.

Milgram, P & Kishino, F 1994, 'A Taxonomy of Mixed Reality Visual Displays', IEICE Transactions on Information and Systems, vol. E77-D, no. 12, pp. 1321-1329.

Milgram, P, Takemura, H, Utsumi, A & Kishino, F 1994, 'Augmented Reality: A Class of Displays on the Reality-Virtuality Continuum', Telemanipulator and Telepresence Technologies, vol. 2351, pp. 282-292.

Naimark, L & Foxlin, E 2005, 'Encoded LED System for Optical Trackers', paper presented at the Fourth IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR'05), Vienna, Austria.

Newman, J 2001, 'Augmented Reality in a Wide Area Sentient Environment', paper presented at the IEEE and ACM International Symposium on Augmented Reality.

Park, J, You, S & Neumann, U 1999, Natural Feature Tracking for Extendible Robust Augmented Realities, A. K. Peters, Ltd., Bellevue, Washington, United States, pp. 209-217.

Raskar, R & Low, K-L 2001, Interacting With Spatially Augmented Reality, ACM, Camps Bay, Cape Town, South Africa, pp. 101-108.

Raskar, R, Welch, G & Fuchs, H 1998, Spatially Augmented Reality, A. K. Peters, Ltd., Bellevue, Washington, United States, pp. 63-72.


Raskar, R, Welch, G, Low, K-l & Bandyopadhyay, D 2001, 'Shader Lamps: Animating Real Objects with Image-Based Illumination', paper presented at the 12th Eurographics Workshop on Rendering (EGWR) London, June.

Raskar, R, Welch, G & Chen, W 1999, Table-Top Spatially-Augmented Reality: Bringing Physical Models to Life with Projected Imagery, p. 64.

Rekimoto, J & Nagao, K 1995, The World through the Computer: Computer Augmented Interaction with Real World Environments, Pittsburgh, Pennsylvania, United States, pp. 29-36.

Rekimoto, J, Ullmer, B & Oba, H 2001, DataTiles: a modular platform for mixed physical and graphical interactions, ACM, Seattle, Washington, United States, pp. 269-276.

Schmalstieg, D, Fuhrmann, A & Hesina, G 2000, Bridging Multiple User Interface Dimensions with Augmented Reality.

Schmalstieg, D, Fuhrmann, A, Hesina, G, Szalavari, Z, Encarnacao, L.M, Gervautz, M & Purgathofer, W 2002, 'The Studierstube Augmented Reality Project', Presence, vol. 11, no. 1, pp. 33-54.

Suganuma, A, Ogata, Y, Shimada, A, Arita, D & Taniguchi, R-i 2008, Billiard Instruction System for Beginners with a Projector-Camera System, ACM, Yokohama, Japan, pp. 3-8.

Sutherland, IE 1968, A Head-Mounted Three Dimensional Display, ACM, San Francisco, California, pp. 757-764.

Szalavári, Z & Gervautz, M 1997, The Personal Interaction Panel - a Two-Handed Interface for Augmented Reality.

Ullmer, B & Ishii, H 1997, The metaDESK: models and prototypes for tangible user interfaces, ACM, Banff, Alberta, Canada, pp. 223-232.


Voida, S, Podlaseck, M, Kjeldsen, R & Pinhanez, C 2005, A Study on the Manipulation of 2D Objects in a Projector/Camera–Based Augmented Reality Environment, Portland, Oregon.

Wagner, D & Schmalstieg, D 2007, 'ARToolKitPlus for Pose Tracking on Mobile Devices', paper presented at the 12th Computer Vision Winter Workshop, St. Lambrecht, Austria.

Wang, X & Gong, Y 2007, Augmented Virtuality Space: Enriching Virtual Design Environments with Reality, Brisbane, Australia.

Webster, A, Feiner, S, MacIntyre, B, Massie, W & Krueger, T 1996, Augmented Reality in Architectural Construction, Inspection, and Renovation.

Wellner, P 1991, The DigitalDesk calculator: tangible manipulation on a desk top display, ACM, Hilton Head, South Carolina, United States, pp. 27-33.

Wellner, P 1993, 'Interacting with Paper on the DigitalDesk', Commun. ACM, vol. 36, no. 7, pp. 87-96.

You, S, Neumann, U & Azuma, R 1999, Hybrid Inertial and Vision Tracking for Augmented Reality Registration, IEEE Computer Society, p. 260.

Zhang, X, Fronz, S & Navab, N 2002, Visual Marker Detection and Decoding in AR Systems: A Comparative Study, IEEE Computer Society, p. 97.

Zhou, F, Duh, HB-L & Billinghurst, M 2008, Trends in Augmented Reality Tracking, Interaction and Display: A Review of Ten Years of ISMAR, IEEE Computer Society, pp. 193-202.


Appendix A User Study Survey


Visual Cues for Spatial Augmented Reality User Study Questionnaire

Age: ___   Gender: M / F

What is your experience with Augmented Reality systems? None / Some / Experienced

Have you ever interacted with a Spatial Augmented Reality System before?

The following are visual cues that can be projected onto the top of a rectangular object. The view is a top down, “bird's eye” view. Each one of the four cues instructs the rectangle to be moved in one of the following ways:

rotate the object (by 45 degrees).

translate the object (along an x and y axis which is flat on the page).

The only rotation allowed is 45 degrees. There is a movement performed in each scenario. Please draw what you believe the outcome of the cues might be, if they were to be applied to the rectangular object. You may draw over the drawings provided.

Cue #1:

Cue #2:


Cue #3 (the leftmost rectangle is the original object):

Cue #4


Scenarios:
Scenario 1: No SAR
Scenario 2: Line Cue: Translation First
Scenario 3: Line Cue: Rotation First
Scenario 4: Line Cue: Both Cues At The Same Time
Scenario 5: Circle Cue
Scenario 6: Line Cue: With wireframe, with final destination
Scenario 7: Line Cue: With wireframe, without final destination
Scenario 8: Line Cue: With square, with final destination
Scenario 9: Line Cue: With square, without final destination

Which do you think was better: Scenario 1: No SAR, or Scenario 2: Line Cue: Translation first?

Why did you think this?

Which do you think was better: Scenario 2: Line Cue: Translation first, or Scenario 3: Line Cue: Rotation first, or Scenario 4: Line Cue: Both Cues At The Same Time?

Why did you think this?

Which do you think was better: Scenario 2: Line Cue: Translation first, or Scenario 5: Circle Cue?

Why did you think this?


Which do you think was better: Scenario 6: Line Cue: With wireframe, with final destination, or Scenario 7: Line Cue: With wireframe, without final destination?

Why did you think this?

Which do you think was better: Scenario 8: Line Cue: With square, with final destination, or Scenario 9: Line Cue: With square, without final destination?

Why did you think this?

Which do you think was better:
The scenarios with the square cues (Scenario 8, Scenario 9), or
The scenarios with the wireframe cues (Scenario 6, Scenario 7)?

Why did you think this?


Which of the line cues did you think was better:
Scenario 2: Line Cue: Translation First, or
Scenario 3: Line Cue: Rotation First, or
Scenario 4: Line Cue: Both Cues At The Same Time, or
Scenario 6: Line Cue: With wireframe, with final destination, or
Scenario 7: Line Cue: With wireframe, without final destination, or
Scenario 8: Line Cue: With square, with final destination, or
Scenario 9: Line Cue: With square, without final destination?

Why did you think this?

Anything else you would like to add?


Appendix B Pilot Study Survey


The following is a visual cue that can be projected onto the top of a rectangular object. The

view is a top down, “bird’s eye” view.

The cue instructs the rectangle to be moved in one of the following ways:

Rotate the object (by 45 degrees)

Translate the object (along an x and y axis which are flat on the page).

The only rotation allowed is 45 degrees. There is a movement performed in each scenario.

Please draw what you believe the outcome of the cues might be, if they were to be applied

to the rectangular object. You may draw over the drawings provided.

Cue #1

Cue #2


Was this visual cue better than the ones experienced in the user study?

What was better with this one?

What was worse with this one?

Any other comments?


Appendix C Data Tables for Included Results


User Study Timer Data (milliseconds)

Manual Timer            Participant 1   Participant 2   Participant 3   Participant 4
Task 1                  26652           16392           58099           23852
Task 2                  18922           13191           20921           23521
Task 3                  16924           10661           14660           68760
Task 4                  10393           29850           27716           23853
Task 5                  15857           36113           49171           41576
Task 6                  10527           40243           9861            32115
Task 7                  11326           14523           17723           31713
Task 8                  12391           8395            16391           24918
Task 9                  11060           18654           13326           19055
Average of all Tasks    13425           21453.75        21221.125       33188.875

Manual Timer            Participant 5   Participant 6   Participant 7   Participant 8
Task 1                  15591           44507           36113           23184
Task 2                  16790           46639           36779           22920
Task 3                  11726           30782           16258           32114
Task 4                  16123           77686           15991           30781
Task 5                  53302           67026           29450           25985
Task 6                  13458           24119           8395            17723
Task 7                  17056           16789           22121           27050
Task 8                  10927           20388           20521           19588
Task 9                  15458           14524           16123           13192
Average of all Tasks    19355           37244.125       20704.75        23669.125

Manual Timer            Participant 9   Participant 10  Participant 11  Participant 12
Task 1                  31182           50769           32381           16257
Task 2                  28517           33312           67959           10395
Task 3                  16656           17456           29450           26917
Task 4                  29182           98074           16124
Task 5                  37710           71025           54101
Task 6                  20520           25850           30381           13325
Task 7                  10394           23986           59965           19321
Task 8                  5997            15326           104336          16655
Task 9                  20522           15325           25452           23452
Average of all Tasks    18826.85714     24137.85714     60830.25        22536.25


Manual Timer | Participant 13 | Participant 14 | Participant 15 | Participant 16
Task 1 | 47838 | 33181 | 44106 | 22653
Task 2 | 17589 | 79019 | 47038 | 21854
Task 3 | 31980 | 29715 | 19320 | 10661
Task 4 | 18789 | 71024 | 32514 | 16790
Task 5 | 21852 | 118061 | 19855 | 26252
Task 6 | 17856 | 67560 | 9594 | 27716
Task 7 | 16258 | 13858 | 12394 | 15193
Task 8 | 23586 | 77021 | 13859 | 18522
Task 9 | 17456 | 48238 | 10659 | 8795
Average of all Tasks | 20670.75 | 63062 | 20654.125 | 18222.875

Manual Timer | Participant 17 | Participant 18 | Participant 19 | Participant 20 | Average
Task 1 | 55568 | 14259 | 21586 | 32780 | 32347.5
Task 2 | 21453 | 47438 | 20121 | 23985 | 30918.15
Task 3 | 16923 | 22519 | 15858 | 11059 | 22519.95
Task 4 | 12925 | 24252 | 11726 | 11860 | 30297.53
Task 5 | 48904 | 8394 | 34247 | 23053 | 41154.42
Task 6 | 26518 | 12126 | 9328 | 12394 | 21480.45
Task 7 | 24784 | 4798 | 14257 | 11992 | 19275.05
Task 8 | 15592 | 6263 | 13059 | 9195 | 22646.5
Task 9 | 26385 | 10661 | 13991 | 13193 | 17776.05
Average of all Tasks | 24185.5 | 17056.375 | 16573.375 | 14591.375
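The summary figures in these tables can be reproduced directly from the per-participant values. As a quick check, the following minimal Python sketch (illustrative only, not part of the study software; variable names are assumptions) recomputes the cross-participant average for Task 9 of the manual timer condition, and shows that the per-participant "Average of all Tasks" values appear to be means over Tasks 2 to 9 only.

```python
# Recompute the cross-participant mean for Task 9 of the manual timer
# condition, using the twenty values (milliseconds) listed in the tables above.
task9_manual_ms = [
    11060, 18654, 13326, 19055,   # Participants 1-4
    15458, 14524, 16123, 13192,   # Participants 5-8
    20522, 15325, 25452, 23452,   # Participants 9-12
    17456, 48238, 10659, 8795,    # Participants 13-16
    26385, 10661, 13991, 13193,   # Participants 17-20
]
task9_manual_avg = sum(task9_manual_ms) / len(task9_manual_ms)
print(task9_manual_avg)  # 17776.05, matching the Task 9 entry in the Average column

# The per-participant "Average of all Tasks" values appear to be means over
# Tasks 2-9 only (Task 1 excluded), e.g. for Participant 1:
p1_tasks_2_to_9 = [18922, 16924, 10393, 15857, 10527, 11326, 12391, 11060]
print(sum(p1_tasks_2_to_9) / len(p1_tasks_2_to_9))  # 13425.0, matching the table
```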


SAR Timer | Participant 1 | Participant 2 | Participant 3 | Participant 4
Task 2 | 15456 | 6527 | 21586 | 19589
Task 3 | 11726 | 3730 | 9728 | 52368
Task 4 | 6394 | 24519 | 19053 | 18655
Task 5 | 11992 | 29849 | 42907 | 34112
Task 6 | 4264 | 33445 | 13458 | 53567
Task 7 | 8127 | 8794 | 11993 | 82218
Task 8 | 8396 | 2930 | 11725 | 13989
Task 9 | 7460 | 11993 | 9195 | 13192
Average of all SAR Tasks | 9226.875 | 15223.375 | 17455.625 | 35961.25

SAR Timer | Participant 5 | Participant 6 | Participant 7 | Participant 8
Task 2 | 12258 | 35045 | 23718 | 16124
Task 3 | 7461 | 24519 | 10262 | 25718
Task 4 | 11726 | 71956 | 9860 | 24652
Task 5 | 48903 | 61029 | 20920 | 20920
Task 6 | 8260 | 18255 | 2663 | 11992
Task 7 | 12126 | 11060 | 17457 | 21986
Task 8 | 4930 | 14391 | 15857 | 14258
Task 9 | 10793 | 9194 | 11725 | 8796
Average of all SAR Tasks | 14557.125 | 30681.125 | 14057.75 | 18055.75

SAR Timer | Participant 9 | Participant 10 | Participant 11 | Participant 12
Task 2 | 21986 | 26917 | 57566 | 5331
Task 3 | 12525 | 12257 | 22386 | 19454
Task 4 | 23449 | 9728 | 89145 | 11593
Task 5 | 24252 | 27717 | 63695 | 50102
Task 6 | 16522 | 21186 | 22920 | 8530
Task 7 | 5463 | 18121 | 50769 | 14792
Task 8 | 2665 | 9994 | 77418 | 12526
Task 9 | 15457 | 11327 | 20120 | 13589
Average of all SAR Tasks | 15289.875 | 17155.875 | 50502.375 | 16989.625

SAR Timer | Participant 13 | Participant 14 | Participant 15 | Participant 16
Task 2 | 11859 | 69824 | 39579 | 17989
Task 3 | 27583 | 24251 | 10526 | 7062
Task 4 | 14392 | 66760 | 12124 | 13191
Task 5 | 18122 | 107003 | 15456 | 23319
Task 6 | 13992 | 53434 | 4797 | 24785
Task 7 | 11859 | 8396 | 8928 | 11724
Task 8 | 18921 | 64761 | 10794 | 12925
Task 9 | 13458 | 39713 | 7729 | 3463
Average of all SAR Tasks | 16273.25 | 54267.75 | 13741.625 | 14307.25

SAR Timer | Participant 17 | Participant 18 | Participant 19 | Participant 20 | Average
Task 2 | 13591 | 42773 | 13991 | 19989 | 24584.9
Task 3 | 10926 | 19722 | 10661 | 7329 | 16509.7
Task 4 | 5996 | 21320 | 6262 | 6662 | 23371.85
Task 5 | 42638 | 4797 | 28381 | 17989 | 34705.15
Task 6 | 20254 | 9061 | 3598 | 7462 | 17622.25
Task 7 | 18520 | 2133 | 8929 | 8262 | 17082.85
Task 8 | 7595 | 3596 | 7728 | 5463 | 16043.1
Task 9 | 20655 | 7329 | 8259 | 9329 | 12638.8
Average of all SAR Tasks | 17521.875 | 13841.375 | 10976.125 | 10310.625
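The SAR timer tables can be checked in the same way. The minimal sketch below (again illustrative only; the variable names and the manual/SAR comparison are assumptions about how a reader might contrast the two conditions, not part of the study software) recomputes the Task 9 SAR average and compares it with the corresponding manual timer figure.

```python
# Recompute the cross-participant mean for Task 9 of the SAR timer condition
# (milliseconds) and compare it with the manual timer figure for the same task.
task9_sar_ms = [
    7460, 11993, 9195, 13192,     # Participants 1-4
    10793, 9194, 11725, 8796,     # Participants 5-8
    15457, 11327, 20120, 13589,   # Participants 9-12
    13458, 39713, 7729, 3463,     # Participants 13-16
    20655, 7329, 8259, 9329,      # Participants 17-20
]

task9_sar_avg = sum(task9_sar_ms) / len(task9_sar_ms)   # 12638.8, matching the Average column
task9_manual_avg = 17776.05                             # from the manual timer tables above
print(task9_sar_avg, task9_manual_avg - task9_sar_avg)  # SAR roughly 5.1 s faster on Task 9
```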


User Study Accuracy Data (units)

Accuracy | Participant 1 - Variation | Participant 2 - Variation | Participant 3 - Variation
Task 1 x | 1.039297 | 6.099629 | 13.084367
Task 1 y | 42.174904 | 44.340803 | 31.232368
Task 1 Angle | 14.130518 | 1.41048 | 3.495705
Task 2 x | 9.311109 | 4.637155 | 4.850228
Task 2 y | 3.332299 | 2.065624 | 1.994922
Task 2 Angle | 3.10447 | 8.024928 | 8.644944
Task 3 x | 4.334879 | 1.245961 | 7.552184
Task 3 y | 2.352554 | 0.86928 | 1.708735
Task 3 Angle | 1.311248 | 7.168728 | 7.480713
Task 4 x | 2.44157 | 3.501189 | 4.577567
Task 4 y | 5.850972 | 7.319244 | 10.311855
Task 4 Angle | 4.954148 | 2.886078 | 5.141129
Task 5 x | 2.64451 | 4.315219 | 4.204956
Task 5 y | 1.539177 | 0.647657 | 1.814049
Task 5 Angle | 6.377228 | 2.305633 | 3.691101
Task 6 x | 4.084566 | 1.593704 | 15.525131
Task 6 y | 5.527533 | 3.513853 | 10.870038
Task 6 Angle | 9.706848 | 8.089829 | 3.221551
Task 7 x | 5.523292 | 0.349215 | 0.565146
Task 7 y | 3.459471 | 1.137969 | 6.524606
Task 7 Angle | 8.326416 | 2.464569 | 6.985962
Task 8 x | 5.176577 | 5.492001 | 3.910662
Task 8 y | 2.807469 | 2.218419 | 2.753907
Task 8 Angle | 7.071961 | 5.27073 | 2.411071
Task 9 x | 5.016434 | 15.947254 | 0.78761
Task 9 y | 4.106385 | 16.885332 | 0.989237
Task 9 Angle | 6.443856 | 7.074723 | 8.928337

Accuracy | Participant 4 - Variation | Participant 5 - Variation | Participant 6 - Variation
Task 1 x | 55.704788 | 12.779458 | 7.732371
Task 1 y | 48.640778 | 34.267438 | 12.991065
Task 1 Angle | 10.747589 | 4.417854 | 7.964401
Task 2 x | 1.791216 | 8.671446 | 4.540201
Task 2 y | 7.29361 | 0.661729 | 3.445666
Task 2 Angle | 8.386962 | 7.574997 | 7.924637
Task 3 x | 2.577838 | 5.25909 | 5.515763
Task 3 y | 0.868942 | 5.089246 | 3.847748
Task 3 Angle | 6.534317 | 5.190176 | 9.109069
Task 4 x | 1.401345 | 0.214374 | 4.96145
Task 4 y | 7.527944 | 4.250156 | 1.534501
Task 4 Angle | 4.360626 | 2.325547 | 4.54419
Task 5 x | 1.185389 | 1.390412 | 8.902074
Task 5 y | 0.262309 | 0.528527 | 5.192332
Task 5 Angle | 3.703746 | 8.057343 | 3.836151
Task 6 x | 19.97283 | 5.697926 | 1.112047
Task 6 y | 12.22551 | 5.176905 | 1.318666
Task 6 Angle | 3.221665 | 7.490673 | 8.192505
Task 7 x | 4.899351 | 0.866437 | 0.371613
Task 7 y | 12.796405 | 5.831572 | 8.08445
Task 7 Angle | 5.579986 | 8.33194 | 3.785754
Task 8 x | 7.664078 | 1.279692 | 9.282825
Task 8 y | 13.983102 | 3.367409 | 7.383993
Task 8 Angle | 1.873734 | 2.758267 | 2.538553
Task 9 x | 9.174715 | 0.767815 | 4.74629
Task 9 y | 8.187706 | 7.770797 | 3.326917
Task 9 Angle | 4.32663 | 4.660461 | 5.0009

Accuracy | Participant 7 - Variation | Participant 8 - Variation | Participant 9 - Variation
Task 1 x | 6.993693 | 10.920417 | 98.685171
Task 1 y | 24.008659 | 19.016226 | 24.969531
Task 1 Angle | 4.509575 | 5.060087 | 2.397989
Task 2 x | 5.987828 | 19.011801 | 4.674298
Task 2 y | 3.929651 | 6.513794 | 7.673532
Task 2 Angle | 3.070907 | 16.490707 | 5.644791
Task 3 x | 1.346825 | 0.290998 | 9.080372
Task 3 y | 3.104461 | 1.908415 | 4.917646
Task 3 Angle | 7.949072 | 6.712524 | 8.499989
Task 4 x | 5.317802 | 2.871913 | 2.782842
Task 4 y | 2.960601 | 1.289673 | 5.512639
Task 4 Angle | 1.638122 | 5.017792 | 7.400635
Task 5 x | 4.842592 | 1.965225 | 1.069937
Task 5 y | 0.304081 | 4.254856 | 3.096462
Task 5 Angle | 4.189408 | 5.106949 | 6.84111
Task 6 x | 3.423531 | 1.507248 | 7.30256
Task 6 y | 0.533269 | 9.332173 | 5.236787
Task 6 Angle | 4.706054 | 7.595883 | 7.250984
Task 7 x | 1.591043 | 3.328602 | 0.704764
Task 7 y | 4.782579 | 3.122943 | 3.250757
Task 7 Angle | 4.019318 | 9.898727 | 5.972128
Task 8 x | 3.96838 | 2.605622 | 0.309635
Task 8 y | 4.48907 | 8.452465 | 5.935727
Task 8 Angle | 2.614822 | 2.446015 | 4.147103
Task 9 x | 1.936099 | 2.672111 | 1.420598
Task 9 y | 1.317336 | 1.30716 | 9.593448
Task 9 Angle | 7.436803 | 2.853844 | 5.345833

Accuracy | Participant 10 - Variation | Participant 11 - Variation | Participant 12 - Variation
Task 1 x | 13.446533 | 7.569635 | 5.559771
Task 1 y | 20.19087 | 19.628868 | 22.961542
Task 1 Angle | 42.893341 | 3.735723 | 10.378162
Task 2 x | 3.048069 | 7.805073 | 9.282025
Task 2 y | 2.37344 | 8.359359 | 4.647328
Task 2 Angle | 4.724213 | 7.302918 | 4.737778
Task 3 x | 7.642148 | 7.28321 | 8.163282
Task 3 y | 1.468308 | 2.374198 | 4.814531
Task 3 Angle | 7.206512 | 8.916786 | 1.834259
Task 4 x | 1.166884 | 8.794916 | 5.223243
Task 4 y | 1.104884 | 6.549114 | 5.482608
Task 4 Angle | 8.071105 | 3.3833 | 5.477127
Task 5 x | 5.71743 | 6.566397 | 0.830942
Task 5 y | 11.816035 | 4.73257 | 9.109616
Task 5 Angle | 3.547265 | 1.815139 | 7.547706
Task 6 x | 9.061326 | 2.1272 | 2.571
Task 6 y | 4.953227 | 3.343865 | 8.618427
Task 6 Angle | 2.362732 | 7.303749 | 7.632049
Task 7 x | 6.951204 | 0.819095 | 3.643125
Task 7 y | 1.453997 | 5.919619 | 6.974047
Task 7 Angle | 7.377995 | 7.539597 | 2.552658
Task 8 x | 4.448137 | 5.329015 | 1.102177
Task 8 y | 2.06629 | 6.917297 | 0.182512
Task 8 Angle | 0.469849 | 5.334875 | 2.138718
Task 9 x | 6.050835 | 4.610652 | 8.218855
Task 9 y | 4.619998 | 8.748012 | 0.37383
Task 9 Angle | 4.635437 | 0.394714 | 7.936523

Accuracy | Participant 13 - Variation | Participant 14 - Variation | Participant 15 - Variation
Task 1 x | 11.157458 | 10.919239 | 29.693233
Task 1 y | 27.891035 | 25.749081 | 42.769776
Task 1 Angle | 2.659973 | 0.92218 | 0.729692
Task 2 x | 0.028391 | 7.809859 | 4.064739
Task 2 y | 7.382892 | 2.951053 | 1.899705
Task 2 Angle | 5.315826 | 7.802017 | 12.42131
Task 3 x | 3.916131 | 3.233429 | 6.324694
Task 3 y | 3.178588 | 8.287995 | 0.073385
Task 3 Angle | 5.571777 | 0.700645 | 6.014999
Task 4 x | 2.653247 | 0.892778 | 2.825411
Task 4 y | 9.780528 | 7.015494 | 7.177656
Task 4 Angle | 4.648544 | 0.504275 | 4.779968
Task 5 x | 4.323891 | 10.543114 | 8.77073
Task 5 y | 7.428329 | 2.866711 | 7.51959
Task 5 Angle | 1.423861 | 2.263794 | 1.045472
Task 6 x | 2.83752 | 3.05858 | 1.708741
Task 6 y | 13.278838 | 2.362912 | 8.34893
Task 6 Angle | 3.216186 | 4.064606 | 1.836342
Task 7 x | 6.907535 | 4.355759 | 5.824256
Task 7 y | 0.310526 | 0.037548 | 5.623588
Task 7 Angle | 8.749832 | 7.445179 | 4.061065
Task 8 x | 0.502259 | 7.744605 | 8.183574
Task 8 y | 7.537782 | 2.599366 | 1.008305
Task 8 Angle | 2.310099 | 4.585144 | 1.851043
Task 9 x | 5.958924 | 4.587892 | 5.269618
Task 9 y | 4.001875 | 0.108859 | 3.957533
Task 9 Angle | 4.524475 | 10.274788 | 0.311661

Accuracy | Participant 16 - Variation | Participant 17 - Variation | Participant 18 - Variation
Task 1 x | 1.475482 | 19.473617 | 8.029748
Task 1 y | 31.047636 | 51.602575 | 40.233751
Task 1 Angle | 10.577271 | 9.168022 | 1.172787
Task 2 x | 1.638791 | 8.933812 | 0.231718
Task 2 y | 6.98595 | 1.131072 | 7.478341
Task 2 Angle | 2.858475 | 8.854126 | 6.110062
Task 3 x | 7.057989 | 8.341644 | 3.694768
Task 3 y | 9.368799 | 7.953215 | 6.782873
Task 3 Angle | 9.31987 | 1.247924 | 1.760849
Task 4 x | 6.666618 | 2.2159 | 5.268515
Task 4 y | 5.195172 | 4.507716 | 3.654681
Task 4 Angle | 3.486313 | 8.522421 | 4.303718
Task 5 x | 0.44915 | 1.404236 | 8.534449
Task 5 y | 6.218363 | 8.901302 | 0.981715
Task 5 Angle | 4.020464 | 9.977737 | 4.844101
Task 6 x | 2.204614 | 0.68721 | 6.332388
Task 6 y | 0.558016 | 4.065189 | 1.610421
Task 6 Angle | 2.515915 | 9.327111 | 5.218079
Task 7 x | 4.793839 | 4.390802 | 30.98502
Task 7 y | 1.887394 | 6.188506 | 4.064856
Task 7 Angle | 7.475998 | 9.324257 | 2.024742
Task 8 x | 2.390844 | 0.454762 | 8.606223
Task 8 y | 7.858291 | 1.69462 | 5.24689
Task 8 Angle | 8.492493 | 0.054131 | 7.51738
Task 9 x | 6.401455 | 2.571915 | 2.173138
Task 9 y | 5.20792 | 6.811522 | 7.391329
Task 9 Angle | 5.81691 | 8.874908 | 2.178864

Accuracy | Participant 19 - Variation | Participant 20 - Variation | Average
Task 1 x | 16.035428 | 2.702092 | 16.95507135
Task 1 y | 51.064596 | 44.845894 | 32.9813698
Task 1 Angle | 8.782784 | 4.186607 | 7.467037
Task 2 x | 7.704435 | 4.5574 | 5.9289797
Task 2 y | 8.028665 | 5.788502 | 4.6968567
Task 2 Angle | 3.944412 | 5.117477 | 6.90279785
Task 3 x | 8.44112 | 1.279486 | 5.12909055
Task 3 y | 9.64036 | 6.585169 | 4.2597224
Task 3 Angle | 9.473953 | 7.154785 | 5.95790975
Task 4 x | 2.493928 | 6.288715 | 3.62801035
Task 4 y | 0.299595 | 5.789072 | 5.15570525
Task 4 Angle | 2.008194 | 9.326233 | 4.63897325
Task 5 x | 5.023185 | 3.053279 | 4.28685585
Task 5 y | 6.12265 | 5.912205 | 4.4624268
Task 5 Angle | 1.060852 | 4.738373 | 4.31967165
Task 6 x | 6.744991 | 1.796916 | 4.96750145
Task 6 y | 0.87351 | 2.796306 | 5.22721875
Task 6 Angle | 0.245133 | 2.133651 | 5.26657725
Task 7 x | 3.442507 | 2.461362 | 4.638698
Task 7 y | 5.900677 | 2.636251 | 4.49938805
Task 7 Angle | 8.31797 | 6.570614 | 6.34023535
Task 8 x | 0.902094 | 9.961924 | 4.4657543
Task 8 y | 3.366543 | 2.524759 | 4.6197108
Task 8 Angle | 0.59491 | 1.888672 | 3.3184785
Task 9 x | 8.799026 | 7.559643 | 5.23354395
Task 9 y | 3.826679 | 7.266434 | 5.28991545
Task 9 Angle | 5.293354 | 7.58754 | 5.49502805
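The Average column of the accuracy tables can be reproduced in the same manner. The sketch below (illustrative only; the list name is an assumption) recomputes the mean Task 1 angle variation across the twenty participants from the values listed above.

```python
# Recompute the cross-participant mean of the Task 1 angle variation,
# using the values listed in the accuracy tables above.
task1_angle_variation = [
    14.130518, 1.41048, 3.495705,     # Participants 1-3
    10.747589, 4.417854, 7.964401,    # Participants 4-6
    4.509575, 5.060087, 2.397989,     # Participants 7-9
    42.893341, 3.735723, 10.378162,   # Participants 10-12
    2.659973, 0.92218, 0.729692,      # Participants 13-15
    10.577271, 9.168022, 1.172787,    # Participants 16-18
    8.782784, 4.186607,               # Participants 19-20
]

mean_variation = sum(task1_angle_variation) / len(task1_angle_variation)
print(round(mean_variation, 6))  # 7.467037, matching the Task 1 Angle entry in the Average column
```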


User Study Survey Data

Scenario Preference (Participants 1 to 5)

Scenario 1:
Scenario 2: 1 1 1 1 1

Scenario 2:
Scenario 3:
Scenario 4: 1 1 1 1 1

Scenario 2: 1 1 1 1 1
Scenario 5:

Scenario 6: 1 1 1 1
Scenario 7: 1

Scenario 8: 1 1 1 1 1
Scenario 9:

Scenarios 8, 9: 1 1 1
Scenarios 6, 7: 1 1

Scenario 2:
Scenario 3:
Scenario 4: 1 1 1 1 1
Scenario 6: 1
Scenario 7:
Scenario 8: 1
Scenario 9:

Scenario Preference (Participants 6 to 10)

Scenario 1:
Scenario 2: 1 1 1 1 1

Scenario 2: 1 1
Scenario 3: 1
Scenario 4: 1 1

Scenario 2: 1 1 1 1 1
Scenario 5:

Scenario 6: 1 1
Scenario 7: 1 1

Scenario 8: 1 1 1 1
Scenario 9: 1

Scenarios 8, 9: 1 1 1 1 1
Scenarios 6, 7:

Scenario 2: 1
Scenario 3:
Scenario 4: 1
Scenario 6:
Scenario 7:
Scenario 8: 1 1 1 1
Scenario 9:

Scenario Preference (Participants 11 to 15)

Scenario 1: 1
Scenario 2: 1 1 1 1

Scenario 2: 1 1
Scenario 3:
Scenario 4: 1 1

Scenario 2: 1 1 1 1
Scenario 5:

Scenario 6: 1
Scenario 7: 1 1

Scenario 8: 1 1 1 1
Scenario 9:

Scenarios 8, 9: 1 1 1
Scenarios 6, 7: 1 1

Scenario 2:
Scenario 3:
Scenario 4: 1 1 1
Scenario 6: 1
Scenario 7: 1
Scenario 8: 1
Scenario 9:

Scenario Preference (Participants 16 to 20)

Scenario 1:
Scenario 2: 1 1 1 1 1

Scenario 2:
Scenario 3:
Scenario 4: 1 1 1 1 1

Scenario 2: 1 1 1 1 1
Scenario 5:

Scenario 6: 1 1 1 1 1
Scenario 7:

Scenario 8: 1 1 1 1 1
Scenario 9:

Scenarios 8, 9: 1 1 1 1
Scenarios 6, 7: 1

Scenario 2:
Scenario 3:
Scenario 4: 1
Scenario 6: 1
Scenario 7:
Scenario 8: 1 1 1
Scenario 9:


User Study Intuitiveness Data

Rotation Instruction | No. of Participants
Correct | 14
Incorrect, but indicated a translation | 1
Incorrect | 5

Translation Instruction | No. of Participants
Correct | 16
Incorrect | 4

Circle Translation | No. of Participants
Correct | 18
Incorrect | 2

Circle Rotation | No. of Participants
Correct | 1
Incorrect | 18
Incorrect, but indicated a rotation | 1


Pilot Study Data

Pilot Study | Participant 1 | Participant 2 | Participant 3
Manual Timer | 17990 | 14657 | 15191
SAR Timer | 9994 | 6795 | 5062

Pilot Study | Participant 4 | Participant 5 | Participant 6 | Average
Manual Timer | 6928 | 12659 | 7996 | 12570.16667
SAR Timer | 1733 | 6796 | 3199 | 5596.5

Was the Revised Visual Cue Better? | No. of Participants
Yes | 6
No | 0
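The pilot study averages follow the same pattern as the user study figures; the short sketch below (illustrative only, not part of the study software) recomputes them from the six per-participant times listed above.

```python
# Recompute the pilot study averages (milliseconds) from the six values above.
pilot_manual_ms = [17990, 14657, 15191, 6928, 12659, 7996]
pilot_sar_ms = [9994, 6795, 5062, 1733, 6796, 3199]

print(sum(pilot_manual_ms) / len(pilot_manual_ms))  # ~12570.17, matching 12570.16667 in the table
print(sum(pilot_sar_ms) / len(pilot_sar_ms))        # 5596.5, matching the table
```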
