
Novel Interaction Paradigms & Multimodality

David McGookin

Introduction

• Existing UIs are graphical and touch/pointer focused

• In the future this will be only one solution

• Not always the best, or even possible

• What can we use to build more engaging, fun, or suitable UIs?

A Model of HCI

• Gulf of Execution (mapping intentions onto the actions the system offers)

• Gulf of Evaluation (interpreting the system's state from what it shows)

A Computer Model of the World

What’s the Difference?

Have we changed? Or just our understanding of technology?

Or had we just not figured out how to build tech for the world?

Existing Paradigms don't work (well)

• WIMP (Windows, Icons, Menus, Pointer)

WIMP & Post-WIMP

Leads to New Paradigms

• Virtual Reality (VR)

• Augmented Reality (AR)

• Tangible User Interfaces (TUIs)

• Organic User Interfaces (OUIs)

• Embedded Environments

• Wearable paradigms

Virtual Reality (VR)

• Build a virtual world that a user can feel both immersed and present in

• Old. Really old: late 1980s to early 1990s

• Not really a current research field

• Key principles:

• Virtual World

• Immersion

• Sensory Feedback

• Interactivity

Virtual World

• The world is generated from a computer model

• The computer directly manipulates that world

• There are rules for objects and features of the world, and for how the user can interact with them

• Multiple technologies: Unity, OpenGL, game engines (see the sketch below)
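A minimal sketch of the "virtual world" principle, in plain Python rather than any particular engine: the world is a computer model with objects, rules, and an update loop that responds to user actions. All names and numbers here are illustrative.

```python
# A virtual world as a computer model: objects, rules, and an update loop.
from dataclasses import dataclass

@dataclass
class Ball:
    y: float = 5.0        # height in metres
    vy: float = 0.0       # vertical velocity

GRAVITY = -9.81           # a "rule" of this world
DT = 1.0 / 60.0           # 60 Hz simulation step

def step(ball: Ball, user_pushed: bool) -> None:
    """Advance the model one tick; user input changes the world state."""
    if user_pushed:       # interactivity: the user can act on objects
        ball.vy += 4.0
    ball.vy += GRAVITY * DT
    ball.y = max(0.0, ball.y + ball.vy * DT)
    if ball.y == 0.0:     # another rule: the floor is solid
        ball.vy = 0.0

ball = Ball()
for tick in range(180):   # three simulated seconds
    step(ball, user_pushed=(tick == 60))
    # a renderer (Unity, OpenGL, a game engine) would draw the state here
print(f"ball rests at y={ball.y:.2f}")
```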

Immersion & Presence

• Immersion

• How encompassing is the experience?

• E.g. immersion in a good book

• Presence

• The sensation of actually being there in the virtual world

• Do the gaps show?

• Not necessarily just visual

Sensory Feedback

• Not just a passive experience

• Need a closed feedback loop between the environment and the user

• Actions by the user need immediate and continuous feedback

• Not necessarily just visual either

• Multisensory (audio, tactile, smell, wind, temperature, etc.)

Interactivity

• Not a passive experience: the user can manipulate the world, change it, alter it

• Because it is virtual we can go beyond the expected: magical superpowers, for example

• All of this needs high-performance computers and high-quality tracking of users

VR Technology

• Old work was for fully immersive environments

• Cave Automatic Virtual Environment (CAVE)

• Still used in military and other specific settings

• Modern examples in games and entertainment

• Much cheaper technology

• Oculus Rift, Kinect, etc.

• But two specific (non-gaming) examples follow

The Haptic Cow

• Many cows in Britain

• Produce milk and food

• Need examinations and care from veterinarians

• Veterinarians need to be trained to carry out these examinations and diagnose conditions

The Haptic Cow

• But...

(Image: (c) three feet @ flickr)

The Haptic Cow

• Many examinations are internal, via the back passage

• Many examinations are fingertip-based

• Not possible for the teacher to know what the student is doing

• E.g. are they touching the correct area, detecting the condition, etc.

The Haptic Cow

• This is a problem

• Cows cost money

• Only a limited number of exams can be done

• and only on the cows available

The Haptic Cow

• Could mean that the first time a student performs an examination is for real!

The Haptic Cow

Internal Examination Simulator

Uses a PHANTOM Device (the expensive one)

Allows various conditions to be simulated

Instructor can see what is being done

And with what force

(Image: (c) Royal Veterinary College)

PHANTOM Device

• (Since I didn't mention it earlier)

• ~25,000 Euro

The Haptic Cow

• Asked surgeons to compare accuracy

• Found to be “adequate”

• Survey of vet students found performance increased

• Quantitative comparison with real palpation

Haptic Turk

Lung-Pan Cheng, Patrick Lühne, Pedro Lopes, Christoph Sterz, and Patrick Baudisch. 2014. Haptic turk: a motion platform based on people. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '14). ACM, New York, NY, USA, 3463-3472. DOI=10.1145/2556288.2557101 http://doi.acm.org/10.1145/2556288.2557101

Virtual Reality

• Good for (certain types of) games

• Good for simulation

• Environments that are dangerous

• Environments that can be very different, or need to be different

• Cheaper than the real world

Augmented Reality

• The addition of computer-generated media into the real-world environment, allowing real objects to be augmented with virtual information

• Work started around the 1970s

• A harder problem in many ways than VR, with significant challenges

• The key requirements of VR remain

• Plus one new one

Augmented Reality

• Need to know where the user is in the environment.

• VR only needs to know that the user has moved, and in what direction.

• Hard problem

• Ground Truth

• Ubiquitous location sensing technology.

• E.g. GPS

Ground Truth

• Virtual objects are embedded in the "real world"

• There is a ground truth for where they should be

• Need to know where the user is in the world, as the reference onto which the virtual world is rendered

• No ubiquitous sensing tech to do this

• Needs multiple sensors

• Subject to error and drift

• Sensor fusion algorithms merge the data (see the sketch below)

• But we still need to deal with error
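A minimal sketch of one common sensor-fusion approach, a complementary filter, assuming two heading sources: a gyroscope (smooth but drifting) and a compass (absolute but noisy). The rates and weights here are invented for illustration.

```python
# Complementary filter: blend a fast, drifting sensor with a slow, absolute one.

def fuse_heading(prev_est, gyro_rate, compass_deg, dt, alpha=0.98):
    """alpha near 1 trusts the smooth gyro short-term; the remaining
    (1 - alpha) share of the compass slowly corrects accumulated drift."""
    gyro_est = prev_est + gyro_rate * dt       # integrate angular velocity
    return alpha * gyro_est + (1 - alpha) * compass_deg

# True heading is 0; the gyro over-reads by 0.5 deg/s (pure drift),
# while the compass reads the correct absolute value.
est = 0.0
for _ in range(200):                           # two seconds at 100 Hz
    est = fuse_heading(est, gyro_rate=0.5, compass_deg=0.0, dt=0.01)
print(f"heading estimate: {est:.2f} deg")      # drift stays bounded, not growing
```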

AR Technologies

• Smartphones were first for consumers, around 2010

• Camera, GPS, orientation sensor (combined in the sketch below)

• Steve Mann's wearable systems long predate this
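A minimal sketch of how a 2010-era smartphone AR browser could combine a GPS fix with the compass heading to decide where a geo-tagged marker belongs on screen. The bearing formula is standard; the field of view, resolution, and coordinates are assumptions.

```python
# Place a geo-tagged point of interest on the camera view from GPS + compass.
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial bearing from (lat1, lon1) to (lat2, lon2), in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def screen_x(user_lat, user_lon, heading, poi_lat, poi_lon,
             fov_deg=60, width_px=1080):
    """Horizontal pixel for the POI, or None if outside the camera's view."""
    rel = (bearing_deg(user_lat, user_lon, poi_lat, poi_lon) - heading + 540) % 360 - 180
    if abs(rel) > fov_deg / 2:
        return None
    return int((rel / fov_deg + 0.5) * width_px)

# User in central Helsinki facing due north; POI slightly north-north-east.
print(screen_x(60.1699, 24.9384, heading=0.0, poi_lat=60.1720, poi_lon=24.9395))
```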

Google Glass

• Big Hype!

• Will it see real acceptance?

• Not the only way of doing AR

DigiGraff

David K. McGookin, Stephen A. Brewster, and Georgi Christov. 2014. Studying digital graffiti as a location-based social network. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '14). ACM, New York, NY, USA, 3269-3278. DOI=10.1145/2556288.2557266 http://doi.acm.org/10.1145/2556288.2557266

Audio-AR

• Doesn't need to be visual at all

• Can use a 3D sound environment geographically correlated to the physical environment

• Useful for stories or interactive experiences

• Heads-up interaction, and for disabled users (see the sketch below)
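A minimal sketch of the audio side, assuming plain equal-power stereo panning driven by a source's bearing relative to the listener. Real systems use HRTF-based 3D audio; panning is the simplest stand-in.

```python
# Equal-power stereo panning: per-ear gains from a source's relative bearing.
import math

def stereo_gains(relative_bearing_deg):
    """Left/right gains for a source at the given bearing (0 = dead ahead)."""
    pan = max(-1.0, min(1.0, relative_bearing_deg / 90.0))  # clamp to [-1, 1]
    angle = (pan + 1.0) * math.pi / 4.0                     # 0 .. pi/2
    return math.cos(angle), math.sin(angle)                 # (left, right)

for bearing in (-90, -45, 0, 45, 90):
    left, right = stereo_gains(bearing)
    print(f"{bearing:>4} deg -> L={left:.2f} R={right:.2f}")
```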

Virtual Excavator

David McGookin, Yolanda Vazquez-Alvarez, Stephen Brewster, and Joanna Bergstrom-Lehtovirta. 2012. Shaking the dead: multimodal location based experiences for un-stewarded archaeological sites. In Proceedings of the 7th Nordic Conference on Human-Computer Interaction: Making Sense Through Design (NordiCHI '12). ACM, New York, NY, USA, 199-208. DOI=10.1145/2399016.2399048 http://doi.acm.org/10.1145/2399016.2399048

Augmented Reality

• Useful way to get virtual data into the real world

• Good for a lot of geo-tagged data

• What kinds of applications would it be good for?

Tangible User Interfaces

• Tangible Bits - Ishii & Ullmer (1997)

• We build virtual versions of real things (e.g. buttons).

• Instead, augment real things with computational powers.

• Manipulate the world through manipulating physical objects.

metaDesk

Brygg Ullmer and Hiroshi Ishii. 1997. The metaDESK: models and prototypes for tangible user interfaces. In Proceedings of the 10th annual ACM symposium on User interface software and technology (UIST '97). ACM, New York, NY, USA, 223-232. DOI=10.1145/263407.263551 http://doi.acm.org/10.1145/263407.263551

Tangible Graph Builder

David McGookin, Euan Robertson, and Stephen Brewster. 2010. Clutching at straws: using tangible interaction to provide non-visual access to graphs. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '10). ACM, New York, NY, USA, 1715-1724. DOI=10.1145/1753326.1753583 http://doi.acm.org/10.1145/1753326.1753583

I/O Brush

musicBottles

Not Only One Way

Tangible User Interfaces

• Tangible user interfaces originally used trackable tabletop surfaces (and still do)

• Markers on phicons (e.g. ARToolKit)

• Image recognition with cameras (see the sketch after this list)

• But they are not limited to tables

• More recent work has moved to “magical” properties

• Make physical things do things they can’t otherwise do

• Play with the physical / digital boundary in interesting ways.
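A minimal sketch of camera-based phicon tracking, using OpenCV's ArUco marker module (cv2.aruco from opencv-contrib-python, 4.7+ API). Each physical object carries a printed marker; the loop reports each marker's id and position on the surface.

```python
# Detect printed ArUco markers on tabletop objects with a camera.
import cv2

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture(0)                      # camera looking at the table
while True:
    ok, frame = cap.read()
    if not ok:
        break
    corners, ids, _rejected = detector.detectMarkers(frame)
    if ids is not None:
        for marker_id, quad in zip(ids.flatten(), corners):
            cx, cy = quad[0].mean(axis=0)      # marker centre in pixels
            print(f"phicon {marker_id} at ({cx:.0f}, {cy:.0f})")
    cv2.imshow("table", frame)
    if cv2.waitKey(1) == 27:                   # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```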

Tangible User Interfaces

• What do you think the advantages are over a standard GUI?

• Multiple Users?

• Understandability for Novice Users?

• Fun?

Organic User Interfaces / Shape-Changing Interfaces

• Vertegaal & Poupyrev (2008) introduced OUIs

• A consideration of e-ink displays, and how we would interact with them

• Covers the area of flexible user interfaces and displays that can mould to any shape

• E.g. car dashboards, flexible mobile devices

• Some commercial prototypes

• Nokia Kinetic

Nokia Kinetic

PaperPhone

Lahey, B., Girouard, A., Burleson, W. and R. Vertegaal. PaperPhone: Understanding the Use of Bend Gestures in Mobile Devices with Flexible Electronic Paper Displays. In Proceedings of ACM CHI’11 Conference on Human Factors in Computing Systems, ACM Press, 2011, pp. 1303-1312.

Shape-Changing UI

• Physical properties of the device change to reflect device state or notifications

• Can be visual (e.g. the display bends)

• Or in some other modality

• Shape, weight, size, etc.

• An extension of the tangible work we saw earlier (a bend-gesture sketch follows below)

• There are arguments about how these relate to tangibles
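A minimal sketch of how a PaperPhone-style device might turn a continuous bend-sensor reading into discrete bend gestures. The thresholds and normalised units are assumptions, not values from the paper.

```python
# Map a continuous bend reading to discrete gestures, firing on state changes.

def classify_bend(curvature):
    """curvature: -1.0 (full bend down) .. +1.0 (full bend up)."""
    if curvature > 0.4:
        return "bend-up"        # e.g. page forward
    if curvature < -0.4:
        return "bend-down"      # e.g. page back
    return "idle"               # small flex: treat as noise

last = "idle"
for sample in (0.05, 0.2, 0.55, 0.6, 0.3, -0.5, -0.1):
    state = classify_bend(sample)
    if state != last and state != "idle":      # fire once per gesture
        print(f"gesture: {state}")
    last = state
```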

Intimate Mobiles

Intimate Mobiles

• Hemmert et al. have done a lot of work on OUIs for mobiles

• His work considers how organic elements can inform UI design

• E.g. weight shift for direction, or kissing and blowing for emotional feedback

• Note that all of this is prototyped, with all the issues that prototyping brings for saying how it will be used!

Physical Interaction

• What are the benefits and drawbacks of this style of interaction?

• What are the issues in trying to realise it?

• What kind of places do you think it might be useful in?

Embedded Environments

• The logical extension is to directly embed interaction abilities into the environment

• Projectors on all walls

• Pervasive audio systems

• How do we interact with these, and what can we do?

• Relates to tangible sensing

• Many of the same techniques as for AR and VR

GravitySpace

Alan Bränzel, Christian Holz, Daniel Hoffmann, Dominik Schmidt, Marius Knaust, Patrick Lühne, René Meusel, Stephan Richter, and Patrick Baudisch. 2013. GravitySpace: tracking users and their poses in a smart room using a pressure-sensing floor. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '13). ACM, New York, NY, USA, 725-734. DOI=10.1145/2470654.2470757 http://doi.acm.org/10.1145/2470654.2470757

GravitySpace

• BAT - Big Ass Table

• Track users as markers

• sneaker sole

• Straws in furniture

• Infer the position of occupants (see the sketch below)

• Why?
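A minimal sketch of the underlying idea: treat the floor as a grid of pressure cells and cluster active cells into footprints, whose centroids give occupant positions. The grid, threshold, and clustering are illustrative, not GravitySpace's actual pipeline.

```python
# Cluster pressed floor cells into footprints and report their centres.

def find_footprints(grid, threshold=0.5):
    rows, cols = len(grid), len(grid[0])
    seen, centres = set(), []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] >= threshold and (r, c) not in seen:
                stack, cells = [(r, c)], []
                seen.add((r, c))
                while stack:                   # flood-fill one contiguous blob
                    y, x = stack.pop()
                    cells.append((y, x))
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] >= threshold
                                and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                cy = sum(y for y, _ in cells) / len(cells)
                cx = sum(x for _, x in cells) / len(cells)
                centres.append((cy, cx))       # centroid of the footprint
    return centres

floor = [[0, 0, 0, 0, 0],
         [0, 1, 1, 0, 0],                      # a shoe sole
         [0, 0, 0, 0, 1],                      # a furniture leg
         [0, 0, 0, 0, 1]]
print(find_footprints(floor))                  # [(1.0, 1.5), (2.5, 4.0)]
```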

Wearables

• If computation is disappearing into the walls, then what of the mobile phone?

• Two ways to think

• Near-term

• Long-term

Near-term

• Smartwatches, Google Glass, Fitbits, Nike+

• What do you think of these?

• What are they good for?

Duet

Xiang 'Anthony' Chen, Tovi Grossman, Daniel J. Wigdor, and George Fitzmaurice. 2014. Duet: exploring joint interactions on a smart phone and a smart watch. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '14). ACM, New York, NY, USA, 159-168. DOI=10.1145/2556288.2556955 http://doi.acm.org/10.1145/2556288.2556955

Long-term

• Computation is embedded; we can remotely sense things.

• Clothing, injections, implants

• Can lead to using the body as an input or computation device.

• To store data

• For input and output.

Google Smart Contact Lens

BodySpace

Skinput

Chris Harrison, Desney Tan, and Dan Morris. 2010. Skinput: appropriating the body as an input surface. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '10). ACM, New York, NY, USA, 453-462. DOI=10.1145/1753326.1753394 http://doi.acm.org/10.1145/1753326.1753394

Wearables

• What is your view of wearable computing?

• What is it for?

• What are the issues in adoption?

Some Takeaways

• We've indirectly covered multimodal and multisensory interaction

• Gestures, audio, and haptics become more important in supporting interaction

• There is a disconnect between the sensory system and our ability to stimulate it

• More fundamental work is needed to understand both the sensory system and our ability to stimulate it

Disclaimer

Smell

• Something we are working on at Aalto

• Trying to understand how we can employ scent in interaction

• Involved in episodic memory

• Provides an emotional response

• But very little is understood about how to stimulate it

Conclusions

• We've tried to cover a number of paradigms of emerging user interfaces

• Some have been around longer than others

• All meet at some point.

• We've not really covered how these are implemented

• But most can be done with regular technology