Lecture 9: AI, magic and deception


Page 1: Lecture 9: AI, magic and deception


Lecture 9: AI, magic and deception

Adaptive Robotics 2008

Page 2: Lecture 9: AI, magic and deception


Assignment feedback

Read the question!
Try and answer the question – be explicit about the link between what you cover and how it relates to the question.
Structure your argument – e.g. with introduction and conclusion.
Start by saying what you are going to say.
Finish by reflecting back on what you have said.

Page 3: Lecture 9: AI, magic and deception


Research – Wikipedia versus published sources.
Include material from the course – show that you know and have understood it.

Page 4: Lecture 9: AI, magic and deception


Mark range 45-78%

Abstract – summarises the whole essay or journal paper.

Page 5: Lecture 9: AI, magic and deception


Abstract, e.g.:

“There have been many different approaches to robotics, two of which include the more recent behaviour-based robotics and good old fashioned AI. The characteristics of each approach vary and they both have advantages and disadvantages depending on the overall purpose of the robot. These characteristics of these two approaches will be explored and contrasted”

Page 6: Lecture 9: AI, magic and deception


“Robots in the news”

Robot play at Osaka University: “Hataraku Watashi” (I, worker)
Robots speak lines, and share stage with humans
About a housekeeping robot that loses its motivation to work
Wakamaru robot (Mitsubishi)

Page 7: Lecture 9: AI, magic and deception


Lecture 9: AI, magic and deception

Human-Robot interaction
Attempts to make humanoid robots, or convincing robot pets
“Android science” (Karl MacDorman)
Robots creating the illusion of life and animacy

Factors to exploit:
Interest in technology
Human tendency to anthropomorphism
Human tendency to zoomorphism
“Darwinian buttons”
See early examples … up to recent examples.
See some experiments on HRI to understand what affects our interactions.

Should we do this? Class discussion…

Page 8: Lecture 9: AI, magic and deception


Deception and AI
ELIZA – creating the illusion of understanding
Automata – creating the illusion of life
“Android Science” – creating robots with human appearance.

Page 9: Lecture 9: AI, magic and deception


Vaucanson’s duck

Created 1739 by Jacques Vaucanson
Appeared to eat kernels of grain, digest and defecate (but the pellets were inserted into the duck’s anus in advance)

Page 10: Lecture 9: AI, magic and deception


Chess playing automaton: The Turk

Constructed in 1769 by Baron von Kempelen for the Austrian empress Maria Theresa
Played a strong game of chess against many human opponents over an 80-year period – including Benjamin Franklin and Napoleon

Page 11: Lecture 9: AI, magic and deception


Page 12: Lecture 9: AI, magic and deception


Gakutensoku (learning from the laws of nature)
Built in 1929 by Makoto Nishimura for celebrations of the ascension of Emperor Hirohito to the throne.
Could smile, move eyes, cheeks and chest, and move a pen.
Worked by forcing compressed air through hidden rubber tubes
Seated behind a desk
People would remove their hats and pray to it.

Page 13: Lecture 9: AI, magic and deception


Westinghouse robots

1927: Roy James Wensley and Televox
New mechanism for controlling electrical substations
Previously – controller would phone worker in substation and tell them which switch to open. Worker would open switch and report back.
New idea – replacing worker with bank of relay switches that could be operated by calling them on the phone
3 tones from tuning forks directed to phone
At receiving end, tones amplified to operate relay.

Page 14: Lecture 9: AI, magic and deception


“Televox”

Wensley’s machine consisted of 2 boxes of electronics
Westinghouse publicity team – branded it a “mechanical man”
Wensley added head, body and limbs made from prop board
Story spread rapidly:
“The club woman with Televox in her home may call up at five o’clock, inquire of Televox what the temperature in the living room is, have Televox turn up the furnace, light the oven in which she has left the roast, light the lamp in the living room, and do whatever else she may wish. Televox comes near to being a scientist’s realization of a dramatist’s fantasy.” (1928)

Page 15: Lecture 9: AI, magic and deception


Page 16: Lecture 9: AI, magic and deception


Page 17: Lecture 9: AI, magic and deception


“American engineer H. J. Wensley of Westinghouse laboratories just created a robot which he named “Televox” because it follows directions remotely from voice commands or sounds of a musical instrument. The vibrations trigger an electric motor in the robot which makes it act according to the commands received.… This bewildering being is the most striking design of our mechanical time, whose creations, having neither sense nor brain, achieve a perfection that truly appears to approach the supernatural.”Vu magazine, 1928

Page 18: Lecture 9: AI, magic and deception


Other Westinghouse robots

Katrina Televox

Willie Vocalite – smoked cigarettes
Controlled by instructions spoken into telephone – different responses triggered by number of syllables

Elektro – a 7 ft walking robot, remote controlled by voice commands

Sparko – a dog for Elektro

Page 19: Lecture 9: AI, magic and deception


Willie Vocalite

Page 20: Lecture 9: AI, magic and deception


Elektro

Page 21: Lecture 9: AI, magic and deception


Page 22: Lecture 9: AI, magic and deception


Page 23: Lecture 9: AI, magic and deception


In these examples, no attempt to model humans, or animals
No attempt to make the mechanisms underlying their behaviour the same as those of humans, or animals
Aim instead is to create an illusion:
Of life
Of understanding

Also can serve to advertise a company:
E.g. Westinghouse robots
E.g. Asimo and Honda

Page 24: Lecture 9: AI, magic and deception


Factors making AI magic and deception easier

Humans have a natural tendency to anthropomorphise machines

E.g. talking to your car, or your computer, as if it could understand you, and as if it could choose to behave well or not
E.g. seeing faces in inanimate objects

Page 25: Lecture 9: AI, magic and deception


Anthropomorphism – attributing human characteristics to non-human creatures or objects

Zoomorphism – attributing animal characteristics to non-animals

Page 26: Lecture 9: AI, magic and deception


Factors making AI magic and deception easier

“Willing suspension of disbelief”
Exploited by puppeteers (see Heart Robot)
See children with their favourite teddy bear

Page 27: Lecture 9: AI, magic and deception


Heart Robot

Developed at the University of the West of England
Designed to encourage emotional responses
Part robot, part puppet – operated by an expert puppeteer
Robot appears to respond emotionally to human encounters:
- when hugged and treated gently, its limbs become limp, eyelids lower, breathing relaxes and heartbeat slows down
- if shaken or shouted at, it flinches, clenches its hand, breathing and heart rate speed up and its eyes widen in dismay

Page 28: Lecture 9: AI, magic and deception


Page 29: Lecture 9: AI, magic and deception


Factors making AI magic and deception easier

Sherry Turkle (2006) talks of how robots, or toys, that seem to need nurturing and care “push our Darwinian buttons”

Page 30: Lecture 9: AI, magic and deception


Paro and My Real Baby

Therapeutic seal and interactive doll
Turkle et al. (2006) studied elderly care-home residents’ interactions
Method: observation and conversations with technology users

Page 31: Lecture 9: AI, magic and deception


Turkle (1995) notes a tendency among both children and adults to treat computer artifacts that are minimally responsive as more intelligent than they really are:

“Very small amounts of interactivity cause us to project our own complexity onto the undeserving object”

e.g. the Tamagotchi phenomenon.

Page 32: Lecture 9: AI, magic and deception


Page 33: Lecture 9: AI, magic and deception


Kismet

Cynthia Breazeal, MIT
Sherry Turkle (2006) looked at children interacting with Kismet and Cog – found they preferred to see Kismet as something with which they could have a relationship.
They would develop elaborate explanations for Kismet’s failures to understand, or to respond appropriately.
E.g. “Kismet is too shy”, “Kismet is not feeling well”

Page 34: Lecture 9: AI, magic and deception


Human-Robot interaction

What creates an illusion of intelligence?
What kind of robot do people prefer to interact with?

Humanoid? Furry? Friendly?
Eye contact, turn taking

Page 35: Lecture 9: AI, magic and deception


Uncanny valley

Page 36: Lecture 9: AI, magic and deception


Uncanny valley

Japanese roboticist Masahiro Mori wrote about the uncanny valley in 1970.
Mori’s hypothesis states that as a robot is made more humanlike in its appearance and motion, the emotional response from a human being to the robot will become increasingly positive and empathic, until a point is reached beyond which the response quickly becomes that of strong repulsion.

Page 37: Lecture 9: AI, magic and deception


Page 38: Lecture 9: AI, magic and deception


Total Turing Test?

Ishiguro, 2006
Factors affecting our acceptance of robots as social partners
Android in booth viewed for 1 or 2 seconds:

Static android
Moving android (micro movements)
Real human

Task – check colour of cloth
80% aware of android in static condition
76.9% unaware of android in moving condition

Page 39: Lecture 9: AI, magic and deception


Factors encouraging human-robot interaction

Appearance
Movement
Emotional expression

Page 40: Lecture 9: AI, magic and deception


Factors encouraging human-robot interaction

Conversation – e.g. turn taking, nodding encouragingly
Eye contact
Contingency – responding quickly enough

Sometimes Wizard of Oz approach used
Recognizing and responding to your emotion:

Face? Voice? Body language?

Page 41: Lecture 9: AI, magic and deception


RUBI/QRIO project

UCSD (University of California, San Diego)
Studied interactions between children and QRIO robot in day care centre over 5 months

Page 42: Lecture 9: AI, magic and deception


Page 43: Lecture 9: AI, magic and deception


Measured the interactions between toddlers and robots – they interacted with QRIO more than with a toy robot or teddy bear.
The robot responded contingently – e.g. giggling when patted on the head
Part Wizard of Oz (hidden operator)

Their interactions decreased in a middle period where the robot performed a preset but elaborate dance.

Page 44: Lecture 9: AI, magic and deception


“Results indicate that current robot technology is surprisingly close to achieving autonomous bonding and socialization with human toddlers for sustained periods of time.” (Tanaka et al., 2007)

But the toddlers’ interactions were supervised and guided by other adults
Also interaction times limited (1 hour sessions)
And some remote control

Page 45: Lecture 9: AI, magic and deception


Social implications?
- making robots appear intelligent?
- making them seem to care (“I love you”)
- using them as companions

Page 46: Lecture 9: AI, magic and deception


Hello Kitty robot

Website claims: "This is a perfect robot for whoever does not have a lot time to stay with their child. Hello Kitty Robot can help you to stay with your child to keep them from being lonely."

Page 47: Lecture 9: AI, magic and deception


PaPeRo robot

Page 48: Lecture 9: AI, magic and deception


Robot companions and carers for the elderly, or the very young – are they a good thing?

Discussion in pairs, then 4s, etc., then report back advantages and disadvantages.

Page 49: Lecture 9: AI, magic and deception


“The March of the Robot Dogs” – Sparrow (2002)

Robot “pets” suggested as companions for the elderly
There are some demonstrable benefits
But: “For an individual to benefit significantly from ownership of a robot pet they must systematically delude themselves regarding the real nature of their relation with the animal.” (Sparrow, 2002)

Page 50: Lecture 9: AI, magic and deception


Living animal pets can share experiences with us.
Sparrow argues that it’s right to value our relationships with them.
But a robot is not something we can have a relationship with.
To think otherwise is to be deluded.
Morally wrong to delude old people into thinking they can have a relationship with a robot pet.
Also old people need human contact – the more robots are used in their care, the less human contact they will receive.

Page 51: Lecture 9: AI, magic and deception


Summary

Robots and illusion of animacy
Human tendencies to draw on:

Anthropomorphism
Caretaking

Tour of robots and automata:
Early – Vaucanson’s duck, Televox and Westinghouse robots
More recent – Asimo, Heart Robot, Kismet

Factors affecting illusion – Human-Robot interaction:
Emotional expressions
Physical appearance (uncanny valley)
Contingency of response

Turn taking
Eye contact

Advantages and disadvantages of robot companions

Page 52: Lecture 9: AI, magic and deception


Feedback

Content
Organisation/Argument
Writing