Restaurant Scenario
Ryan Lapp and Mike Smith, StudioNEXT @ UArts (Eye Tracker)


Page 1: Restaurant Scenario

Restaurant Scenario
Ryan Lapp and Mike Smith
StudioNEXT @ UArts (Eye Tracker)

Page 2: Restaurant Scenario

Many modern restaurants strive to create a unique and memorable experience to attract consumers and develop customer loyalty. After researching consumer behavior in a restaurant setting, Team Iris has determined a number of opportunities where eye tracking technology can be implemented to improve the dining experience. These opportunities include menu navigation, menu item identification, flagging down the server, and simple retrieval of special information.

Page 3: Restaurant Scenario

Customer Profile: Nick, Young Professional

College: B.S. Marketing
Home: Burbank, CA
Interests: Playing with his dog, skateboarding, snowboarding, BMX, film, poker, the gym
Dining Expectations: Looking to go out with friends to a fun place with great food and drink

Page 4: Restaurant Scenario

[Figure: Mock-up of the projected menu, showing section tabs for Breakfast, Lunch, Dinner, and Dessert, sample items such as Eggs and Toast, and a FINALIZE button.]

Menu Navigation

Function: Menu Navigation

Benefits: Exciting new way to interact with a menu. Intuitively adapts to the user's natural tendencies.

Explanation of Function: A menu projected onto the table can be explored using only gaze tracking.

Physical Form: Infrared-emitting cameras are located at the center of the table. The number of cameras corresponds with the number of place settings. Eye tracking information is processed at a nearby computer, and the dynamic menu is projected onto the surface of the table through a projector mounted on the ceiling above the table.

Technology (Existing or Proposed?): Existing

Activation: The menu is opened when the user looks directly into the camera or at other target areas on the table. This provides an opportunity for initial calibration.

Feedback: The feedback of this function presents itself when the relevant information is projected on the table. For example, if the user fixates on the "Soups and Salads" section of the menu, that section would expand to reveal the various menu items available for that section. As the user reads the description for one of the menu items, a large picture of that item is displayed next to the text. This picture changes appropriately as the user reads other item descriptions. The user will always have the option to focus on another section of the menu simply by fixating on the title of that section. (A code sketch of this interaction follows the table.)

User Isolation: Sitting directly in front of the camera isolates the user. The eye tracking software can be programmed to ignore other pairs of eyes in the background.

Issues: Constant recalibration would be needed since the user's head is not fixed in relation to the camera and menu. Testing will be required to refine the menu navigation process.
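To make the Feedback row concrete, here is a minimal Python sketch of the gaze-to-menu interaction, assuming a simple dwell-time rule for fixations, rectangular on-table regions, and a print-based stand-in for the projector. None of these names or thresholds come from the scenario itself; they are illustrative assumptions.

```python
# Illustrative sketch of gaze-driven menu navigation; not a real eye-tracking API.
from dataclasses import dataclass
from typing import Optional

FIXATION_MS = 400  # assumed dwell time that counts as a fixation


@dataclass
class Region:
    """A rectangular target on the projected menu (a section title or an item)."""
    name: str
    kind: str            # "section" or "item"
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h


class ProjectedMenu:
    """Stand-in for the ceiling projector: expands sections and shows item pictures."""

    def __init__(self):
        self.expanded: Optional[str] = None
        self.picture: Optional[str] = None

    def expand_section(self, section: str) -> None:
        if section != self.expanded:
            self.expanded = section
            print(f"[projector] expand section: {section}")

    def show_picture(self, item: str) -> None:
        if item != self.picture:
            self.picture = item
            print(f"[projector] show picture of: {item}")


class GazeMenuController:
    """Turns raw gaze samples into menu actions once a fixation is detected."""

    def __init__(self, regions, menu):
        self.regions = regions
        self.menu = menu
        self.current: Optional[Region] = None
        self.entered_at: float = 0.0

    def on_gaze(self, t_ms: float, x: float, y: float) -> None:
        region = next((r for r in self.regions if r.contains(x, y)), None)
        if region is not self.current:
            # gaze moved to a new region (or off the menu); restart the dwell timer
            self.current, self.entered_at = region, t_ms
            return
        if region and t_ms - self.entered_at >= FIXATION_MS:
            if region.kind == "section":
                self.menu.expand_section(region.name)
            else:
                self.menu.show_picture(region.name)


if __name__ == "__main__":
    regions = [
        Region("Soups and Salads", "section", 0, 0, 10, 2),
        Region("French Onion Soup", "item", 0, 3, 10, 2),
    ]
    controller = GazeMenuController(regions, ProjectedMenu())
    # simulated gaze samples: dwell on the section title, then on an item description
    for t, x, y in [(0, 5, 1), (200, 5, 1), (450, 5, 1), (900, 5, 4), (1400, 5, 4)]:
        controller.on_gaze(t, x, y)
```

The dwell-time check is one simple way to keep a passing glance from expanding a section; the actual threshold would have to come out of the testing mentioned under Issues.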

Page 5: Restaurant Scenario

[Figure: A diner at one table thinks, "I don't know what to eat. What are they eating at that table?" and fixates on an RFID-tagged dish at another table. The matching menu item then appears as a projection on the diner's own table: "That looks good! I think I'll try that!"]

Menu Item Identification

Function: RFID for food and drink

Benefits: Exciting new way to interact with the restaurant environment. Intuitively adapts to the user's natural tendencies. Brings relevant information to a user, increasing sales.

Explanation of Function: Special information is displayed on the projected menu when a user fixates on actual menu items (dishes and beverages) located on any table or bar surface. The system must detect which plate or glass the user is fixating upon among all available options. The system must detect the origin of the gaze and display information on the correct table.

Physical Form: The system uses the same cameras located at the center of the table to estimate the object of a user's fixation. If the user fixates on a dish or beverage, the menu information for that product is projected onto the user's table. The unique identity of each dish or beverage is determined through the use of RFID tags. Each dish or glass is equipped with its own RFID tag, and once a dish or drink is made, the item's RFID tag is matched with the identity of that menu item. When the item is brought to the table or bar, the item is automatically scanned using RFID scanners built into the table or bar. This identity is used to provide the correct information to a user's table when that user fixates on that item. (This flow is sketched in code after the table.)

Technology (Existing or Proposed?): Proposed. This function will require each dish or drink to be updated individually with that item's identity information. It will also require every table and bar to scan nearby items periodically to keep the database of items in the restaurant updated.

Feedback: The feedback of this function presents itself when the relevant information is projected on the table.

User Isolation: Sitting directly in front of the camera isolates the user. The eye tracking software can be programmed to ignore other pairs of eyes in the background.

Issues: Assumes that table-mounted cameras are able to estimate the object of a user's fixation based on the user's gaze even though the user is not looking at the surface of the table. Would not function correctly if the user's eyes are not visible from the table camera (e.g., the user is looking over his or her shoulder).
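As a rough illustration of the proposed RFID flow, the Python sketch below keeps two in-memory mappings: tag to menu item (written when a dish or drink is made) and tag to location (written when a table or bar scans nearby tags). All tag IDs, table names, and function names are hypothetical; real scanner hardware and the restaurant-wide database are out of scope here.

```python
# Hedged sketch of the proposed RFID-based item identification flow.
class ItemRegistry:
    def __init__(self):
        self.tag_to_item = {}   # RFID tag id -> menu item name (set in the kitchen)
        self.tag_location = {}  # RFID tag id -> table/bar where it was last scanned

    def register_dish(self, tag_id: str, menu_item: str) -> None:
        """Called when a dish or drink is made and its tag is matched to a menu item."""
        self.tag_to_item[tag_id] = menu_item

    def scan_at(self, location: str, tag_ids) -> None:
        """Called periodically by the RFID scanner built into each table or bar."""
        for tag in tag_ids:
            self.tag_location[tag] = location

    def items_at(self, location: str):
        """Menu items currently detected at a given table or bar."""
        return [self.tag_to_item[tag]
                for tag, loc in self.tag_location.items()
                if loc == location and tag in self.tag_to_item]


def on_dish_fixation(registry: ItemRegistry, gazing_table: str, target_location: str) -> None:
    """A user at `gazing_table` fixated on a dish located at `target_location`;
    project that item's menu information back onto the gazing user's table."""
    for item in registry.items_at(target_location):
        print(f"[projector @ {gazing_table}] menu info for: {item}")


if __name__ == "__main__":
    registry = ItemRegistry()
    registry.register_dish("tag-001", "Short Rib Tacos")   # matched in the kitchen
    registry.scan_at("table-12", ["tag-001"])              # scanned on arrival at the table
    on_dish_fixation(registry, gazing_table="table-7", target_location="table-12")
```

In a deployment, the dictionaries would be replaced by the shared item database the table describes, kept current by the periodic table and bar scans.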


Page 6: Restaurant Scenario

[Figure: Illustration of a diner glancing around for the waitress.]

Flagging Down the Server

Function: Flagging down the server

Benefits: Intuitively adapts to the user's natural tendencies. Improves customer service and the user experience, increasing customer loyalty and restaurant reputation.

Explanation of Function: The server is notified when a user is looking for him or her and which table the request is coming from. The system must detect the origin of the gaze and relay information about the correct table.

Physical Form: Infrared-emitting cameras are located at fixed areas of interest (e.g., the kitchen door or service station). When a user exhibits a pattern of glancing at these fixed trigger areas, the server is notified that the user is requesting attention and which table the request is coming from. Alternatively, the system uses the same cameras located at the center of the table to estimate the object of a user's fixation.

Technology (Existing or Proposed?): Existing; Proposed

Activation: This function is always able to detect gazes. With the first method, the detecting cameras always have a fixed perspective, so they can easily estimate which table to display information on. The function executes after a significant pattern of gazes is detected (e.g., the user looks over at the kitchen door, then over at the service station). (A sketch of this pattern detection follows the table.)

Feedback: Ideally, the server would provide feedback by offering assistance shortly after being alerted to the user's interest.

User Isolation: In the first method, users are indirectly isolated based on where they are sitting; the detecting cameras always have a fixed perspective, so they can easily estimate which table to display information on. In the second method, sitting directly in front of the camera isolates the user, and the eye tracking software can be programmed to ignore other pairs of eyes in the background.

Issues: The second method assumes that table-mounted cameras are able to estimate the object of a user's fixation based on the user's gaze even though the user is not looking at the surface of the table. It would not function correctly if the user's eyes are not visible from the table camera (e.g., the user is looking over his or her shoulder).
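One way to make "a significant pattern of gazes" concrete is sketched below in Python. The two-distinct-areas rule, the ten-second window, and the notify callback are assumptions chosen for the example; the scenario itself leaves the exact pattern open.

```python
# Illustrative sketch of the glance-pattern trigger for flagging down a server.
import time
from collections import deque

TRIGGER_AREAS = {"kitchen_door", "service_station"}
WINDOW_S = 10.0      # assumed: glances must fall within this many seconds
AREAS_NEEDED = 2     # assumed: e.g., kitchen door, then service station


class ServerFlagger:
    def __init__(self, notify):
        self.notify = notify   # callable taking the table id to alert the server
        self.glances = {}      # table id -> deque of (timestamp, area)

    def on_glance(self, table_id: int, area: str, t: float = None) -> None:
        if area not in TRIGGER_AREAS:
            return
        t = time.time() if t is None else t
        history = self.glances.setdefault(table_id, deque())
        history.append((t, area))
        # forget glances that fall outside the time window
        while history and t - history[0][0] > WINDOW_S:
            history.popleft()
        # "significant pattern": recent glances at enough distinct trigger areas
        if len({a for _, a in history}) >= AREAS_NEEDED:
            self.notify(table_id)
            history.clear()


if __name__ == "__main__":
    flagger = ServerFlagger(notify=lambda table: print(f"server alerted: table {table}"))
    flagger.on_glance(4, "kitchen_door", t=0.0)
    flagger.on_glance(4, "service_station", t=3.5)   # second distinct area: triggers the alert
```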

Page 7: Restaurant Scenario

Other Functions

Function: Drink and meal specials, dessert displays

Benefits: Exciting new way to interact with the restaurant environment. Intuitively adapts to the user's natural tendencies. Brings relevant information to a user, increasing sales. Allows the user to easily read information that may be difficult to read from across the room.

Explanation of Function: Special information is displayed on the projected menu when a user fixates on fixed trigger areas placed around the restaurant. The system must detect the origin of the gaze and display information on the correct table. (A code sketch of this flow follows the table.)

Physical Form: Infrared-emitting cameras are located at fixed areas of interest (e.g., food and drink specials boards, dessert displays). The number of cameras depends on the desired detection angle (a central dessert display that can be viewed from 360 degrees would require more cameras than a specials board mounted on the wall that can be viewed from one side only). When a user fixates on one of these areas, relevant information (contents of special menus, the dessert menu) is projected onto the user's table. Alternatively, the system uses the same cameras located at the center of the table to estimate the object of a user's fixation; if the user fixates on a specials board or dessert display, the appropriate information is projected onto the user's table.

Technology (Existing or Proposed?): Existing; Proposed

Activation: This function is always able to detect gazes.

Feedback: The feedback of this function presents itself when the relevant information is projected on the table.

User Isolation: In the first method, users are indirectly isolated based on where they are sitting; the detecting cameras always have a fixed perspective, so they can easily estimate which table to display information on. In the second method, sitting directly in front of the camera isolates the user, and the eye tracking software can be programmed to ignore other pairs of eyes in the background.

Issues: The second method assumes that table-mounted cameras are able to estimate the object of a user's fixation based on the user's gaze even though the user is not looking at the surface of the table. It would not function correctly if the user's eyes are not visible from the table camera (e.g., the user is looking over his or her shoulder).
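The first method reduces to a lookup from trigger area to content, plus knowledge of which table the gaze originated from. The Python sketch below illustrates that flow; the area names, menu content, and projector stand-in are invented for the example.

```python
# Sketch of the fixed-trigger-area flow for specials and dessert displays.
SPECIAL_CONTENT = {
    "specials_board": ["Drink special: house sangria", "Meal special: catch of the day"],
    "dessert_display": ["Chocolate torte", "Key lime pie", "Seasonal sorbet"],
}


def project_to_table(table_id: int, lines) -> None:
    """Stand-in for the ceiling projector that renders onto the user's table."""
    print(f"[projector @ table {table_id}]")
    for line in lines:
        print(f"  {line}")


def on_area_fixation(area: str, table_id: int) -> None:
    """The fixed camera at `area` detected a fixation originating from `table_id`."""
    content = SPECIAL_CONTENT.get(area)
    if content:
        project_to_table(table_id, content)


if __name__ == "__main__":
    on_area_fixation("dessert_display", table_id=7)
```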

Page 8: Restaurant Scenario

Comparable Services: Inamo

Features: Interactive menu projected onto the table; touchscreen interface

Page 9: Restaurant Scenario

Comparable Services: uWink

Features: Interactive menus on individual monitors; touchscreen interface