Magic Bench — A Multi-User & Multi-Sensory AR/MR Platform

Kyna McIntosh, John Mars, James Krahe, Jim McCann, Alexander Rivera, Jake Marsico, Ali Israr, Shawn Lawson, Moshe Mahler
Disney Research

Figure 1: A selection of screenshots from a Magic Bench demo.

ABSTRACT

Mixed Reality (MR) and Augmented Reality (AR) create exciting opportunities to engage users in immersive experiences, resulting in natural human-computer interaction. Many MR interactions are generated around a first-person Point of View (POV). In these cases, the user's view is directed to the environment, which is digitally displayed either through a head-mounted display or a handheld computing device. One drawback of such conventional AR/MR platforms is that the experience is user-specific. Moreover, these platforms require the user to wear and/or hold an expensive device, which can be cumbersome and alter interaction techniques.

We create a solution for multi-user interactions in AR/MR, where a group can share the same augmented environment with any computer-generated (CG) asset and interact in a shared story sequence through a third-person POV. Our approach is to instrument the environment, leaving the user unburdened of any equipment and creating a seamless walk-up-and-play experience. We demonstrate this technology in a series of vignettes featuring humanoid animals. Participants can not only see and hear these characters, they can also feel them on the bench through haptic feedback. Many of the characters also interact with users directly, either through speech or touch. In one vignette an elephant hands a participant a glowing orb. This demonstrates HCI in its simplest form: a person walks up to a computer, and the computer hands the person an object.
CCS CONCEPTS

• Human-centered computing → Mixed / augmented reality; Haptic devices; • Computing methodologies → Mixed / augmented reality;

KEYWORDS

Mixed Reality, Augmented Reality, Haptics, Real-time Compositing, Immersive Experiences

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author(s).
SIGGRAPH 2017, Los Angeles, CA, USA
© 2017 Copyright held by the owner/author(s). 123-4567-24-567/17/06...$15.00
DOI: 10.475/123 4

1 IMPLEMENTATION

We create a 3D reconstruction of a scene using a combination of the depth and color sensors on an off-the-shelf Microsoft Kinect. To do this, we draw polygons using each point in the point cloud as a vertex, creating the appearance of a solid mesh. The mesh is then aligned to the RGB camera feed of the scene from the same Kinect. This alignment gives the mesh color and completes a 3D reconstructed video feed.

There are several problems that arise with the 3D reconstructed feed. First, the monocular feed creates "depth shadows" in areas where there is no direct line-of-sight to the depth sensor. Second, the depth camera is laterally offset from the RGB camera (since they cannot physically occupy the same space), and the two cameras therefore have slightly different viewing angles, creating further depth shadowing. The resulting data feed is sparse and cannot represent the whole scene (see Figure 3). To solve this, we align the 3D depth feed with the 2D RGB feed from the Kinect. By compositing the depth feed over a 2D backdrop, the system effectively masks these depth shadows, creating a seamless composite that can then be populated with 3D CG assets.

This mixed reality platform centers around the simple setting of a bench.
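The depth-shadow masking described above amounts to a per-pixel choice: where the aligned depth feed has data, show the 3D reconstruction; where it does not, fall back to a pre-captured 2D backdrop of the empty scene. The paper does not give an implementation, so the following NumPy sketch is illustrative; the function and array names are our own.

```python
import numpy as np

def composite_over_backdrop(recon_rgb, depth, backdrop_rgb):
    """Fill depth shadows by falling back to a 2D backdrop plate.

    recon_rgb    : (H, W, 3) colors of the 3D-reconstructed feed
    depth        : (H, W) raw depth; 0 where the sensor has no data
    backdrop_rgb : (H, W, 3) pre-captured image of the empty scene
    """
    valid = depth > 0                # True where real geometry was seen
    mask = valid[..., None]          # broadcast the mask over RGB channels
    return np.where(mask, recon_rgb, backdrop_rgb)
```

In the real system this selection happens inside the game engine during compositing, but the effect is the same: depth shadows vanish behind the backdrop, and CG assets can then be layered in front of or behind the reconstructed occupants.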
The bench works in a novel way to constrain a few problems, such as identifying where a user is and subsequently inferring the direction of the user's gaze (i.e., toward the screen). It creates a stage with a foreground and background, with the bench occupants in the middle ground. The bench also acts as a controller; the mixed reality experience won't trigger until at least one person is detected sitting on the bench. Further, different seating formations on the bench trigger different experiences.

Magic Bench is a custom software and custom hardware platform, necessitating a solution to bridge both aspects. Between the two exists a series of patches created in Cycling '74 Max, designed to convert signals sent from the game engine (via OSC) about the positions and states of objects in the scene into the haptic sensations felt on the bench.

Haptic actuators are dynamically driven based on the location of animated content. The driving waveform for each actuator is designed according to the desired feel: in the current setup we can tweak base frequency, frequency of modulation, general amplitude, amplitude envelope, and three-dimensional position.
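The game-engine-to-Max bridge rides on OSC, a simple binary message format over UDP. As an illustration of the kind of packet involved, here is a minimal OSC message encoder for a character's 3D position using only the Python standard library. The address pattern `/character/pos` is hypothetical (the paper does not publish its message schema), and in practice a library such as python-osc, or Max's own `udpreceive`, handles this framing.

```python
import struct

def _osc_pad(b: bytes) -> bytes:
    """Null-pad to the 4-byte boundary OSC requires (at least one null)."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_position_message(address: str, x: float, y: float, z: float) -> bytes:
    """Encode a minimal OSC message carrying a 3D position.

    Mirrors the kind of signal the game engine could send to the Max
    patches: an address pattern plus three float32 arguments.
    """
    msg = _osc_pad(address.encode("ascii"))
    msg += _osc_pad(b",fff")                # type tags: three floats follow
    msg += struct.pack(">fff", x, y, z)     # big-endian float32 payload
    return msg
```

A Max patch listening on the matching UDP port would unpack these floats and map them onto actuator drive parameters.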


Figure 2: Flowchart of the Magic Bench installation.

Figure 3: Reconstruction of the scene within the game en-gine.

These parameters can be manually tuned and/or adjusted in real time.
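A drive signal built from those parameters (base frequency, modulation frequency, amplitude, envelope) can be sketched as an amplitude-modulated sine, with an extra gain term tying loudness to the 3D distance between an actuator and the animated content. The defaults and the falloff model below are illustrative assumptions, not the paper's actual values.

```python
import math

def haptic_sample(t, base_hz=250.0, mod_hz=4.0, amp=1.0,
                  envelope=lambda t: 1.0):
    """One sample of an actuator drive signal at time t (seconds).

    base_hz  : carrier frequency of the vibration
    mod_hz   : slower amplitude modulation laid over the carrier
    amp      : general amplitude
    envelope : time-varying gain, e.g. a fade-in/fade-out
    """
    carrier = math.sin(2 * math.pi * base_hz * t)
    modulation = 0.5 * (1 + math.sin(2 * math.pi * mod_hz * t))  # in [0, 1]
    return amp * envelope(t) * modulation * carrier

def actuator_gain(actuator_pos, content_pos, falloff=0.5):
    """Attenuate amplitude with distance (meters) from the animated content."""
    d = math.dist(actuator_pos, content_pos)
    return 1.0 / (1.0 + d / falloff)
```

Sampling `haptic_sample` at audio rate and scaling it by `actuator_gain` for each actuator yields a signal that follows a character as it moves along the bench.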

2 INSTALLATION OPTIONS

This piece can run as a traditional VR Village installation or as an autonomous piece in an unsuspecting area at SIGGRAPH: imagine sitting on a bench to rest your feet or check your email; in front of you is a screen showing a SIGGRAPH showreel. Once the system detects you, the content switches to a video feed of you, creating a mirror effect. From there, an unexpected AR experience unfolds.
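The walk-up trigger is a small state machine: attract content (the showreel) while the bench is empty, the mirror feed once at least one person is detected seated. A minimal sketch, assuming the sensor pipeline exposes a seated-person count (the states and function names here are our own):

```python
from enum import Enum, auto

class Mode(Enum):
    ATTRACT = auto()   # showreel plays while the bench is empty
    MIRROR = auto()    # live video feed once someone sits down

def next_mode(mode: Mode, seated_count: int) -> Mode:
    """Switch content based on how many people the sensor sees seated."""
    if mode is Mode.ATTRACT and seated_count >= 1:
        return Mode.MIRROR
    if mode is Mode.MIRROR and seated_count == 0:
        return Mode.ATTRACT
    return mode
```

The same mechanism extends to the seating-formation triggers mentioned earlier, by branching on where people sit rather than only on how many.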
