Magic Bench — A Multi-User & Multi-Sensory AR/MR Platform

Kyna McIntosh, John Mars, James Krahe, Jim McCann, Alexander Rivera, Jake Marsico, Ali Israr, Shawn Lawson, Moshe Mahler
Disney Research
Figure 1: A selection of screenshots from a Magic Bench demo.
ABSTRACT

Mixed Reality (MR) and Augmented Reality (AR) create exciting opportunities to engage users in immersive experiences, resulting in natural human-computer interaction. Many MR interactions are generated around a first-person Point of View (POV). In these cases, the user's attention is directed to the environment, which is digitally displayed either through a head-mounted display or a handheld computing device. One drawback of such conventional AR/MR platforms is that the experience is user-specific. Moreover, these platforms require the user to wear and/or hold an expensive device, which can be cumbersome and can alter interaction techniques.
We create a solution for multi-user interactions in AR/MR, where a group can share the same augmented environment with any computer-generated (CG) asset and interact in a shared story sequence through a third-person POV. Our approach is to instrument the environment, leaving the user unburdened of any equipment and creating a seamless walk-up-and-play experience. We demonstrate this technology in a series of vignettes featuring humanoid animals. Participants can not only see and hear these characters, they can also feel them on the bench through haptic feedback. Many of the characters also interact with users directly, either through speech or touch. In one vignette, an elephant hands a participant a glowing orb. This demonstrates HCI in its simplest form: a person walks up to a computer, and the computer hands the person an object.
CCS CONCEPTS

• Human-centered computing → Mixed / augmented reality; Haptic devices; • Computing methodologies → Mixed / augmented reality;
KEYWORDS

Mixed Reality, Augmented Reality, Haptics, Real-time Compositing, Immersive Experiences
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author(s).
SIGGRAPH 2017, Los Angeles, CA, USA
© 2017 Copyright held by the owner/author(s). 123-4567-24-567/17/06. . . $15.00
DOI: 10.475/123 4
1 IMPLEMENTATION

We create a 3D reconstruction of a scene using a combination of the depth and color sensors on an off-the-shelf Microsoft Kinect. To do this, we draw polygons using each point in the point cloud as a vertex, creating the appearance of a solid mesh. The mesh is then aligned to the RGB camera feed of the scene from the same Kinect. This alignment gives the mesh color and completes a 3D reconstructed video feed.
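The reconstruction step above can be sketched as follows: back-project each depth pixel into a 3D vertex, then connect neighboring pixels into triangles so the point cloud reads as a solid mesh. This is a minimal illustration, not the paper's implementation; the camera intrinsics are placeholder Kinect-like values.

```python
import numpy as np

def depth_to_mesh(depth, fx=365.0, fy=365.0, cx=256.0, cy=212.0):
    """Back-project a depth image into 3D vertices and triangulate
    neighboring pixels into a solid-looking mesh. Intrinsics are
    illustrative placeholders, not values from the paper."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.astype(np.float64)
    x = (u - cx) * z / fx                    # pinhole back-projection
    y = (v - cy) * z / fy
    verts = np.stack([x, y, z], axis=-1).reshape(-1, 3)

    # Two triangles per 2x2 pixel quad; skip quads touching invalid depth.
    idx = np.arange(h * w).reshape(h, w)
    a, b = idx[:-1, :-1].ravel(), idx[:-1, 1:].ravel()
    c, d = idx[1:, :-1].ravel(), idx[1:, 1:].ravel()
    valid = ((depth[:-1, :-1] > 0) & (depth[:-1, 1:] > 0) &
             (depth[1:, :-1] > 0) & (depth[1:, 1:] > 0)).ravel()
    tris = np.concatenate([np.stack([a, b, c], -1)[valid],
                           np.stack([b, d, c], -1)[valid]])
    return verts, tris
```

Triangulating only quads whose four depth samples are valid is what leaves holes ("depth shadows") where the sensor has no data, which motivates the compositing step described next in the text.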
There are several problems that arise with the 3D reconstructed feed. First, the monocular feed creates "depth shadows" in areas where there is no direct line-of-sight to the depth sensor. Second, the depth camera is laterally offset from the RGB camera (since they cannot physically occupy the same space), so the two have slightly different viewing angles, creating further depth shadowing. The resulting data feed is sparse and cannot represent the whole scene (see Figure 3). To solve this, we align the 3D depth feed with the 2D RGB feed from the Kinect. By compositing the depth feed over a 2D backdrop, the system effectively masks these depth shadows, creating a seamless composite that can then be populated with 3D CG assets.
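The masking idea reduces to a per-pixel choice: wherever the depth sensor returned no data, show the pre-captured 2D backdrop instead of the reconstruction. A minimal sketch of that choice (the paper performs this compositing inside the game engine):

```python
import numpy as np

def composite(rgb, depth, backdrop):
    """Mask depth shadows: pixels with valid depth show the 3D
    reconstruction's color; pixels with no depth data fall back to a
    pre-captured 2D backdrop. Illustrative, not the engine code."""
    mask = (depth > 0)[..., None]        # valid-depth pixels, broadcast over RGB
    return np.where(mask, rgb, backdrop)
```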
This mixed reality platform centers around the simple setting of a bench. The bench works in a novel way to constrain a few problems, such as identifying where a user is and subsequently inferring the direction of the user's gaze (i.e., toward the screen). It creates a stage with a foreground and background, with the bench occupants in the middle ground. The bench also acts as a controller; the mixed reality experience won't trigger until at least one person is detected sitting on the bench. Further, different seating formations on the bench trigger different experiences.
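The bench-as-controller logic can be sketched as a simple mapping from occupancy to content: idle until someone sits, then a seating formation selects a vignette. The zone indices and vignette names below are hypothetical; the paper does not enumerate its formations.

```python
def select_vignette(occupied_zones):
    """Map bench occupancy to content. `occupied_zones` is a set of
    seat indices (0=left, 1=middle, 2=right). Names are illustrative
    placeholders, not the actual vignettes' identifiers."""
    if not occupied_zones:
        return None                 # nobody seated: experience stays idle
    if occupied_zones == {1}:
        return "solo_middle"        # a lone sitter in the middle seat
    if len(occupied_zones) >= 2:
        return "group_scene"        # two or more occupants
    return "solo_side"              # a lone sitter on either end
```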
Magic Bench is a custom software and custom hardware platform, necessitating a solution to bridge both aspects. Between the two exists a series of patches created in Cycling '74 Max, designed to convert signals sent from the game engine (via OSC) about the positions and states of objects in the scene into the haptic sensations felt on the bench.
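The bridge between engine and Max is plain OSC over the network. As a sketch of what those messages look like on the wire, the following hypothetical encoder packs an address pattern and float arguments into OSC's 4-byte-aligned binary format (the address is an invented example; a real setup would typically use an OSC library rather than hand-encoding):

```python
import struct

def osc_message(address, *floats):
    """Encode a minimal OSC message (address pattern plus float32
    arguments), as the game engine might send to the Max patches.
    Simplified for illustration: only 'f'-typed arguments."""
    def pad(b):
        # OSC strings are null-terminated and padded to 4-byte boundaries.
        return b + b"\x00" * (4 - len(b) % 4)
    msg = pad(address.encode())
    msg += pad(("," + "f" * len(floats)).encode())   # type tag string
    for f in floats:
        msg += struct.pack(">f", f)                  # big-endian float32
    return msg
```

Such a datagram would then be sent over UDP to the port the Max patch listens on, carrying e.g. a character's position so the patch can drive the matching actuator.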
Haptic actuators are dynamically driven based on the location of animated content. The driving waveform for each actuator is designed according to the desired feel — in the current setup we can tweak base frequency, frequency of modulation, general amplitude, amplitude envelope, and three-dimensional position.
Figure 2: Flowchart of the Magic Bench installation.
Figure 3: Reconstruction of the scene within the game en-gine.
These parameters can be manually tuned and/or adjusted in real time.
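A minimal sketch of how those parameters could combine into an actuator drive signal, assuming a sinusoidal carrier with sinusoidal amplitude modulation and a distance-falloff gain for the 3D position (the functional forms and default values are assumptions; the paper gives none):

```python
import numpy as np

def haptic_waveform(t, base_hz=150.0, mod_hz=4.0, amplitude=1.0, envelope=None):
    """Amplitude-modulated drive signal for one actuator, exposing the
    parameters listed in the text: base frequency, modulation frequency,
    general amplitude, and an optional amplitude envelope (a callable
    over time). Defaults are illustrative, not values from the paper."""
    carrier = np.sin(2 * np.pi * base_hz * t)
    mod = 0.5 * (1 + np.sin(2 * np.pi * mod_hz * t))   # modulator in [0, 1]
    env = envelope(t) if envelope else 1.0
    return amplitude * env * mod * carrier

def actuator_gain(content_pos, actuator_pos, radius=0.5):
    """Map the animated content's 3D position to a per-actuator gain
    via linear distance falloff (an assumed falloff, for illustration)."""
    d = np.linalg.norm(np.asarray(content_pos, float) - np.asarray(actuator_pos, float))
    return max(0.0, 1.0 - d / radius)
```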
2 INSTALLATION OPTIONS

This piece can run as a traditional VR Village installation or as an autonomous piece in an unsuspecting area at SIGGRAPH — imagine sitting on a bench to rest your feet or check your email; in front of you is a screen showing a SIGGRAPH showreel. Once the system detects you, the content switches to a video feed of you, creating a mirror effect. From there, an unexpected AR experience unfolds.