An Extensible Platform for Interactive, Entertaining Social Experiences with an Animatronic Character

Sabrina Haskell
Entertainment Technology Center
Carnegie Mellon University
700 Technology Dr., Rm. 5305
Pittsburgh, PA 15219

Andrew Hosmer
Entertainment Technology Center
Carnegie Mellon University
700 Technology Dr., Rm. 5305
Pittsburgh, PA 15219

Eugenia Leu
Entertainment Technology Center
Carnegie Mellon University
700 Technology Dr., Rm. 5305
Pittsburgh, PA 15219

ABSTRACT

The fields of human-robot interaction and entertainment robotics are truly interdisciplinary, combining the best of computer science, psychology, and mechanical engineering. However, they are areas which have thus far been largely limited to technical specialists, excluding animators, writers, and other artists from the development of interactive content. We attempt to open these fields to non-technologists through an easily extensible platform for the rapid development of social interactions between humans and animatronic characters. Specifically, the Interbots Platform consists of 1) a servo-driven animatronic character named Quasi and 2) a suite of tools to author decisions, animate the character, and design complete, interactive, entertaining social experiences. We have created a graphical editor to author behaviors for the character, as well as extensions for Alias Maya and Macromedia Director, empowering artists already familiar with these applications to create animatronic-enabled content. The usability of the Interbots Platform was verified when interdisciplinary student teams of artists and computer scientists were able to create engaging entertainment experiences with Quasi in two-week development cycles.

1. INTRODUCTION

The past five years have seen an explosion in the consumer robotics industry. With over one million units sold since its introduction in 2002, iRobot's Roomba vacuum cleaner has paved the way for a robotic presence in the home [8]. Realizing that consumers crave entertainment as much as (if not more than) amenities, companies began introducing entertainment robots into the market. The popularity of Sony's AIBO dog and WowWee Toys' Robosapien has demonstrated that entertainment robotics is a viable industry.

Unlike the traditional, linearly pre-scripted show animatronics found in many theme and amusement parks, the novelty of household entertainment robots is in their interactivity. Sony's AIBO, for example, has the ability to locate specific objects, recognize its owner's voice, and sense when it is being touched. The robotic dogs are capable of "feeling" six emotions, and are shipped with a default personality that changes over time as their owners interact with them. The latest versions of AIBO feature official support for wireless remote control as well as the ability to download small personality additions such as games and sounds. More advanced features, such as total personality make-overs and the ability to create custom interactive games, are not officially supported. Ambitious hobbyists may utilize official and third-party software developer kits to fully customize their AIBO; however, these kits require knowledge of low-level programming concepts.

The Interbots Initiative research group at the Entertainment Technology Center (ETC) seeks to leverage the success of household entertainment robotics and the interest of the hobbyist community to introduce interactivity to the field of higher-end custom animatronics. We also intend to give non-technologists the ability to develop content for interactive animatronic experiences. These objectives will be accomplished through two main goals: 1) develop a fully interactive, autonomous animatronic character that is believable and entertaining, and 2) develop an extensible platform that allows artists and other non-technologists to rapidly design and create complete, interactive, and entertaining social experiences with animatronic characters. We feel that an embodied character is critical to a believable animatronic experience: no matter how impressive the technology, it is impossible to suspend an audience's disbelief and create the illusion of life without an engaging character and personality. We also believe that the most successful entertainment experiences involve the contributions of writers, animators, sculptors, painters, and other artists. Therefore, the development of simple, intuitive content creation tools that allow non-technologists to author content for entertainment robotics is essential.

In accordance with these goals, we have created a believable animatronic character named Quasi, and an extensible platform that allows for the development of interactive social experiences with Quasi. Interactivity designers need not rely on pre-manufactured content; the Interbots Platform allows new experiences to be developed quickly and easily. No programming or technical knowledge is required: developers can create new behaviors for the animatronic character using simple custom graphical authoring tools. These tools allow Quasi's interactive experiences to range from fully autonomous to completely pre-scripted. The usability of the authoring tools was verified when teams of four students with no previous knowledge of the Interbots Platform were able to develop a theatrical karaoke show, an interactive photo kiosk, a live puppeteering interface, and interactive dream sequences in two-week development periods.
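To make the autonomous-to-pre-scripted range concrete, a behavior authored in a graphical tool of this kind might ultimately reduce to a record pairing a trigger with an animation. The following is a minimal illustrative sketch only; all names and fields are hypothetical and do not reflect the Interbots Platform's actual internal format:

```python
# Hypothetical sketch of authored behaviors; names and fields are
# illustrative, not the Interbots Platform's actual representation.
from dataclasses import dataclass

@dataclass
class Behavior:
    name: str            # label an artist would see in the editor
    trigger: str         # sensor or show event that activates it
    animation: str       # animation clip authored in a tool like Maya
    autonomous: bool = True  # False = a step in a pre-scripted show
    priority: int = 0    # higher wins when several triggers overlap

behaviors = [
    Behavior("greet", trigger="face_detected", animation="wave", priority=2),
    Behavior("idle", trigger="timeout", animation="breathe"),
    Behavior("karaoke_verse_1", trigger="cue_1", animation="sing_v1",
             autonomous=False),  # fired in order by a scripted show
]

def pick(event: str):
    """Return the highest-priority autonomous behavior matching an event."""
    matches = [b for b in behaviors if b.autonomous and b.trigger == event]
    return max(matches, key=lambda b: b.priority, default=None)
```

Under this sketch, a fully scripted show would simply fire non-autonomous behaviors in authored order, while autonomous operation would route live sensor events through a selector like `pick`.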

The remainder of this paper discusses the various components of the Interbots Platform and the content authoring process. Section 2 briefly reviews related work in the field of animatronics. The design and construction of the Platform's hardware is discussed in Section 3. Section 4 describes the figure's control software and the content authoring tools. Several examples of rapidly developed, interactive entertainment experiences are presented in Section 5. The future direction of this research is discussed in Section 6, and conclusions are presented in Section 7.

2. RELATED WORK

This project builds on previous work by Carnegie Mellon University's Entertainment Technology Center students, but significantly extends and expands upon it. The ETC's Interactive Animatronics Initiative began with the Doc Beardsley Project [3], which sought to develop technology that enables animatronic figures to become conversationally interactive. This goal was met by combining a multitude of technologies (speech recognition, synthetic interview, discussion engine, audio, vision, and animatronics), which resulted in a new medium for interactive entertainment. Realizing the limitations of a conversational engine, the renamed Interbots Initiative has refocused on creating an expressive, believable character and developing an extensible platform for the creation of interactive, social experiences with an animatronic robot.

Advancements in animatronic technology differ between academia and industry, with academia concentrating on the creation of artificial intelligence and industry focusing on hardware improvements. Within academia, our approach is related to that of Breazeal et al. [6], Pineau et al. [9], and Bruce et al. [7], whose research with humanoid and animatronic robots explores social aspects of human-robot interaction. Our work is most similar to that of Breazeal et al., whose research with their animatronic robot Leonardo centers around human-robot collaboration and learning [6]. Like Leonardo, our animatronic robot Quasi is influenced by behavioral, emotional, and environmental motivators. However, our focus is on engaging humans in an entertaining experience, and not specifically on task collaboration or robot learning. The biggest difference in our approach is that we are not concerned with replicating human intellect and emotional processes; rather, we wish to emulate them. Consequently, our work is influenced by theatre and focuses largely on the character and personality of the animatronic robot. In terms of creating the illusion of a living character, the audience cannot distinguish between a character that actually thinks like a human and a character that appears to think like a human. Instead of focusing on artificial intelligence research, we have chosen to apply our efforts to mechanisms for creating complete interactive entertainment experiences with an animatronic character.
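A character driven by behavioral, emotional, and environmental motivators, as described above, can be pictured as a simple weighted scoring loop over candidate actions. The weights, motivator names, and candidate actions below are purely illustrative assumptions, not the platform's actual decision model:

```python
# Hypothetical sketch: choosing a character action by scoring motivators.
# All weights and names here are illustrative assumptions.
MOTIVATORS = {"behavioral": 0.5, "emotional": 0.3, "environmental": 0.2}

def score(action_scores: dict) -> float:
    """Weighted sum of per-motivator scores for one candidate action."""
    return sum(MOTIVATORS[m] * action_scores.get(m, 0.0) for m in MOTIVATORS)

# Each candidate action rates how well it serves each motivator (0..1).
candidates = {
    "wave_at_guest": {"environmental": 0.9, "emotional": 0.6},
    "sulk": {"emotional": 0.8, "behavioral": 0.2},
}

best = max(candidates, key=lambda a: score(candidates[a]))
```

The point of such a scheme is theatrical rather than cognitive: the chosen action only needs to appear motivated to an audience, which is consistent with the emphasis on character over replicated intellect.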

Our work is also similar to Breazeal et al.'s Interactive Robot Theatre [5], a robotic terrarium featuring an interactive, anemone-like creature. Our work largely differs in the depth of interactivity and audience participation; the anemone's interactions center around tracking guests' movements and recoiling from guests when frightened. Quasi has voice recognition capabilities, can receive input through the touch-screen on his kiosk, and has the ability to recognize specific individuals through the use of an active login mechanism, allowing for much deeper and more personal interactions.

Within industry, our work is most similar to the animatronics found in Disney theme parks. Typically, these have not been interactive, simply playing a linear, pre-programmed show. The best of these scripted attractions can exhibit nuanced details of personality and emotion, but are completely indifferent to people and the changing environment around them. Interesting recent developments have begun to appear from the labs of Walt Disney Imagineering Research & Development, most notably Lucky the Dinosaur [4]. It is believed that Lucky is controlled by a "man-behind-the-curtain" technique, something we hope to explore ourselves via the Guided Performance Interface (GPI) outlined in Section 6. In many ways, our extensible platform for interactive animatronics is a bridge between academia and industry.

3. HARDWARE

The hardware of the

