
An Extensible Platform for Interactive, Entertaining Social Experiences with an Animatronic Character

Sabrina Haskell
Entertainment Technology Center
Carnegie Mellon University
700 Technology Dr., Rm. 5305
Pittsburgh, PA 15219
shaskell@andrew.cmu.edu

Andrew Hosmer
Entertainment Technology Center
Carnegie Mellon University
700 Technology Dr., Rm. 5305
Pittsburgh, PA 15219
alhosmer@andrew.cmu.edu

Eugenia Leu
Entertainment Technology Center
Carnegie Mellon University
700 Technology Dr., Rm. 5305
Pittsburgh, PA 15219
eyj@andrew.cmu.edu

ABSTRACT
The fields of human-robot interaction and entertainment robotics are truly interdisciplinary, combining the best of computer science, psychology, and mechanical engineering. However, they are areas which have thus far been largely limited to technical specialists, excluding animators, writers, and other artists from the development of interactive content. We attempt to open these fields to non-technologists through an easily extensible platform for the rapid development of social interactions between humans and animatronic characters. Specifically, the Interbots Platform consists of 1) a servo-driven animatronic character named Quasi and 2) a suite of tools to author decisions, animate the character, and design complete, interactive, entertaining social experiences. We have created a graphical editor to author behaviors for the character, as well as extensions for Alias Maya and Macromedia Director, empowering artists already familiar with these applications to create animatronic-enabled content. The usability of the Interbots Platform was verified when interdisciplinary student teams of artists and computer scientists were able to create engaging entertainment experiences with Quasi in two-week development cycles.

1. INTRODUCTION
The past five years have seen an explosion in the consumer robotics industry. With over one million units sold since its introduction in 2002, iRobot's Roomba vacuum cleaner has paved the way for a robotic presence in the home [8]. Realizing that consumers crave entertainment as much as (if not more than) amenities, companies began introducing entertainment robots into the market. The popularity of Sony's AIBO dog and WowWee Toys' Robosapien has demonstrated that entertainment robotics is a viable industry.

Unlike the traditional linearly pre-scripted show animatronics found in many theme and amusement parks, the novelty of household entertainment robots is in their interactivity. Sony's AIBO, for example, has the ability to locate specific objects, recognize its owner's voice, and sense when it is being touched. The robotic dogs are capable of "feeling" six emotions, and are shipped with a default personality that changes over time as their owners interact with them. The latest versions of AIBO feature official support for wireless remote control as well as the ability to download small personality additions such as games and sounds. More advanced features such as total personality make-overs and the ability to create custom interactive games are not officially supported. Ambitious hobbyists may utilize official and third-party software developer kits to fully customize their AIBO; however, these kits require knowledge of low-level programming concepts.

The Interbots Initiative research group at the Entertainment Technology Center (ETC) seeks to leverage the success of household entertainment robotics and the interest of the hobbyist community to introduce interactivity to the field of higher-end custom animatronics. We also intend to give non-technologists the ability to develop content for interactive animatronic experiences. These objectives will be accomplished through two main goals: 1) Develop a fully interactive, autonomous animatronic character that is believable and entertaining, and 2) Develop an extensible platform that allows artists and other non-technologists to rapidly design and create complete, interactive, and entertaining social experiences with animatronic characters. We feel that an embodied character is critical to a believable animatronic experience; no matter how impressive the technology, it is impossible to suspend an audience's disbelief and create the illusion of life without an engaging character and personality. We also believe that the most successful entertainment experiences involve the contributions of writers, animators, sculptors, painters, and other artists. Therefore, the development of simple, intuitive content creation tools that allow non-technologists to author content for entertainment robotics is essential.

In accordance with these goals, we have created a believable animatronic character named Quasi, and an extensible platform that allows for the development of interactive social experiences with Quasi. Interactivity designers need not rely on pre-manufactured content; the Interbots Platform allows new experiences to be developed quickly and easily. No programming or technical knowledge is required; developers can create new behaviors for the animatronic character using simple custom graphical authoring tools. These tools allow for Quasi's interactive experiences to range from fully autonomous to completely pre-scripted. The usability of the authoring tools was verified when teams of four students with no previous knowledge of the Interbots Platform were able to develop a theatrical karaoke show, an interactive photo kiosk, a live puppeteering interface, and interactive dream sequences in two-week development periods.
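Behaviors authored in a graphical tool like the one described above are often represented internally as a state machine mapping sensed events to actions. The following is only an illustrative sketch of that idea; the state, event, and action names are hypothetical and are not taken from the actual Interbots authoring tools.

```python
# Hypothetical internal representation of a graphically authored behavior:
# a table mapping (current state, sensed event) pairs to a next state and
# an animation/action to trigger. All names here are illustrative.

TRANSITIONS = {
    ("idle", "guest_detected"):  ("greeting", "play_wave_animation"),
    ("greeting", "touch_input"): ("engaged",  "play_curious_animation"),
    ("engaged", "guest_left"):   ("idle",     "play_farewell_animation"),
}

def step(state, event):
    """Advance the behavior state machine; unknown events leave the state unchanged."""
    next_state, action = TRANSITIONS.get((state, event), (state, None))
    return next_state, action
```

A fully pre-scripted show would simply chain such transitions linearly, while an autonomous experience would let sensor events drive arbitrary paths through the graph.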

The remainder of this paper discusses the various components of the Interbots Platform and the content authoring process. Section 2 briefly reviews related work in the field of animatronics. The design and construction of the Platform's hardware is discussed in Section 3. Section 4 describes the figure's control software and the content authoring tools. Several examples of rapidly developed interactive entertainment experiences are presented in Section 5. The future direction of this research is discussed in Section 6, and conclusions are presented in Section 7.

2. RELATED WORK
This project builds on previous work by Carnegie Mellon University's Entertainment Technology Center students, but significantly extends and expands upon it. The ETC's Interactive Animatronics Initiative began with the Doc Beardsley Project [3], which sought to develop technology that enables animatronic figures to become conversationally interactive. This goal was met by combining a multitude of technologies (speech recognition, synthetic interview, discussion engine, audio, vision, and animatronics), which resulted in a new medium for interactive entertainment. Realizing the limitations of a conversational engine, the renamed Interbots Initiative has refocused on creating an expressive, believable character and developing an extensible platform for the creation of interactive, social experiences with an animatronic robot.

Advancements in animatronic technology differ between academia and industry, with academia concentrating on the creation of artificial intelligence and industry focusing on hardware improvements. Within academia, our approach is related to that of Breazeal et al. [6], Pineau et al. [9], and Bruce et al. [7], whose research with humanoid and animatronic robots explores social aspects of human-robot interaction. Our work is most similar to that of Breazeal et al., whose research with their animatronic robot Leonardo centers around human-robot collaboration and learning [6]. Like Leonardo, our animatronic robot Quasi is influenced by behavioral, emotional, and environmental motivators. However, our focus is on engaging humans in an entertaining experience, and not specifically on task collaboration or robot learning. The biggest difference in our approach is that we are not concerned with replicating human intellect and emotional processes; rather, we wish to emulate them. Consequently, our work is influenced by theatre and focuses largely on the character and personality of the animatronic robot. In terms of creating the illusion of a living character, the audience cannot distinguish between a character that actually thinks like a human and a character that appears to think like a human. Instead of focusing on artificial intelligence research, we have chosen to apply our efforts to mechanisms for creating complete interactive entertainment experiences with an animatronic character.
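A common way to realize the kind of motivator-driven behavior mentioned above is to score candidate behaviors against the character's current behavioral, emotional, and environmental motivator levels and select the best match. This is a hedged sketch of that general technique, not the platform's actual algorithm; every behavior name, motivator, and weight below is hypothetical.

```python
# Illustrative motivator-driven behavior selection: each candidate behavior
# carries affinities for certain motivators, and the behavior whose affinities
# best match the current motivator levels wins. All values are hypothetical.

MOTIVATORS = {"boredom": 0.8, "happiness": 0.3, "guest_nearby": 1.0}

# How strongly each behavior responds to each motivator (negative = inhibited).
BEHAVIOR_AFFINITIES = {
    "greet_guest": {"guest_nearby": 1.0, "happiness": 0.5},
    "idle_fidget": {"boredom": 1.0},
    "nap":         {"boredom": 0.2, "guest_nearby": -1.0},
}

def select_behavior(motivators):
    """Return the behavior whose affinities best match current motivator levels."""
    def score(behavior):
        return sum(weight * motivators.get(name, 0.0)
                   for name, weight in BEHAVIOR_AFFINITIES[behavior].items())
    return max(BEHAVIOR_AFFINITIES, key=score)
```

With a guest present, "greet_guest" dominates; with no guest and high boredom, "idle_fidget" would be chosen instead.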

Our work is also similar to Breazeal et al.'s Interactive Robot Theatre [5], a robotic terrarium featuring an interactive, anemone-like creature. Our work largely differs in the depth of interactivity and audience participation; the anemone's interactions center around tracking guests' movements and recoiling from guests when frightened. Quasi has voice recognition capabilities, can receive input through the touch-screen on his kiosk, and has the ability to recognize specific individuals through the use of an active login mechanism, allowing for much deeper and more personal interactions.

Within industry, our work is most similar to the animatronics found in Disney theme parks. Typically, these have not been interactive, simply playing a linear pre-programmed show. The best of these scripted attractions can exhibit nuanced details of personality and emotion, but are completely indifferent to people and the changing environment around them. Interesting recent developments have begun to appear from the labs of Walt Disney Imagineering Research & Development, most notably Lucky the Dinosaur [4]. It is believed that Lucky is controlled by the "man-behind-the-curtain" technique, something we hope to explore ourselves via the Guided Performance Interface (GPI) as outlined in Section 6. In many ways, our extensible platform for interactive animatronics is a bridge between academia and industry.

3. HARDWARE
The hardware of the Interbots Platform currently consists of three major components: the animatronic figure, the kiosk in which the figure resides, and the show control hardware. The design and construction process of the animatronic character is detailed in Section 3.1, the kiosk and its many input sensors are described in Section 3.2, and some of the show control hardware used in the Platform is briefly discussed in Section 3.3.

3.1 Quasi the Robot
The most visible component of the Interbots Platform is, of course, the physical robot. For our research we have developed a custom-built figure we call Quasi. Quasi is an expressive anthropomorphic humanoid robot with a slightly masculine appearance. Recent developments have introduced pre-recorded speech, but a substantial amount of his activity is non-verbal, instead relying on expressive facial features and gestures.

Having the liberty to custom-design a figure for our purposes, we started our development process in character design. The approach was similar to the one taken for film or video-game characters, but with some notable differences. A blue-sky conceptual phase led to three varied concepts, one of which was a robot in form. Iteration led us to move forward with the robot figure after merging the best qualities from all three designs. Pencil sketches established a design style, which was then fleshed out during the creation of a 3D model in Alias Maya, an industry-preferred modeling and animation package. At this point physical constraints became a notable concern. When designing a virtual character, mass, realistic joint sizes, and other mechanical factors do not need to be considered. For animatronic characters, real-world physical limitations require function to supersede form. We paired a traditional 3D artist with the industrial designer who built the robot in order to ensure that the design was both functional and appealing. A copy of the 3D model was imported into SolidWorks, a mechanical design application. The placement of the internal joints, motors, and armatures occurred in SolidWorks in parallel with iterations on the cosmetic look of the character in Maya.

Fabrication began once the external cosmetic shell and internal armature had been finalized in virtual space. Engineering drawings from SolidWorks were used to hand-machine the armature, while the shell pieces modeled in Maya were exported and fabricated on a 3D printer. The pieces from the 3D printer were neither durable nor lightweight enough to use directly on the robot. This problem was solved by making molds of the printed pieces and casting final parts out of a fiberglass-reinforced resin. These parts could stand up to the challenge of being used on the robot, and had the benefit of being a one-to-one match of the 3D Maya model. As discussed in Section 4.2.3, this allows artists to use Maya's animation tools to create lifelike movement for the physical robot.

In terms of physical construction, Quasi has more in common with a radio-controlled model aircraft than with a traditional robot. His armature utilizes a large number (27) of variously-sized hobby servo motors to move. The armature itself is machined out of lightweight aluminum. To facilitate the character's ability to express emotion, ten servos have been dedicated to movements of the eyes and eyelids alone. (In addition to telescoping in and out, each eye has an upper and lower eyelid that can be raised, lowered, and tilted diagonally to emulate the movement of eyebrows.) Hobby servos combine a DC rotational motor, a potentiometer for position feedback, and a motor control board in one tidy package. This, along with their low cost, has made them increasingly popular in the field of robotics. While our method of containing all servos inside the armature allows Quasi to be quite compact, we are looking at iterating upon the design by moving all servos outside of the character and using sleeved push-pull cables to transfer the mechanical forces from a bank of servo motors to the armature itself.

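Hobby servos of the kind described above are conventionally commanded with a 50 Hz PWM signal whose pulse width encodes the target position, typically around 1000-2000 microseconds across the travel range. The sketch below converts a joint angle to a pulse width under that common convention; the paper does not specify Quasi's actual servo controller or calibration, so these constants are assumptions.

```python
# Convert a joint angle to a hobby-servo pulse width, assuming the common
# 1000-2000 microsecond range over 180 degrees of travel (1500 us = center).
# Quasi's real controller and per-servo calibration are not specified in the
# paper; these constants are illustrative defaults only.

PULSE_MIN_US = 1000.0   # pulse width at 0 degrees
PULSE_MAX_US = 2000.0   # pulse width at 180 degrees
ANGLE_RANGE = 180.0     # degrees of mechanical travel

def angle_to_pulse_us(angle_deg):
    """Map a joint angle in [0, 180] degrees to a pulse width in microseconds."""
    angle_deg = max(0.0, min(ANGLE_RANGE, angle_deg))  # clamp out-of-range commands
    return PULSE_MIN_US + (PULSE_MAX_US - PULSE_MIN_US) * (angle_deg / ANGLE_RANGE)
```

A show controller would compute one such pulse width per frame for each of the 27 servos and stream them to the servo control boards.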

Quasi has a number of features in addition to his eyelids for conveying emotion, the most prominent of which are LED lighting fixtures for his eyes and antennae. These combine red, green, and blue LEDs to display any color of the spectrum. His antennae can move both forward and backward as well as in and out, giving them an expressive quality not unlike that of a dog's ears. The physical movements of Quasi's eyelids, antennae, and body posture, combined with changing LED colors, allow him to effectively communicate emotions and personality without the use of speech. Now that silent emotional expression has been achieved, our current research is exploring the use of speech to augment expression. Quasi will always be a work in progress; we have learned a great deal about the strengths and limitations of our design and we hope to improve the technology in future iterations of the figure.

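One simple way to drive RGB fixtures like Quasi's eye and antenna LEDs is a lookup table from emotional states to colors, with linear blending for smooth transitions between states. This is only a sketch of the general technique; the emotion names and color choices below are hypothetical, not taken from the paper.

```python
# Hypothetical emotion-to-color mapping for RGB LED fixtures, with linear
# interpolation so the character can fade smoothly between emotional states.
# Emotion names and color values are illustrative, not from the actual platform.

EMOTION_COLORS = {
    "happy":   (255, 200, 0),     # warm yellow
    "sad":     (0, 80, 255),      # cool blue
    "angry":   (255, 0, 0),       # red
    "neutral": (200, 200, 200),   # soft white
}

def blend(emotion_a, emotion_b, t):
    """Linearly interpolate between two emotion colors; t=0 gives a, t=1 gives b."""
    ca, cb = EMOTION_COLORS[emotion_a], EMOTION_COLORS[emotion_b]
    return tuple(round(a + (b - a) * t) for a, b in zip(ca, cb))
```

Ramping t from 0 to 1 over a second or two would fade the fixtures from one emotional color to the next, coordinated with the matching eyelid and antenna poses.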

Figure 1: The animatronic character Quasi, his kiosk, and its many sensors.

3.2 Quasi's Kiosk
Interactive experiences with the Interbots Platform involve more than just an animatronic figure alone. Quasi's environment - a kiosk that also contains an LCD touch-screen, long range IR sensors, a motion detector, a w...
