Source: research.nii.ac.jp/~prendinger/papers/anette-psivt2009.pdf

Global Lab: an Interaction, Simulation, and Experimentation Platform based on "Second Life" and "OpenSimulator"

Anette von Kapri (1), Sebastian Ullrich (1,2), Boris Brandherm (1), Helmut Prendinger (1)

(1) National Institute of Informatics, 2-1-2 Hitotsubashi, Chiyoda-ku, Tokyo 101-8430, Japan
(2) Virtual Reality Group, RWTH Aachen University, Seffenter Weg 23, 52074 Aachen, Germany

[email protected], [email protected], [email protected], [email protected]

ABSTRACT

In this paper we describe a novel platform for virtual worlds called Global Lab. Through example applications, we demonstrate the capabilities of this platform for ubiquitous simulation and experimentation with sensor-based systems. Furthermore, through multimedia information representation and support for bi-directional communication, real and virtual devices are connected in the example of a participatory ecosystem scenario. The Global Lab contains multimodal presentation and interaction techniques that enhance universal communication. The overall motivation is to provide a framework with strong support for collaboration in virtual worlds for research purposes.

Keywords: second life, opensimulator, metaverse, simulation, sensor-based systems, bots, authoring language

1 INTRODUCTION & GLOBAL LAB

As three-dimensional (3D) virtual worlds like Second Life (SL) [1] and OpenSimulator (OpenSim) [3] become more mature and technically advanced, they can be used for serious applications that are driven by research. SL provides a free networked multi-user 3D environment and is very popular, with an increasing number of registered users (over 15 million as of October 2008) and about 65,000 users online at peak times [1]. In contrast to SL, OpenSim is an open source project aiming to create and deploy metaverses [3]. The goal of its originators is to provide an open and extensible platform that can be hosted by anyone, unlike the proprietary servers of SL, which are hosted exclusively by Linden Lab.

Both systems depend on user-created content and have programming interfaces for creating new functionality. The networked virtual environment allows for intuitive interaction. Our contribution is a new platform (Global Lab) for these virtual worlds to support research. In the next section we describe our demonstrator applications, which contribute to different fields of research and employ multimedia technology.

2 DEMONSTRATOR APPLICATIONS

We will first explain how visualizing and testing simulations of sensor-based systems can be simplified in these virtual worlds. Then, we will describe a participatory ecosystem using the example of a rice field with a field server (which provides sensor data), where we created a link between the real and the virtual world. Finally, we elaborate on possible improvements in online communication and interaction.

2.1 Ubiquitous Simulation

We present a user-friendly approach for simulating and testing sensor-based systems, which are popular in ubiquitous computing. For optimization purposes it is crucial to visualize all the sensor data and to offer intuitive ways of changing the parameters and spatial positions of the devices. A recent example of a testbed is limited to two-dimensional space [4].

Figure 1: Example of the simulated positioning system in SL, with a Visitor Avatar experiencing the system and a Developer Avatar interactively adjusting the properties of a virtual RFID tag.

Our system therefore uses SL, with the key novelty of simulating, visualizing, and interacting in 3D [9, 5]. In three dimensions, the spatial characteristics of sensors and sensor networks can be modeled more accurately, and the testbed can be experienced immersively. Sensor models and other objects can be moved easily and intuitively by 'direct' (avatar-mediated) manipulation. Figure 1 shows a simulation of an indoor positioning system in SL, for which objects were created that represent Radio Frequency Identification (RFID) tags [6]. These can be positioned interactively within the virtual environment, and the effects of changes are calculated and visualized in real time. The testbed architecture implements a flexible framework for integrating sensor simulation into a customizable, interactive 3D virtual environment. Extensible interfaces support communication with any kind of sensor, real or emulated.
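To make the kind of sensor model such a testbed simulates more concrete, here is a minimal Python sketch of a virtual RFID tag with a log-distance path-loss detection rule. The class and function names, the path-loss model, and the detection threshold are all illustrative assumptions for this sketch, not the testbed's actual API or physics.

```python
import math
from dataclasses import dataclass

@dataclass
class VirtualRFIDTag:
    """A hypothetical stand-in for the RFID tag objects placed in the virtual world."""
    x: float
    y: float
    z: float
    tx_power_dbm: float = 0.0  # assumed transmit power of the tag

def received_strength_dbm(tag: VirtualRFIDTag, rx: tuple,
                          path_loss_exp: float = 2.0) -> float:
    """Free-space-style path loss: strength falls off with log-distance."""
    dx, dy, dz = tag.x - rx[0], tag.y - rx[1], tag.z - rx[2]
    d = max(math.sqrt(dx * dx + dy * dy + dz * dz), 0.01)  # avoid log10(0)
    return tag.tx_power_dbm - 10.0 * path_loss_exp * math.log10(d)

def detected(tag: VirtualRFIDTag, rx: tuple, threshold_dbm: float = -40.0) -> bool:
    """A reader 'sees' the tag if the modeled strength clears a threshold."""
    return received_strength_dbm(tag, rx) >= threshold_dbm

tag = VirtualRFIDTag(0.0, 0.0, 1.0)
print(detected(tag, (1.0, 0.0, 1.0)))    # reader 1 m away: detected
print(detected(tag, (200.0, 0.0, 1.0)))  # reader 200 m away: out of range
```

Moving a tag in the virtual environment then simply amounts to updating its coordinates and re-evaluating `detected` for each reader, which is what makes interactive repositioning with real-time feedback cheap.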

2.2 Participatory Ecosystem

This is an example of how to support online decision-making in agriculture. Here it is important to provide a platform in which all relevant information is displayed concurrently in a shared view for all visitors. In our demonstrator, visitors can meet in SL and experience a virtual 3D rice field that is augmented by information collected from a so-called field server [2]. This device simultaneously captures and aggregates data from physical sensors on the rice field, such as temperature and humidity, as well as real-time video images that can be streamed into the virtual environment (cf. Figure 2). By embedding all the relevant information in a networked multi-user environment, we have created a virtual meeting space. This allows agriculture experts and owners of rice fields to meet and discuss


Figure 2: Bi-directional interaction for a virtual and real rice field. Temperature, humidity, and real-time camera images are captured and visualized in the virtual world.

easily, despite large real-world distances. Furthermore, it encourages international knowledge sharing and teaching. For example, rice is grown in many countries, and Japan has a long tradition and much accumulated wisdom on this topic.
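The shared in-world display of the field-server readings could be sketched as below: sensor values arrive as a record and are rendered into the text shown above the virtual rice field. The field names and the idea of a simple key-value record are assumptions for illustration; the actual field-server protocol [2] may differ.

```python
# Hypothetical sketch: turn the latest field-server readings into the
# floating-text panel all visitors see above the virtual rice field.
# Field names ("temperature_c", "humidity_pct") are invented for this example.

def format_panel(readings: dict) -> str:
    """Render sensor readings as the shared in-world text panel."""
    lines = ["Rice Field -- live data"]
    if "temperature_c" in readings:
        lines.append(f"Temperature: {readings['temperature_c']:.1f} C")
    if "humidity_pct" in readings:
        lines.append(f"Humidity: {readings['humidity_pct']:.0f} %")
    return "\n".join(lines)

panel = format_panel({"temperature_c": 23.4, "humidity_pct": 68})
print(panel)
```

In the running demonstrator, a loop would periodically fetch fresh readings from the field server and push the formatted string to the in-world panel object, so every visitor sees the same concurrent view.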

We have enabled bi-directional interaction with the field server. One direction is the streaming of video images into the virtual world. The opposite direction allows us to manipulate the physical camera by controlling its virtual counterpart. To our knowledge, this is the first time that a camera has been controlled from a virtual world; usually, only web-based interfaces are available. The advantage of our approach is that the changes can be seen by multiple users. Furthermore, the coupled movement of the real camera and its virtual representation makes control more intuitive.
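The virtual-to-real coupling can be sketched as a small mapping: the virtual camera object's orientation is translated into a pan/tilt command for the physical camera, clamped to its mechanical limits. The limit values and the command string format here are illustrative assumptions, not the field server's actual control interface.

```python
# Assumed mechanical limits of the physical pan/tilt camera (degrees).
PAN_RANGE = (-170.0, 170.0)
TILT_RANGE = (-30.0, 90.0)

def clamp(value: float, lo: float, hi: float) -> float:
    return max(lo, min(hi, value))

def camera_command(virtual_yaw_deg: float, virtual_pitch_deg: float) -> str:
    """Map the virtual camera object's orientation onto a pan/tilt command.

    The "PTZ pan=... tilt=..." wire format is a placeholder for whatever
    protocol the real camera accepts.
    """
    pan = clamp(virtual_yaw_deg, *PAN_RANGE)
    tilt = clamp(virtual_pitch_deg, *TILT_RANGE)
    return f"PTZ pan={pan:.1f} tilt={tilt:.1f}"

print(camera_command(45.0, 10.0))    # in range: passed through
print(camera_command(300.0, -80.0))  # out of range: clamped to the limits
```

Because the command is derived from the shared virtual object's state, every user sees the virtual camera move in step with the real one, which is what makes the coupling feel intuitive.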

2.3 Universal Communication

We want to facilitate natural and intuitive online communication in virtual environments and give two examples here.

Animation Authoring We present a multimodal presentation scenario with our XML-based scripting language, the Multimodal Presentation Markup Language (MPML3D) [10], for animation authoring of bots. Figure 3 shows two bots in SL giving a presentation. They are controlled by an MPML3D script. As soon as a user steps on the sensor mat, a perception is triggered which starts the MPML3D presentation. Typically, bots act out a dialogue as defined in the script, with multimodal output consisting of synthesized speech and non-verbal behavior displayed as gestures and facial expressions. Thanks to the networked environment, multiple users can co-experience the same presentation. This easy-to-use scripting language can be integrated into the feature- and content-rich multi-user online environments of SL and OpenSim, where bots are currently missing. Bots can be used for entertainment, education, or research purposes, for example as part of multimodal presentations or user studies, or for testing a simulation environment.
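The flow of script-driven bots can be illustrated with a toy XML dialect compiled into a flat list of bot actions. To be clear, the element and attribute names below are invented for this sketch; the real MPML3D schema [10] differs, and this only conveys the general idea of declarative multimodal authoring.

```python
import xml.etree.ElementTree as ET

# A toy script in the spirit of MPML3D: a trigger plus a sequence of
# bot acts, each combining speech with a gesture. Tag/attribute names
# are hypothetical, not the actual MPML3D vocabulary.
SCRIPT = """
<presentation trigger="sensor-mat">
  <act bot="Ken"  speak="Welcome to the Global Lab!" gesture="wave"/>
  <act bot="Yuki" speak="Let me show you the rice field." gesture="point"/>
</presentation>
"""

def compile_acts(xml_text: str):
    """Flatten the script into (bot, action, argument) steps for playback."""
    root = ET.fromstring(xml_text)
    steps = []
    for act in root.findall("act"):
        steps.append((act.get("bot"), "speak", act.get("speak")))
        steps.append((act.get("bot"), "gesture", act.get("gesture")))
    return steps

steps = compile_acts(SCRIPT)
print(steps[0])
```

A runtime would watch for the `trigger` event (the sensor mat in Figure 3) and then feed the compiled steps to the bots' speech synthesis and animation channels in order.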

Figure 3: MPML3D presentation with two presenting bots.

Figure 4: Happy Emotion Bubble.

Emotional Behavior Instant Messaging Usually, communication in virtual worlds is done via simple text chat that is displayed on the screen. Our system enhances this communication by analyzing the tone and sentiment of the input text and animating the avatar in a way that displays the emotion of the message. This results in more life-like communication. The rule-based emotion recognition algorithm [8] distinguishes nine emotions and categorizes phrases, clauses, and, finally, sentences into emotion vectors with different intensities for each emotion. As visual feedback, emotion bubbles are shown above the avatars. The bubbles display the analyzed emotion as textures that contain smileys (cf. Figure 4). In addition, depending on the meaning of the sentence, the corresponding gesture is selected and played back automatically [7].
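A drastically simplified sketch of this pipeline is shown below: keyword rules accumulate intensities into an emotion vector, and the strongest emotion selects the bubble texture. The emotion label set, the lexicon, and the weights are placeholders invented for illustration; the actual algorithm [8] applies much richer rules at the phrase, clause, and sentence levels.

```python
# Nine placeholder emotion labels (the real label set of [8] may differ).
EMOTIONS = ["anger", "disgust", "fear", "guilt", "interest",
            "joy", "sadness", "shame", "surprise"]

# Toy lexicon: word -> (emotion, intensity). A stand-in for real rules.
LEXICON = {
    "great": ("joy", 0.8), "happy": ("joy", 1.0),
    "wow": ("surprise", 0.9), "sorry": ("guilt", 0.7),
    "sad": ("sadness", 1.0),
}

def emotion_vector(text: str) -> dict:
    """Accumulate per-emotion intensities over the words of the message."""
    vec = {e: 0.0 for e in EMOTIONS}
    for word in text.lower().split():
        word = word.strip(".,!?")
        if word in LEXICON:
            emo, intensity = LEXICON[word]
            vec[emo] += intensity
    return vec

def bubble(text: str) -> str:
    """Pick the texture for the emotion bubble shown above the avatar."""
    vec = emotion_vector(text)
    best = max(vec, key=vec.get)
    return best if vec[best] > 0 else "neutral"

print(bubble("Wow, the rice field looks great!"))  # surprise (0.9 > 0.8)
```

The same vector could also drive the gesture selection step [7], e.g. by mapping each dominant emotion to an animation clip.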

3 CONCLUSION

In this paper we have presented the novel Global Lab project, which uses the metaverses of Second Life and OpenSimulator as an interaction, simulation, and experimentation platform. It consists of many different components that contribute to several research topics, each explained with a use case and demonstrator. One component is the extensible visualization and testing platform for sensor-based systems, which enables fast and simple simulation runs. We then described an interactive meeting space for ecosystems that is enriched by multimedia technology; it connects the virtual world more closely with the real world so that online decisions can be made quickly. Finally, we presented our universal communication tools: our scripting language MPML3D, the visualization of emotions, and automatic content generation aim to enhance online interactions in 3D worlds.

REFERENCES

[1] Linden Lab. Official website of Second Life. http://www.secondlife.com. Last visited: 2008/10/11.

[2] National Agricultural Research Center. Field Server webpage. http://model.job.affrc.go.jp/FieldServer/FieldServerEn/default.htm. Last visited: 2008/10/11.

[3] OpenSim Developer Team. Official website of the OpenSimulator project. http://www.opensimulator.org. Last visited: 2008/10/11.

[4] I. Armac and D. Retkowitz. Simulation of Smart Environments. In ICPS'07: IEEE Int'l Conf on Pervasive Services, pages 257–266, 2007.

[5] B. Brandherm, S. Ullrich, and H. Prendinger. Simulation Framework in Second Life with Evaluation Functionality for Sensor-based Systems. In Proc. of UbiComp '08 Workshop W2 – Ubiquitous Systems Evaluation (USE '08), Seoul, South Korea, September 2008.

[6] B. Brandherm, S. Ullrich, and H. Prendinger. Simulation of sensor-based tracking in Second Life. In AAMAS'08: Proceedings of the 7th Int'l Conf on Autonomous Agents and Multiagent Systems, pages 1689–1690, 2008.

[7] W. Breitfuss, H. Prendinger, and M. Ishizuka. Automatic Generation of Conversational Behavior for Multiple Embodied Virtual Characters: The Rules and Models behind Our System. In IVA'08: Proceedings of the 8th Int'l Conf on Intelligent Virtual Agents, pages 472–473, 2008.

[8] A. Neviarouskaya, H. Prendinger, and M. Ishizuka. Textual Affect Sensing for Sociable and Expressive Online Communication. In ACII'07: Proceedings of the 2nd Int'l Conf on Affective Computing and Intelligent Interaction, pages 218–229, 2007.

[9] S. Ullrich, B. Brandherm, and H. Prendinger. Simulation Framework with Testbed for Sensor-based Systems in Second Life. In Proceedings of the 10th Int'l Conf on Ubiquitous Computing (UbiComp 2008) Demo Session, Seoul, South Korea, September 2008. ACM Press.

[10] S. Ullrich, K. Brugmann, H. Prendinger, and M. Ishizuka. Extending MPML3D to Second Life. In IVA'08: Proceedings of the 8th Int'l Conf on Intelligent Virtual Agents, pages 281–288, 2008.