

May/June 2008 · Published by the IEEE Computer Society · 0272-1716/08/$25.00 © 2008 IEEE

Applications. Editor: Mike Potel, www.wildcrest.com

Using a Game Engine for VR Simulations in Evacuation Planning

Antônio Carlos A. Mól and Carlos Alexandre F. Jorge, Comissão Nacional de Energia Nuclear, Brazil

Pedro M. Couto, Faculdades Paraíso, Brazil

Researchers who want to use virtual reality-based simulations in their work must address some important requirements. Simulations require a good 3D graphical rendering capability. For realistic results, they must also consider the effects of physical laws, such as gravity and collision forces. A software platform that meets these requirements can serve a broad range of science and technology applications, but developing an entire platform is hard work in itself.

Fortunately, some new computer game platforms can meet VR simulation requirements. We've used one of them for applications related to safety-critical plants at the Instituto de Engenharia Nuclear (IEN, Nuclear Engineering Institute). IEN is a research institute of the Comissão Nacional de Energia Nuclear (CNEN, Brazilian Commission of Nuclear Energy). IEN facilities include a research nuclear reactor, particle accelerators, and chemical laboratories.

In this article, we present our use of a game engine for virtual simulations of building evacuations in emergency situations. We have modeled a real IEN building in 3D to perform preliminary evacuation tests, prior to real ones, and thus support evacuation planning. Nuclear plants represent just one of many environments where virtual simulations might be the best or only means of evaluating situations that are too dangerous to simulate in real environments—for example, in the presence of fire and smoke or radioactive or chemical contamination.

We began our work with help from people at our institution who work directly with safety engineering and emergency planning. Although results from preliminary tests of our VR simulation system aren't yet conclusive, they do compare favorably with real simulations of evacuation scenarios. We plan to move forward with these tests and expect the results to give insight into and suggestions about building and office layouts—for example, door and stair locations and dimensions—and alternative exit routes that support the needs of emergency planners.

Game engines

A game engine is that part of a computer game's software that contains good 3D graphical rendering and representations of physical laws. Game engines are typically independent of the specific scenarios or applications for which they might originally have been developed, and the source code for some game engines is partially open. Researchers can therefore use game engine source codes to create totally new scenarios and applications.

Most game engines include networking capabilities, which can also serve for multiuser simulation.

And last, but by no means least of their advantages for supporting VR simulation, game engines are relatively inexpensive.

Several research groups have used game engines for applications outside the entertainment field. Communications of the ACM has run three special issues that give a good overview of their use in scientific and technological applications.1–3

Two current engines that fulfill technological application needs are Unreal from Epic Games and Quake from ID Software.1 They offer equivalent functionalities and usability. We've used UnrealEngine2 Runtime Demo Version in our evacuation planning R&D. This engine has worked well from the beginning, so we haven't felt the need to try Quake, but it should work as well. The Unreal Engine is free for academic and noncommercial use. It's available for download from http://udn.epicgames.com/Two/UnrealEngine2Runtime.html, where you will also find the end-user license agreement.

Unreal Engine

Unreal Engine supports multiple users simultaneously through a local area network or the Internet. Each user navigates within the virtual environment, interacting with both the scenario


and other users' avatars. Interactions include collisions. Unreal Engine defines an avatar's collision volume as a cylinder, which researchers can adapt as desired. End users choose between first- and third-person viewpoints and simulate different visual characteristics for their avatars, enabling easy identification.

Because the Unreal Engine's source code is partially open, developers can adapt some existing functionalities and create new ones to meet their application's needs. They can simulate whole new scenarios, whether indoors or outdoors, and use Unreal Engine for perspective or stereo views to visualize 3D scenes.

We run Unreal Engine on ordinary Windows desktop machines with DirectX-capable graphics cards such as Nvidia GeForce or ATI Radeon. For multiuser simulations, we assign one computer as a server and the others as clients. The person assigned to supervise an evacuation procedure operates the server computer, initiating an alarm that each networked computer hears. The supervisor can see the whole scene at his or her computer or on a 2 m × 3 m stereo-projection screen in the Laboratório de Realidade Virtual (LABRV, Virtual Reality Laboratory). In this way, the supervisor can evaluate the overall simulation performance or the performance of any participant.

Evacuation-scenario application

We began developing our evacuation-scenario application in 2006 and reported initial results in 2007.4

Virtual modeling

We modeled an IEN building in 3D. The four-floor model uses real dimensions collected from architectural data. It includes doors and stairs as well as some furniture.

We modeled objects with static meshes that we made with CAD software, then imported into Unreal Engine's editor, UnrealEd, by adding or subtracting polygons and simple 3D geometries such as cubes and cylinders. We added textures obtained from photos of the real environment. Altogether, we made 27 static meshes that we can reuse as many times as needed in the 3D environment map.

We adjusted Unreal Engine's dimension unit (points) to keep avatar and building dimensions proportional. The equivalence is 1 m equals 52.5 points. Figure 1 shows an external view of the building during the modeling stage.
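As a worked example of the scale above, a small helper can convert between meters and engine points. This is our own illustrative sketch, not part of the Unreal Engine API; the function names and the sample dimension are assumptions.

```python
# Conversion between meters and Unreal Engine's dimension unit ("points"),
# using the article's scale of 1 m = 52.5 points. Helper names are ours.

POINTS_PER_METER = 52.5

def meters_to_points(m: float) -> float:
    """Convert real-world meters to Unreal units (points)."""
    return m * POINTS_PER_METER

def points_to_meters(p: float) -> float:
    """Convert Unreal units (points) back to meters."""
    return p / POINTS_PER_METER

# Example: a 2 m opening measured from the architectural data becomes
# 105 points in the 3D environment map.
print(meters_to_points(2.0))   # 105.0
```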

We added shading to the model through light points available in Unreal Engine for this purpose. The application's real-time performance is based on two of Unreal Engine's functionalities. First, a binary space-partitioning method organizes the 3D map in a tree data structure that improves rendering. Second, z-buffers support 3D scene analysis from the avatar's point of view (camera position), identifying the surfaces and objects near the camera and assigning higher rendering quality to them, while reducing the quality for the surfaces and objects farther from the camera.
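The second idea, assigning rendering quality by camera distance, can be sketched in a few lines. This toy function is our illustration of the general technique; the thresholds and level names are assumptions, not Unreal Engine values.

```python
# Toy illustration of distance-based rendering quality: objects near the
# camera get the highest detail level, distant ones get progressively less.

import math

def detail_level(camera, obj, thresholds=(500.0, 1500.0)):
    """Return 'high', 'medium', or 'low' based on the camera-to-object
    distance in engine units (illustrative thresholds)."""
    d = math.dist(camera, obj)
    if d <= thresholds[0]:
        return "high"
    if d <= thresholds[1]:
        return "medium"
    return "low"

print(detail_level((0, 0, 0), (100, 0, 0)))   # high: the object is close
print(detail_level((0, 0, 0), (2000, 0, 0)))  # low: the object is far away
```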

Unreal Engine modifications

We created and implemented several new functionalities for our application. For example, we created a timer for each avatar. The timer lets us measure the time spent during an evacuation simulation, so we can compare performances in our tests.

We also had to implement or adjust many other functionalities. For example, we implemented a sound alarm, which the supervisor initiates at the server to begin an evacuation simulation. Users hear this alarm at their own computers.
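The alarm's control flow is essentially a broadcast: the server triggers the event once and every connected client reacts. A minimal in-process sketch of that pattern follows; in the real system Unreal Engine's networking layer delivers the event, and the classes below are our simplification, not engine code.

```python
# In-process sketch of the supervisor-initiated alarm: the server notifies
# every registered client, which would then play the alarm sound locally.

class Client:
    def __init__(self, name):
        self.name = name
        self.alarm_heard = False

    def on_alarm(self):
        # In the simulation, this is where the client plays the alarm sound.
        self.alarm_heard = True

class Server:
    def __init__(self):
        self.clients = []

    def connect(self, client):
        self.clients.append(client)

    def sound_alarm(self):
        # Broadcast the alarm event to every connected client.
        for c in self.clients:
            c.on_alarm()

server = Server()
users = [Client(f"user{i}") for i in range(3)]
for u in users:
    server.connect(u)
server.sound_alarm()
print(all(u.alarm_heard for u in users))  # True
```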

We also adjusted the original avatar velocities, both walking and running, to more realistic values. The velocities are set as variables in Unreal Engine's code. The original value for the running variable was 600, and the walking variable was half that value. We had to decrease the running velocity to 112 and the walking velocity to 56, which suggests how much faster game avatars typically are than humans. The latter value represents a typical walking velocity of 1.5 m per second. Our evacuation simulations use only the walking velocity, which is less likely to cause accidents during emergency procedures.

We also created the head-up display (HUD). The HUD pops up in the lower-left part of the main simulation window when the user presses a button. It displays useful information—the timer for the evacuation simulation application—but it can display other information for other applications.
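Per Table 1, the DrawHUD function converts the elapsed time to the format hh:mm:ss for display. The same conversion step, sketched in Python (the function name is ours):

```python
# Format an elapsed time in seconds as hh:mm:ss, as the HUD chronometer does.

def format_hms(total_seconds: int) -> str:
    hours, rem = divmod(int(total_seconds), 3600)
    minutes, seconds = divmod(rem, 60)
    return f"{hours:02d}:{minutes:02d}:{seconds:02d}"

print(format_hms(54))    # 00:00:54  (a typical one-person exit time)
print(format_hms(3661))  # 01:01:01
```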

Figure 1. The IEN building's virtual model. The model uses real dimensions collected from architectural data and modeled objects imported into the Unreal Engine's editor, UnrealEd.


Figure 2 shows the original Unreal Engine classes in blue and the modified classes in red. Table 1 gives more detail on the modified classes. Strictly speaking, we don't modify the original classes but extend them with new classes that offer new functionalities.

We've made these modifications with no major difficulties, although we had to modify the config files for some of them. So far, we've been able to implement all the functionality we needed through these modifications.
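The extend-rather-than-modify approach is plain subclassing: for instance, Table 1 shows RVHUDEvac extending RVHUD just to switch the chronometer on. The same pattern, sketched in Python (the class and variable names mirror the article's; the bodies are our simplification of the UnrealScript code):

```python
# Extending a base class instead of editing it: RVHUDEvac only overrides
# the default value of mostrarRelogio so the chronometer is always shown.

class RVHUD:
    def __init__(self):
        self.mostrar_relogio = False  # chronometer hidden by default

class RVHUDEvac(RVHUD):
    def __init__(self):
        super().__init__()
        self.mostrar_relogio = True   # evacuation HUD forces the chronometer

print(RVHUD().mostrar_relogio, RVHUDEvac().mostrar_relogio)  # False True
```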

Results for real and virtual simulations

Figure 3 shows a simulation screen shot, corresponding to a third-person view of the avatar. The timer is displayed at the lower-left part of the window.

Figure 4 presents two photos of the LABRV projection screen during a simulation of a crowded-stair situation. Figure 4a shows the two projections composing the 3D stereo image and some people from our staff using polarized glasses. Figure 4b is


Figure 2. Unreal Engine classes scheme. The blue boxes indicate previously existing classes that stayed the same; the red boxes indicate modified classes.

Figure 3. A third-person view of an avatar walking toward the stairs in the simulated building.

Figure 4. Evacuation simulation projections on LABRV's screen: (a) stereo projection and (b) mono projection. The simulation shows a crowded-stair situation.


Table 1. Modifications to Unreal Engine classes (first part; continued below). Each entry lists the class, its description, its variables, and its functions.

RVPawn.uc (extends Pawn)
Description: Contains avatar-related definitions, such as animation, velocities, collision, 3D model, dresses, and abilities.
Variables: Doesn't include all animation and collision variables because it is an interface with RVBoy and RVGirl. Main variables are related to the avatar's dressing: skinAtual defines the default dress; skinAlternativos defines alternative dresses.
Functions: Besides inherited functions, includes two others, called by RVPlayerController, that can change the avatar's dress: rodarSkin and mudarSkin.

RVBoy.uc and RVGirl.uc (extend RVPawn)
Description: Complement RVPawn with specific definitions for the male and female avatar models, respectively.
Variables: Don't have their own variables, but change inherited ones. In RVGirl, the collision's height and width are different because the female avatar is smaller than the male one.
Functions: Don't have their own functions, and don't change inherited ones.

RVMapSound.uc (extends Actor)
Description: Added to the map in UnrealEd to emit a predefined sound that is initiated or stopped by RVAlarme.
Variables: Doesn't have its own variables, but changes inherited ones. The most important variables are related to the emitted sound, the sound volume, and how far the sound propagates.
Functions: Called from RVAlarme: ligar initiates the sound; desligar stops it.

RVAlarme.uc (extends Volume)
Description: Implements a volume that can be played; initiates or stops the alarm represented by RVMapSound.
Variables: RVLigar (Boolean) is configured in UnrealEd. If it's True, when an avatar enters the volume associated with the alarm, RVAlarme will activate all RVMapSound instances found in the map; otherwise, it will stop them.
Functions: Changes the function Touch to initiate or stop RVMapSound.

RVHUD.uc (extends HUD)
Description: Presents information over the main window, such as the chronometer.
Variables: PlayerOwner defines the HUD instance's owner; HUDRelogio saves the clock image shown at the chronometer's side; mostrarRelogio (Boolean) shows the chronometer.
Functions: DrawHUD draws what is needed on the screen and converts time to the format hh:mm:ss.

RVHUDEvac (extends RVHUD)
Description: Complements RVHUD, forcing the chronometer's exhibition.
Variables: Doesn't have its own variables, but switches the variable mostrarRelogio to True.
Functions: Doesn't have its own functions, and doesn't change the inherited ones.

RVScoreBase.uc (extends ScoreBoard)
Description: Exhibits a basic panel that shows the scene participants' names when requested (by pressing the F1 key); additional information can be added.
Variables: Some variables control the exhibition in the panel.
Functions: UpdateScoreBoard runs in a loop and, each second, generates and exhibits the panel.


a mono image to show a clearer view of the simulated screen.

We have performed both virtual and corresponding real simulations for a predefined exit route with no obstacles, enabling a comparative analysis of the times spent in both kinds of simulation.

In real simulations, real people perform the evacuation scenario, walking in the real building. The exit ways are defined prior to the evacuation in terms of initial and final points, and the path is defined by the corridors, stairs, and doors. The final point is also called the meeting point. It's located outside the building, where everyone must end up. People walk in a natural way, searching for shorter paths, as they would do in real evacuations. Each person counts his or her own evacuation time with a chronometer, registering the total time spent when reaching the meeting point.

In virtual simulations, avatars perform the evacuation scenario in Unreal Engine's virtual environment. At this time, the avatars aren't autonomous, so each one is controlled by a user at an individual computer. Each user sees his or her own avatar and all the others as well. Avatars can interact with each other and experience collision effects. During simulations, each user controls his or her avatar to walk in a natural way, simulating realistic behaviors. Decisions in the virtual simulations follow human cognition, as happens in real situations—for

Table 1. Modifications to Unreal Engine classes (continued).

RVScoreEvac (extends RVScoreBase)
Description: Exists just for embedding purposes, because ScoreBoard can't be used directly; it is only declared.
Variables: Doesn't have its own variables.
Functions: Doesn't have its own functions.

RVGameInfo.uc (extends GameInfo)
Description: Points to the classes that will be used during simulation; also indicates the type of simulation that will be performed.
Variables: Doesn't have its own variables, but changes inherited ones that point to RVHUD, RVScoreBoard, and RVPlayerController.
Functions: Doesn't have its own functions, only inherited ones.

RVEvacInfo.uc (extends RVGameInfo)
Description: Has specific functions for evacuation simulation.
Variables: Doesn't have its own variables, just changes the existing ones to point to RVHUDEvac and RVScoreEvac.
Functions: Doesn't have its own functions, and doesn't change the inherited ones.

RVGameReplicationInfo.uc (extends GameReplicationInfo)
Description: Controls the chronometer during networked simulations; can be used to stop it.
Variables: ContandoTempo (Boolean): if True, the chronometer runs; otherwise it stops.
Functions: Cronometrar switches the value of the variable ContandoTempo.

RVPlayerController.uc (extends PlayerController)
Description: Interfaces the user with his or her avatar; receives commands from the console and interacts with the mouse and the keyboard.
Variables: Doesn't have its own variables, only inherited ones.
Functions: Called from the user's console: zerarCronometro initializes the chronometer to zero (depends on RVPlayerReplicationInfo); cronometrar starts or stops the chronometer (depends on RVPlayerReplicationInfo); mudarRoupa changes the avatar's dressing (depends on RVPawn); escolherRoupa, followed by a number, specifies a specific dress to be used (depends on RVPawn); tocarAlarme initializes the alarms in the map (depends on RVMapSound); pararAlarme stops these alarms (depends on RVMapSound).


example, an avatar seeing other avatars that have turned back from a closed door will assume the door is blocked and act appropriately. In virtual simulations, users can see their time counter on their computer screen; they must register the total time spent to reach the meeting point.

Figure 5 shows an external view of the simulated building and the meeting point (the red sign at left).

We have performed some tests with one person evacuating at a time and with a few people evacuating simultaneously. Tables 2 and 3 show these results for the cases with only one and with three people exiting simultaneously. A typical run takes around one minute for a predefined exit, both for one and for three people evacuating at a time.

The results closely agree and are repeatable, with low variability. The greater deviations occurred in the real evacuation times, not in the virtual ones, so the virtual simulation results are also repeatable, with low variability.

Despite the good results, we've noticed that people need some training to play the game, because the keyboard and the mouse control the avatars. People have to move the mouse to point the avatar in the desired walking direction. They press the left button to move forward and use the left, right, up, and down arrow keys to walk sideways or backward, for example. Sometimes they have to use the mouse and the keys simultaneously (as when opening a door, for example).

Gamers are usually young people who are well accustomed to this kind of task and handle a keyboard and mouse well. This might not be the case for people who aren't accustomed to playing computer games, so they need some training to get used to such tasks; alternatively, we might decide to include only experienced users in our simulations, to avoid difficulties. Our kind of simulation doesn't push the control requirements any further than that.

We've noticed that some difficulties arise when more than one avatar tries to exit through a door. An avatar's collision volume is defined as a cylinder. Thus, users notice their avatar is colliding with another one by either visualizing the other avatar near it or feeling that the controlled avatar isn't walking exactly in the desired direction, but is instead deviating around the collided avatar.

Thus, depending on the door's dimensions, not all avatars can exit through it simultaneously. A user might have to wait a bit for the others to cross before moving on. Figure 6 shows such a crowded exit situation, from the nearest avatar's third-person view. When several avatars are near a door, they can block the exit, which keeps other avatars from going immediately through the door. Instead, they must wait their turn.
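Because each collision volume is a cylinder, two avatars on the same floor level collide exactly when their footprint circles overlap, which is why a narrow door admits only so many avatars at once. A simplified 2D check of that condition (our sketch; the radii in engine units are illustrative, not the engine's actual values):

```python
# Two vertical cylinders at the same height collide when their footprint
# circles overlap: the center distance is less than the sum of the radii.

import math

def cylinders_collide(p1, r1, p2, r2):
    """True if cylinders with (x, y) centers p1, p2 and radii r1, r2
    overlap horizontally."""
    return math.dist(p1, p2) < r1 + r2

# Two avatars 30 units apart with 21-unit radii overlap (30 < 42):
print(cylinders_collide((0, 0), 21.0, (30, 0), 21.0))   # True
print(cylinders_collide((0, 0), 21.0, (100, 0), 21.0))  # False
```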

Figure 5. An external view of the simulated building. The red sign shows the evacuation meeting point.

Table 2. Exit times (mm:ss) during virtual and real simulations, for one person at a time.

Person    Virtual    Real
1         00:54      00:54
1         00:54      00:53
1         00:54      00:58
1         00:55      00:54
1         00:55      00:56
1         00:54      00:54
1         00:55      00:59
1         00:54      00:58
1         00:54      00:55
1         00:54      00:56
Average   00:54      00:56

Table 3. Exit times (mm:ss) during virtual and real simulations, for three persons at the same time.

Person    Virtual    Real
1         01:00      01:07
2         00:56      01:05
3         00:57      01:03
1         01:00      01:07
2         00:56      01:00
3         00:54      00:59
1         00:58      01:05
2         00:55      01:04
3         00:57      01:02
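The claim that the real runs vary more than the virtual ones can be checked directly from Table 2's one-person exit times. A quick recomputation in Python (means and population standard deviations, in seconds):

```python
# Recompute mean and spread of the one-person exit times from Table 2.

from statistics import mean, pstdev

virtual = [54, 54, 54, 55, 55, 54, 55, 54, 54, 54]
real    = [54, 53, 58, 54, 56, 54, 59, 58, 55, 56]

print(round(mean(virtual), 1), round(pstdev(virtual), 2))  # 54.3 0.46
print(round(mean(real), 1), round(pstdev(real), 2))        # 55.7 1.95
# The virtual spread is well under a second; the real spread is about 2 s,
# consistent with the averages of 00:54 and 00:56 reported in Table 2.
```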


Except for the case when more than one avatar tries to exit through a door at the same time, no difficulties arise when collisions occur. If avatars collide when walking through a circulation area, for example, whether moving in the same direction or not, they just deviate and continue to walk.

The doors open when an avatar approaches them. When a door opens, it actually pushes the avatar back. Knowing this, the user controlling the avatar can move back a bit, letting the door open completely, so the avatar isn't left standing between the door and the wall.

Thus, more training is needed to perform this task, and again, friendlier interfaces can improve navigation.

We expect to perform further experiments with more user-friendly interfaces than a keyboard and mouse. We think joysticks would be sufficient for friendlier navigation; there would be no need for devices such as a 3D mouse or Wiimote.

We're planning to use autonomous avatars, or bots, that will follow avatars controlled by a few people. We will then need only a few well-trained people to simulate evacuations of crowded environments and evaluate, for example, the number of people at which the virtual and real simulation results diverge.

Acknowledgments

We acknowledge a member of our staff, Douglas S. Sales, and also the undergraduate students who participated during the virtual modeling stage: Felipe M. Botelho, Daniel M. Moreira, Felipe R. Bastos, and Beatriz A.R. Oliveira.

References

1. M. Lewis and J. Jacobson, "Introduction: Game Engines in Scientific Research," Comm. ACM, vol. 45, no. 1, 2002, pp. 27–31.
2. A. Rosenbloom, "Introduction: A Game Experience in Every Application," Comm. ACM, vol. 46, no. 7, 2003, pp. 28–31.
3. M. Zyda, "Introduction: Creating a Science of Games," Comm. ACM, vol. 50, no. 7, 2007, pp. 26–29.
4. A.C.A. Mól et al., "Virtual Environment Simulation as a Tool to Support Evacuation Planning," Proc. 2007 Int'l Nuclear Atlantic Conf. (INAC 07), Associação Brasileira de Energia Nuclear, DVD-ROM; www.inac2007.com.br/dvd/pdf_dvd/R18_979.pdf.

Contact authors Antônio Carlos A. Mól and Carlos Alexandre F. Jorge at {mol, calexandre}@ien.gov.br.

Contact Applications department editor Mike Potel at [email protected].


Figure 6. A crowded exit door's view. This situation keeps approaching avatars from going through the door until the first crowd has exited.
