10.1.1.63.4105



Introduction to Augmented Reality

R. Silva, J. C. Oliveira, G. A. Giraldi

National Laboratory for Scientific Computation (LNCC), Av. Getulio Vargas, 333 - Quitandinha - Petropolis-RJ, Brazil

rodrigo, jauvane, [email protected]

ABSTRACT

This paper presents an overview of the basic aspects and main concepts of Augmented Reality (AR). It describes the principal fields in which AR is applied today and the most important AR devices. Characteristics of Augmented Reality systems are discussed, and future directions are outlined.

Keywords: Augmented Reality, Virtual Reality, Scientific Visualization

    1 INTRODUCTION

Augmented Reality (AR) is a new technology that involves the overlay of computer graphics on the real world (Figure 1). One of the best overviews of the technology is [4], which defined the field, described many problems, and summarized the developments up to that point. That paper provides a starting point for anyone interested in researching or using Augmented Reality.

AR falls within a more general context termed Mixed Reality (MR) [20], which refers to a multi-axis spectrum of areas that covers Virtual Reality (VR), AR, telepresence, and other related technologies.

Virtual Reality is a term used for computer-generated 3D environments that allow the user to enter and interact with synthetic environments [9][27][28]. Users are able to immerse themselves to varying degrees in the computer's artificial world, which may either be a simulation of some form of reality [10] or the simulation of a complex phenomenon [33][9].

Figure 1: AR example with virtual chairs and a virtual lamp.

In telepresence, the fundamental purpose is to extend the operator's sensory-motor facilities and problem-solving abilities to a remote environment. In this sense, telepresence can be defined as a human/machine system in which the human operator receives sufficient information about the teleoperator and the task environment, displayed in a sufficiently natural way, that the operator feels physically present at the remote site [26]. It is very similar to virtual reality, in which we aim to achieve the illusion of presence within a computer simulation; telepresence aims to achieve the illusion of presence at a remote location.

AR can be considered a technology between VR and telepresence. While in VR the environment is completely synthetic, and in telepresence it is completely real, in AR the user sees the real world augmented with virtual objects.

When designing an AR system, three aspects must be kept in mind: (1) combination of real and virtual worlds; (2) interactivity in real time; (3) registration in 3D.
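The three aspects above can be pictured as a per-frame loop: capture the real world, track the camera pose for registration, and composite the virtual layer over the real image. The following sketch uses stub functions (all names hypothetical, standing in for a real camera driver, tracker, and renderer), not any particular AR system:

```python
import numpy as np

def capture_real_frame(h=4, w=4):
    """Stub for the real-world camera image (grayscale)."""
    return np.full((h, w), 0.5)

def track_camera_pose():
    """Stub tracker: returns a 4x4 camera pose (identity = no motion)."""
    return np.eye(4)

def render_virtual_layer(pose, h=4, w=4):
    """Stub renderer: draws a virtual object registered via the pose.
    Returns an (image, alpha) pair; alpha marks where the object is."""
    layer = np.zeros((h, w))
    alpha = np.zeros((h, w))
    layer[1:3, 1:3] = 1.0   # the "virtual object"
    alpha[1:3, 1:3] = 1.0   # opaque only where the object is
    return layer, alpha

def augment_frame():
    real = capture_real_frame()              # (1) the real world
    pose = track_camera_pose()               # (3) 3D registration input
    virtual, alpha = render_virtual_layer(pose)
    # (2) per-frame compositing: virtual over real where alpha = 1
    return alpha * virtual + (1 - alpha) * real

frame = augment_frame()
print(frame)
```

In a real system each stub would run at interactive rates, which is exactly why rendering cost and tracking latency dominate AR system design.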

Wearable devices, like Head-Mounted Displays (HMDs) [28], can be used to show the augmented scene, but other technologies are also available [4].

Besides the three aspects mentioned, another one can be incorporated: portability. In almost all virtual environment systems, the user is not allowed to move around much due to device limitations. However, some AR applications require the user to actually walk through a large environment. Thus, portability becomes an important issue.

For such applications, the 3D registration becomes even more complex. Wearable computing applications generally provide unregistered text/graphics information using a monocular HMD. These systems are more of a see-around setup than an Augmented Reality system by the narrow definition. Hence, computing platforms and wearable display devices used in AR must often be developed for more general applications (see Section 3).

The field of Augmented Reality has existed for just over one decade, but the growth and progress in the past few years has been remarkable [12]. Since [4], the field has grown rapidly. Several conferences specialized in this area were started, including the International Workshop and Symposium on Augmented Reality, the International Symposium on Mixed Reality, and the Designing Augmented Reality Environments workshop.

    2 AR Components

    2.1 Scene Generator

The scene generator is the device or software responsible for rendering the scene. Rendering is not currently one of the major problems in AR, because only a few virtual objects need to be drawn, and they often do not have to be realistically rendered to serve the purposes of the application [4].

    2.2 Tracking System

The tracking system is one of the most important problems in AR systems, mostly because of the registration problem [3]. The objects in the real and virtual worlds must be properly aligned with respect to each other, or the illusion that the two worlds coexist will be compromised. In industry, many applications demand accurate registration, especially medical systems [16][4].
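A small numerical illustration of why tracking accuracy matters for registration: under a simple pinhole projection (the focal length and principal point below are made-up example values), even a one-centimeter error in the tracked camera position visibly shifts where a virtual object lands on screen:

```python
import numpy as np

def project(point_w, R, t, f=500.0, cx=320.0, cy=240.0):
    """Pinhole projection of a world point into pixel coordinates."""
    p_cam = R @ point_w + t
    return np.array([f * p_cam[0] / p_cam[2] + cx,
                     f * p_cam[1] / p_cam[2] + cy])

anchor = np.array([0.0, 0.0, 2.0])      # real-world anchor point, 2 m ahead
R, t = np.eye(3), np.zeros(3)           # true camera pose

true_px = project(anchor, R, t)

# A 1 cm tracker error in camera position shifts the virtual overlay:
t_err = t + np.array([0.01, 0.0, 0.0])
wrong_px = project(anchor, R, t_err)

print(np.linalg.norm(wrong_px - true_px))  # misalignment: 2.5 pixels
```

At this (hypothetical) focal length the virtual object slides 2.5 pixels off its real counterpart, enough to break the illusion that the two worlds coexist.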

    2.3 Display

The technology for AR is still in development and solutions depend on design decisions. Most display devices for AR are HMDs (Head Mounted Displays), but other solutions can be found (see Section 3).

When combining the real and virtual worlds, two basic choices are available: optical and video technology. Each of them has tradeoffs depending on factors like resolution, flexibility, field of view, and registration strategies, among others [4].

Display technology continues to be a limiting factor in the development of AR systems. There are still no see-through displays that have sufficient brightness, resolution, field of view, and contrast to seamlessly blend a wide range of real and virtual imagery. Furthermore, many technologies that begin to approach these goals are not yet sufficiently small, lightweight, and low-cost. Nevertheless, the past few years have seen a number of advances in see-through display technology, as we shall see next.


    3 AR Devices

Five major classes of AR can be distinguished by their display type: Optical See-Through, Virtual Retinal Systems, Video See-Through, Monitor Based AR, and Projector Based AR.

The following sections present the corresponding devices and their main features.

    3.1 Optical See-Through HMD

Optical See-Through AR uses a transparent Head Mounted Display to show the virtual environment directly over the real world (Figures 2 and 3). It works by placing optical combiners in front of the user's eyes. These combiners are partially transmissive, so that the user can look directly through them to see the real world. The combiners are also partially reflective, so that the user sees virtual images bounced off the combiners from head-mounted monitors.
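The partially transmissive, partially reflective combiner can be modeled to first order as a weighted sum of the two light paths. This is a deliberate simplification; the 0.7/0.3 split below is an arbitrary example, not a property of any particular HMD:

```python
import numpy as np

# The eye receives a fraction T of the real-world light transmitted
# through the combiner, plus a fraction Rc of the head-mounted
# monitor's light reflected off it.
T, Rc = 0.7, 0.3
real_scene = np.array([0.8, 0.8, 0.8])     # bright real background (RGB)
virtual_img = np.array([0.0, 1.0, 0.0])    # green virtual annotation

perceived = T * real_scene + Rc * virtual_img
print(perceived)   # the virtual image appears ghosted over the real world
```

The model also shows a known limitation of optical see-through: because the real scene always leaks through (T > 0), a virtual object can never fully occlude a bright real background.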

Prime examples of Optical See-Through AR systems are the various augmented medical systems. The MIT Image Guided Surgery project has concentrated on brain surgery [15]. UNC has been working with an AR-enhanced ultrasound system and other ways to superimpose radiographic images on a patient [23]. There are many other Optical See-Through systems, as this seems to be the main direction for AR.

Despite these specific examples, there is still a lack of general-purpose see-through HMDs. One issue for Optical See-Through AR is the alignment of the HMD optics with the real world. A good HMD allows adjustments to fit the eye position and comfort of individual users. It should also be easy to move it out of the way when not needed. However, these movements will alter the registration of the VE over the real world and require re-calibration of the system. An expensive solution would be to instrument the adjustments, so the system could automatically compensate for the motion. Such devices are not reported in the literature.

    Figure 2: Optical See-Through HMD.

    Figure 3: Optical See-Through Scheme.

Recent Optical See-Through HMDs are being built by well-known companies like Sony and Olympus and have support for occlusion and varying accommodation (the process of focusing the eyes on objects at a particular distance).

There are very small prototypes that can be attached to conventional eyeglasses (Figure 4).

    Figure 4: Eyeglass display with holo-graphic element.

    3.2 Virtual Retinal Systems

The VRD (Virtual Retinal Display) was invented at the University of Washington in the Human Interface Technology Lab (HIT) in 1991. The aim was to produce a full-color, wide field-of-view, high-resolution, high-brightness, low-cost virtual display. Microvision Inc. has the exclusive license to commercialize the VRD technology (Figure 5).


This technology has many potential applications, from head-mounted displays (HMDs) for military/aerospace applications to medical purposes.

The VRD projects a modulated beam of light (from an electronic source) directly onto the retina of the eye, producing a rasterized image (Figure 6). The viewer has the illusion of seeing the source image as if standing two feet away in front of a 14-inch monitor. In reality, the image is on the retina of the eye and not on a screen. The quality of the image is excellent, with stereo view, full color, wide field of view, and no flickering [13][24].

    Figure 5: Virtual Retinal System HMD.

    Figure 6: Virtual Retinal System Scheme.

    3.3 Video See-Through HMD

Video See-Through AR uses an opaque HMD to display merged video of the VE and the view from cameras on the HMD (Figure 7).

This approach is a bit more complex than optical see-through AR, requiring proper location of the cameras (Figure 8). However, video composition of the real and virtual worlds is much easier. A variety of solutions are available, including chroma-keying and depth mapping. The Mixed Reality Systems Lab (MRSL) of Japan presented a stereo video see-through HMD at ISAR 2000. This device addresses some of the parallax issues related to the location of the cameras relative to the eyes.
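Chroma-key composition, one of the techniques mentioned above, is simple to sketch: pixels of the rendered virtual frame that match a reserved key color are treated as transparent and replaced by the camera video. The pure-green key and tolerance below are arbitrary example choices:

```python
import numpy as np

def chroma_key_composite(real, virtual, key=(0.0, 1.0, 0.0), tol=0.1):
    """Compose a rendered virtual frame over the real camera frame.
    Pixels of `virtual` within `tol` of the key color are transparent."""
    dist = np.linalg.norm(virtual - np.array(key), axis=-1)
    mask = (dist > tol)[..., None]          # True where virtual is opaque
    return np.where(mask, virtual, real)

real = np.full((2, 2, 3), 0.5)              # stand-in camera frame
virtual = np.zeros((2, 2, 3))
virtual[..., 1] = 1.0                       # background = key color
virtual[0, 0] = [1.0, 0.0, 0.0]             # one red virtual pixel

out = chroma_key_composite(real, virtual)
print(out[0, 0], out[1, 1])  # virtual pixel kept, key pixel replaced
```

Depth mapping replaces the color test with a per-pixel depth comparison, which is what allows real objects to correctly occlude virtual ones.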

    Figure 7: Video See-Through HMD.

    Figure 8: Video See-Through Scheme.

    3.4 Monitor Based

Monitor Based AR also uses merged video streams, but the display is a more conventional desktop monitor or a handheld display. It is perhaps the least difficult AR setup, as it eliminates HMD issues. Princeton Video Image, Inc. has developed a technique for merging graphics into real-time video streams. Their work is regularly seen as the virtual first-down line in American football games. It is also used for placing advertising logos into various broadcasts.

    3.5 Projection Displays

Figure 9: Monitor Based Scheme.

Figure 10: Monitor Based Example.

Projector Based AR uses real-world objects as the projection surface for the virtual environment (Figures 11 and 12).

It has applications in industrial assembly, product visualization, etc. Projector Based AR is also well suited to multiple-user situations. Alignment of the projectors and the projection surfaces is critical for successful applications.

    Figure 11: Projector Based AR.

    4 Applications

Augmented Reality technology has many possible applications in a wide range of fields, including entertainment, education, medicine, engineering, and manufacturing. Other potential areas of application are expected to appear with the dissemination of this technology.

    Figure 12: Projector Based AR.

    4.1 Medical

Because imaging technology is so pervasive throughout the medical field, it is not surprising that this domain is viewed as one of the more important ones for augmented reality systems. Most medical applications deal with image-guided surgery (Figure 13) [15].

    Figure 13: Image Guided surgery.

Pre-operative imaging studies of the patient, such as CT (Computed Tomography) or MRI (Magnetic Resonance Imaging) scans, provide the surgeon with the necessary view of the internal anatomy. From these images the surgery is planned.

Visualization of the path through the anatomy of the affected area (where a tumor must be removed, for example) is done by first creating a 3D model from the multiple views and slices in the pre-operative study. The model is then projected over the target surface to help the surgical procedure.

Augmented reality can be applied so that the surgical team can see the CT or MRI data correctly registered on the patient in the operating theater while the procedure is in progress. Being able to accurately register the images at this point will enhance the performance of the surgical team and eliminate the need for the painful and cumbersome stereotactic frames currently used for registration [15].

Another application for augmented reality in the medical domain is in ultrasound imaging [2]. Using an optical see-through display, the ultrasound technician can view a volume-rendered image of the fetus overlaid on the abdomen of the pregnant woman. The image appears as if it were inside the abdomen and is correctly rendered as the user moves [25] (Figure 14).

    Figure 14: Ultrasound Imaging.

    4.2 Entertainment

A simple form of augmented reality has been in use in the entertainment and news business for quite some time. Whenever you watch the evening weather report, the presenter stands in front of changing weather maps. In the studio, the reporter is actually standing in front of a blue screen. This real image is augmented with computer-generated maps using a technique called chroma-keying. Another entertainment area where AR is being applied is game development [31] (Figures 15 and 16).

Princeton Electronic Billboard has developed an augmented reality system that allows broadcasters to insert advertisements into specific areas of the broadcast image (Figure 17). For example, while broadcasting a baseball game, this system would be able to place an advertisement in the image so that it appears on the outfield wall of the stadium.

Figure 15: Games using a virtual table and synthetic objects.

    Figure 16: VR-Border Guards, an AR game

The electronic billboard requires calibration to the stadium, performed by taking images from typical camera angles and zoom settings in order to build a map of the stadium, including the locations in the images where advertisements will be inserted. By using pre-specified reference points in the stadium, the system automatically determines the camera angle being used and, referring to the pre-defined stadium map, inserts the advertisement into the correct place.
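The paper does not describe the system's internals, but a standard way to realize this "reference points to camera view" mapping for a planar surface (such as a stadium wall) is a plane-to-plane homography. The sketch below estimates one with the classic Direct Linear Transform; the point coordinates are toy values, not real calibration data:

```python
import numpy as np

def homography_from_points(src, dst):
    """DLT: 3x3 homography mapping four 2D src points to dst points
    (e.g. stadium-map reference points to their image positions)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.array(A))
    return Vt[-1].reshape(3, 3)   # null-space vector, up to scale

def apply_h(H, p):
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]

# Reference points on the stadium map vs. where they appear on screen
# (toy numbers: the camera view is a shifted, scaled copy of the map).
map_pts = [(0, 0), (10, 0), (10, 5), (0, 5)]
img_pts = [(100, 50), (300, 50), (300, 150), (100, 150)]
H = homography_from_points(map_pts, img_pts)

# The ad's map-space location transfers to the correct image location:
print(apply_h(H, (5, 2.5)))
```

Once H is known for the current camera angle and zoom, the advertisement's corners in the stadium map can be warped into the broadcast image every frame.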

Figure 17: Advertisement in a football game.


    4.3 Military Training

The military has been using displays in cockpits that present information to the pilot on the windshield of the cockpit or the visor of the flight helmet (Figure 18). This is a form of augmented reality display.

By equipping military personnel with helmet-mounted visor displays or a special-purpose rangefinder, the activities of other units participating in the exercise can be imaged. While looking at the horizon during a training session, for example, the display-equipped soldier could see a virtual helicopter rising above the tree line. This helicopter could be being flown in simulation by another participant. In wartime, the display of the real battlefield scene could be augmented with annotation information or highlighting to emphasize hidden enemy units [32].

    Figure 18: Military Training.

    4.4 Engineering Design

Imagine that a group of designers is working on the model of a complex device for their clients.

The designers and clients want to do a joint design review even though they are physically separated. If each of them had a conference room equipped with an augmented reality display, this could be accomplished.

The physical prototype that the designers have mocked up is imaged and displayed in the clients' conference room in 3D. The clients can walk around the display, looking at different aspects of it. To hold discussions, the clients can point at the prototype to highlight sections, and this will be reflected on the real model in the augmented display that the designers are using. Alternatively, at an earlier stage of the design, before a prototype is built, the view in each conference room can be augmented with a computer-generated image of the current design built from the CAD files describing it [1] (Figure 19).

Figure 19: AR applied to Engineering Design. This figure shows a real object augmented with virtual tubes.

    4.5 Robotics and Telerobotics

In the domain of robotics and telerobotics, an augmented display can assist the user of the system [17][21].

A telerobotic operator uses a visual image of the remote workspace to guide the robot. Annotation of the view is as useful here as it is when the scene is directly in front of the operator. In addition, augmentation with wireframe drawings of structures in the view can facilitate visualization of the remote 3D geometry.

If the operator is attempting a motion, it could first be practiced on a virtual robot that is visualized as an augmentation to the real scene. The operator can decide to proceed with the motion after seeing the results. The robot motion could then be executed directly, which in a telerobotics application would eliminate any oscillations caused by long delays to the remote site. Another use of robotics and AR is in remote medical operation (Figures 20 and 21).

    7

  • 7/30/2019 10.1.1.63.4105

    8/11

Figure 20: Virtual surgery using robot arms.

Figure 21: Robotics using AR for remote medical operation.

4.6 Manufacturing, Maintenance and Repair

When maintenance technicians approach a new or unfamiliar piece of equipment, instead of opening several repair manuals they could put on an augmented reality display. In this display, the image of the equipment would be augmented with annotations and information pertinent to the repair. For example, the location of fasteners and attachment hardware that must be removed would be highlighted (Figure 22).

Boeing built an experimental system in which technicians are guided by an augmented display that shows the routing of the cables on a generic frame used for all harnesses (Figure 23). The augmented display allows a single fixture to be used for making the multiple harnesses [29].

Figure 22: AR used to aid mechanical work.

    Figure 23: AR applied to maintenancework.

    4.7 Collaborative AR

AR addresses two major issues with collaboration: seamless integration with existing tools and practices, and enhancing practice by supporting remote and co-located activities that would otherwise be impossible.

Collaborative AR systems have been built using projectors, hand-held, and head-worn displays. By using projectors to augment the surfaces in a collaborative environment, users are unencumbered, can see each other's eyes, and are guaranteed to see the same augmentations [6].

Examples of collaborative AR systems using see-through displays include both those that use see-through handheld displays and those that use see-through head-worn displays [14] (Figure 24).

    5 Visualization Issues

Researchers have begun to address problems in displaying information in AR caused by the nature of AR technology or its displays. Work has been done on the correction of registration errors and on avoiding the hiding of critical data due to density problems.

Figure 24: The Studierstube collaborative AR system.

    5.1 Visualization Errors

In some AR systems, registration errors are significant and unavoidable. For example, the measured location of an object in the environment may not be known accurately enough to avoid visible registration error. Under such conditions, one approach for rendering an object is to visually display the area in screen space where the object could reside, based upon the expected tracking and measurement errors [19]. This guarantees that the virtual representation always contains its real counterpart.
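The "area where the object could reside" can be sketched concretely: project not just the nominal 3D point but every corner of its position-uncertainty box, and draw the enclosing 2D region. This is an illustrative reconstruction of the idea in [19], with made-up camera and error values, not that system's actual algorithm:

```python
import numpy as np

def error_region(point_cam, pos_err, f=500.0):
    """Screen-space bound for a virtual object under tracking error.
    Projects the nominal camera-space point plus all corners of its
    +/- pos_err uncertainty box and returns the enclosing 2D box;
    drawing this region guarantees it contains the real counterpart."""
    corners = []
    for dx in (-pos_err, pos_err):
        for dy in (-pos_err, pos_err):
            for dz in (-pos_err, pos_err):
                p = point_cam + np.array([dx, dy, dz])
                corners.append([f * p[0] / p[2], f * p[1] / p[2]])
    corners = np.array(corners)
    return corners.min(axis=0), corners.max(axis=0)

# Object nominally 2 m ahead, with 2 cm of position uncertainty:
lo, hi = error_region(np.array([0.0, 0.0, 2.0]), pos_err=0.02)
print(lo, hi)   # the drawn region grows with tracking uncertainty
```

As tracking improves (smaller `pos_err`), the displayed region shrinks toward the exact rendering, so the visualization degrades gracefully with error.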

Another approach, when rendering virtual objects that should be occluded by real objects, is to use a probabilistic function that gradually fades out the hidden virtual object along the edges of the occluded region, making registration errors less objectionable [11].
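A one-dimensional sketch of that fade-out idea: instead of hard-clipping the virtual object at the estimated occluding edge, attenuate its opacity linearly over a band of pixels around the edge, so a small registration error produces a soft artifact rather than a visible seam. The linear ramp here stands in for the probabilistic function of [11]:

```python
import numpy as np

def soft_occlusion_alpha(x, edge, width):
    """Opacity ramp across an occlusion boundary: 1 well inside the
    visible side, 0 well inside the occluded side, linear in between."""
    return np.clip((x - edge) / width + 0.5, 0.0, 1.0)

x = np.arange(10, dtype=float)        # pixel column indices
alpha = soft_occlusion_alpha(x, edge=5.0, width=4.0)
print(alpha)   # a hard cut at column 5 becomes a gradual ramp around it
```

Multiplying the virtual object's color by this alpha before compositing yields the gradual fade along the occluded region's edges.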

5.2 Removing Real Objects from the Environment

The problem of removing real objects is more than simply extracting depth information from a scene. The system must also be able to segment individual objects in that environment. A semi-automatic method for identifying objects and their locations in the scene through silhouettes is found in [18]. This enables the insertion of virtual objects and deletion of real objects without an explicit 3D reconstruction of the environment (Figure 25).

Figure 25: Virtual/real occlusions. The brown cow and tree are virtual (the rest is real).

    5.3 Photorealistic Rendering

A key requirement for improving the rendering quality of virtual objects in AR applications is the ability to automatically capture the environmental illumination information [7][22][30].

For example, [8] presents a method that, using only an uncalibrated camera, allows the capture of object geometry and appearance and then, at a later stage, rendering and AR overlay into a new scene.
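A toy illustration of why captured illumination matters: a virtual object shaded with the light direction actually estimated from the real scene blends in, while an arbitrary default light makes it look inconsistent. The Lambertian model, light directions, and albedo below are made-up example values, not the method of any cited paper:

```python
import numpy as np

def lambert_shade(normal, light_dir, albedo=0.8):
    """Lambertian intensity of a surface normal under a directional light."""
    l = light_dir / np.linalg.norm(light_dir)
    return albedo * max(0.0, float(np.dot(normal, l)))

normal = np.array([0.0, 0.0, 1.0])          # virtual surface facing camera
scene_light = np.array([0.0, 0.0, 1.0])     # estimated from the real scene
default_light = np.array([1.0, 0.0, 0.0])   # arbitrary renderer default

print(lambert_shade(normal, scene_light))    # bright, consistent: 0.8
print(lambert_shade(normal, default_light))  # dark, inconsistent: 0.0
```

Real systems extend this idea to full environment maps and shadows, but the principle is the same: the virtual object's shading must be driven by the real scene's measured lighting.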

    6 Conclusions and Future Work

Despite the many recent advances in AR, much work remains to be done. Application development can be helped by using the available libraries. One of them is ARToolkit [5], which provides computer vision techniques to calculate a camera's position and orientation relative to marked cards so that virtual 3D objects can be overlaid precisely on the markers.

Here are some areas requiring further research if AR is to become commonly deployed.


Ubiquitous tracking and system portability: Several impressive AR demonstrations have generated compelling environments with nearly pixel-accurate registration. However, such demonstrations work only inside restricted, carefully prepared environments. The ultimate goal is a tracking system that supports accurate registration in any arbitrary unprepared environment, indoors or outdoors. Allowing AR systems to go anywhere also requires portable and wearable systems that are comfortable and unobtrusive.

Ease of setup and use: Most existing AR systems require expert users (generally the system designers) to calibrate and operate them. If AR applications are to become commonplace, then the systems must be deployable and operable by non-expert users. This requires more robust systems that avoid or minimize calibration and setup requirements.

Photorealistic and advanced rendering: Although many AR applications only need simple graphics such as wireframe outlines and text labels, the ultimate goal is to render virtual objects that are indistinguishable from real ones. This must be done in real time, without the manual intervention of artists or programmers. New techniques in image-based rendering must be considered in order to accomplish this task [7].

AR in all senses: Researchers have focused primarily on augmenting the visual sense. Eventually, compelling AR environments may require engaging other senses as well (touch, hearing, etc.).

    REFERENCES

[1] K. Ahlers and A. Kramer. Distributed augmented reality for collaborative design applications. European Computer Industry Research Center, 3-14, 1995.

[2] S. Andrei, D. Chen, C. Tector, A. Brandt, H. Chen, R. Ohbuchi, M. Bajura, and H. Fuchs. Case study: Observing a volume rendered fetus within a pregnant patient. Proceedings of IEEE Visualization, 17-21, 1993.

[3] R. Azuma. Tracking requirements for augmented reality. Communications of the ACM, 36(7): 50-51, 1993.

[4] R. Azuma. A survey of augmented reality. ACM SIGGRAPH, 1-38, 1997.

[5] M. Billinghurst, S. Baldis, E. Miller, and S. Weghorst. Shared space: Collaborative information spaces. Proc. of HCI International, 7-10, 1997.

[6] M. Billinghurst and H. Kato. Mixed reality - merging real and virtual worlds. Proc. International Symposium on Mixed Reality (ISMR 99), 261-284, 1999.

[7] S. Boivin and A. Gagalowicz. Image-based rendering for industrial applications. ERCIM News, 2001.

[8] D. Cobzas, K. Yerex, and M. Jagersand. Editing real world scenes: Augmented reality with image-based rendering. Proc. of IEEE Virtual Reality, 291-292, 2003.

[9] A. van Dam, A. Forsberg, D. Laidlaw, J. LaViola, and R. Simpson. Immersive VR for scientific visualization: A progress report. IEEE Computer Graphics and Applications, 20(6): 26-52, 2000.

[10] P. du Pont. Building complex virtual worlds without programming. EUROGRAPHICS '95 State of the Art Reports, 61-70, 1995.

[11] A. Fuhrmann et al. Occlusion in collaborative augmented environments. Computers & Graphics, 23(6): 809-819, 1999.

[12] R. Azuma et al. Recent advances in augmented reality. IEEE Computer Graphics and Applications, 20-38, 2001.

[13] R. Chinthammit et al. Head tracking using the virtual retinal display. Second IEEE and ACM International Symposium on Augmented Reality, 235-242, 2001.

[14] Z. Szalavári et al. Studierstube: An environment for collaboration in augmented reality. Virtual Reality Systems, Development and Application, 3(1): 37-48, 1998.

[15] W. Grimson, G. Ettinger, T. Kapur, M. Leventon, W. Wells, and R. Kikinis. Utilizing segmented MRI data in image-guided surgery. International Journal of Pattern Recognition and Artificial Intelligence, 11(8): 1367-97, 1998.

[16] R.L. Holloway. Registration errors in augmented reality systems. Technical Report TR95-016, The University of North Carolina, 1995.

[17] W.S. Kim and P. Schenker. An advanced operator interface design with preview/predictive displays for ground-controlled space telerobotic servicing. Proceedings of SPIE Vol. 2057: Telemanipulator Technology and Space Telerobotics, 96-107, 1993.

[18] V. Lepetit and M. Berger. Handling occlusions in augmented reality systems: A semi-automatic method. Proc. Int'l Symp. Augmented Reality, 137-146, 2000.

[19] B. MacIntyre and E. Coelho. Adapting to dynamic registration errors using level of error (LOE) filtering. Proc. Int'l Symp. Augmented Reality, 85-88, 2000.

[20] P. Milgram and F. Kishino. A taxonomy of mixed reality visual displays. IEICE Transactions on Information Systems, E77-D(12): 1321-1329, 1994.

[21] P. Milgram and S. Zhai. Applications of augmented reality for human-robot communications. Proceedings of the 1993 IEEE/RSJ International Conference on Intelligent Robots and Systems, 1467-1476, 1993.

[22] Y. Mukaigawa, S. Mihashi, and T. Shakunaga. Photometric image-based rendering for virtual lighting image synthesis. Proc. 2nd Int'l Workshop on Augmented Reality (IWAR 99), 115-124, 1999.

[23] R. Ohbuchi, D. Chen, and H. Fuchs. Incremental volume reconstruction and rendering for 3D ultrasound imaging. Visualization in Biomedical Computing, 1808: 312-323, 1992.

[24] H.L. Pryor, T.A. Furness, and E. Viirre. The virtual retinal display: A new display technology using scanned laser light. Proc. 42nd Human Factors and Ergonomics Society, 1149, 1998.

[25] F. Sauer, A. Khamene, B. Bascle, L. Schimmang, F. Wenzel, and S. Vogt. Augmented reality visualization of ultrasound images: System description, calibration and features. IEEE and ACM International Symposium on Augmented Reality, 30-39, 2001.

[26] T.B. Sheridan. Musing on telepresence and virtual presence. Presence, 1(1): 120-125, 1992.

[27] W. Sherman and A. Craig. Understanding Virtual Reality: Interface, Applications and Design. Morgan Kaufmann Publishers, 2003.

[28] R. Silva and G. Giraldi. Introduction to virtual reality. Technical Report 06/2003, LNCC, Brazil, 2003.

[29] D. Sims. New realities in aircraft design and manufacture. IEEE Computer Graphics and Applications, 14(2): 91, 1994.

[30] J. Stauder. Augmented reality with automatic illumination control incorporating ellipsoidal models. IEEE Trans. Multimedia, 1(2): 136-143, 1999.

[31] Z. Szalavári, E. Eckstein, and M. Gervautz. Collaborative gaming in augmented reality. VRST, Taipei, Taiwan, 195-204, 1998.

[32] E. Urban. The information warrior. IEEE Spectrum, 32(11): 66-70, 1995.

[33] D. Weimer. Frontiers of Scientific Visualization, chapter 9: Brave New Virtual Worlds, pages 245-278. John Wiley and Sons, Inc., 1994.
