


Automation in Construction 18 (2009) 1063–1069

Contents lists available at ScienceDirect

Automation in Construction

journal homepage: www.elsevier.com/locate/autcon

Utilization of ubiquitous computing for construction AR technology

Do Hyoung Shin a, Won-Suk Jang b,⁎
a Department of Civil Engineering, Inha University, 253 Yonghyun-Dong, Nam-Gu, Incheon, South Korea
b Department of Civil Engineering, Yeungnam University, 214-1 Dae-Dong, Gyeongsan-Si, Gyeongsangbuk-Do, 712-749, South Korea

⁎ Corresponding author. E-mail address: [email protected] (W.-S. Jang).

0926-5805/$ – see front matter © 2009 Elsevier B.V. All rights reserved.
doi:10.1016/j.autcon.2009.06.001

Article info

Article history: Accepted 1 June 2009

Keywords: Augmented reality; Ubiquitous computing; Construction; Motion tracking; Wireless sensor network

Abstract

Based on the investigation of the characteristics of construction sites in terms of AR technology suitability, this paper introduces the application of a new framework of ubiquitous augmented reality (AR) technology (titled U-AR) to a construction site, where high mobility, freedom from occlusion, good accuracy, and expandability into large-scale applications play a key role in meeting the practical requirements. U-AR is intended to enhance accessibility to the distributed networks that provide a gateway between a physical construction site and the digital information of AR. To illustrate the U-AR system, the detailed schematic architecture of the AR display technique, motion tracking method, and server implementation is discussed through the preliminary studies. The proposed architectures of U-AR indicate which display, tracker, and server technologies should be the focus of development for a compelling U-AR. The proposed U-AR is expected to be the most suitable technology for providing AR-based visual information on construction sites.

© 2009 Elsevier B.V. All rights reserved.

1. Introduction

Visualization is an area of high interest with respect to the application areas of computer technology in the architecture, engineering, and construction (AEC) industries. By contextually visualizing construction information that would otherwise be available in an abstract or minimal fashion, project participants can understand the aspects of a project more easily and a consistent shared understanding can be achieved more readily. Augmented reality (AR) is one of the computer technologies that can provide advanced visualization tools to the AEC industry.

AR is a mixed environment between the real environment and a virtual environment or virtual reality (VR). The real environment is a space consisting of 100% real objects, and VR creates an environment where the real world scene is entirely replaced with virtual spaces and objects. Meanwhile, AR creates an environment where virtual objects are superimposed onto a real world scene. That is, AR inserts spatially relevant digital information into a person's view of their real world surroundings.

The main feature of AR, enabling users to see virtual objects and a real world scene together, has attracted the attention of AEC researchers. Several research studies have demonstrated the feasibility of AR to provide visual aids for underground structures [22], architectural maintenance information [31], outdoor architectural designs [29], design detailing [9], building damage evaluation in the aftermath of disasters [10], etc. Based on a technology suitability study, Shin and Dunston [25] showed that AR is a potential technology that can aid several work tasks on a construction site for building and inspection, coordination, and interpretation and communication. By presenting construction information in a way that is easier to perceive, AR is expected to provide more cost- and labor-effective methods to perform the work tasks.

The problem, however, is that current AR technology has many limitations to be addressed before AR can be fully deployed on a construction site. Enabling technologies for AR include displays, tracking, registration, and calibration [3]. Among them, tracking technology continues to be one of the most prevalent challenges currently limiting AR applications. In order to provide AR systems for construction sites in a compelling way, these technologies must be appropriate and advanced for the typical characteristics of a construction site. Several studies demonstrated the feasibility of AR systems for construction (e.g. [4,6,21]). However, most of them focus on only one or two characteristics of construction sites and do not comprehensively consider the characteristics of construction sites in terms of AR technology suitability. Therefore, this study investigates the characteristics that can affect the format of AR systems in terms of technology suitability. Based on the investigation, this study also proposes the schematic architectures of the display technique, motion tracking method, and server implementation of the ubiquitous AR. The architectures indicate which tracking technologies, display techniques, and server functions should be the focus of development for compelling ubiquitous AR environments.

The following sections discuss how ubiquitous AR environments are suitable for construction sites, present their schematic architectures, and



explore displays, tracking systems, and servers for ubiquitous AR environments. The concept of ubiquitous AR environments is derived from the review of the current motion tracking techniques (Section 2) and the characteristics of construction sites (Section 3).

2. Current motion tracking techniques

The main motion tracking technologies are mechanical, magnetic, acoustic, inertial, and optical. However, each of them has shortcomings in accuracy, noise, or tracking range [1,15,18]. For example, mechanical trackers are very accurate, easy to build, and can have low latency, but they provide a limited working volume and can be cumbersome. Mechanical tracking systems available on the market include the Body Tracker II by Puppet Works. Magnetic trackers are easy to use, are accurate, and can have reasonable latency, but they are vulnerable to distortions by metallic objects in the environment and provide a small working volume. Polhemus produces AC electromagnetic trackers named the LIBERTY™, PATRIOT™, and FASTRAK™. Commercial DC electromagnetic trackers include the MotionStar® and Flock of Birds® by Ascension. Inertial trackers are lightweight and have no physical limits on the working volume, but they suffer error accumulation due to the integration of readings and experience drift in the gyroscope and accelerometer. Most commercial inertial trackers, such as the 3D-Bird by Ascension or the InertiaCube3 by InterSense, are 3DOF (degree-of-freedom) orientation trackers because inertial tracking usually provides too low a position accuracy. Acoustic trackers are inexpensive, small, and lightweight, but they are susceptible to noise and echo and vary with temperature, pressure, and humidity. They also require a fairly unobstructed path between the sources and the microphones. The Logitech Ultrasonic Tracker by FakeSpace Labs is one of the few ultrasonic trackers available on the market. Optical trackers are very accurate, have a good update rate, and hold great promise for large area tracking with minimal encumbrances. However, they are sensitive to optical noise and spurious light and suffer from line-of-sight problems. Commercial optical trackers include the Studio Camera Tracker by Motion Analysis Corporation, the Qualisys motion capture system, and the Vicon MX system.

Hybrid-tracking techniques can be used to exploit the strengths and compensate for the weaknesses of individual tracking technologies. For example, Foxlin [13] presented an orientation tracking method that combines inertial tracking and magnetic tracking. In their method, three orthogonal gyroscopes are incorporated to obtain three orientation measurements, and inclinometers and a compass are used to compensate for gyro drift. State et al. [28] developed a tracking system that integrates the accuracy of vision-based tracking with the robustness of magnetic tracking. In their system, video tracking of color-coded landmarks acts as the primary method for determining camera position and orientation, but if the image analyzer cannot locate enough landmarks, the magnetic tracker acts as the primary tracker. These tracking systems, however, cover limited ranges.
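The drift-compensation idea behind such hybrid orientation trackers can be illustrated with a one-axis complementary filter: the gyro integration is accurate over short intervals but drifts, while an absolute reference such as a compass is noisy but drift-free. The sketch below is a generic illustration, not the implementation from [13]; the function name, the blending constant `alpha`, and the simulation values are assumptions.

```python
def complementary_filter(heading, gyro_rate, compass_heading, dt, alpha=0.98):
    """One update step of a 1-DOF complementary filter.

    Integrates the gyro rate (accurate short-term but drifting) and blends
    in the absolute compass heading (noisy but drift-free) to bound the
    drift.  All angles are in radians; alpha close to 1 trusts the gyro.
    """
    predicted = heading + gyro_rate * dt  # gyro integration (accumulates bias)
    return alpha * predicted + (1.0 - alpha) * compass_heading  # drift correction
```

Because the compass term continually pulls the estimate back toward an absolute reference, a constant gyro bias produces a small bounded offset rather than unbounded drift, which is why pure inertial trackers need exactly this kind of aiding.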

Some hybrid tracking systems have been developed for outdoor environments; for example, You et al. [34] built an outdoor orientation tracking system that integrates inertial and vision-based technologies. In their system, the inertial tracking estimates the approximate 2D feature-motion, and then the vision feature tracking corrects and refines the estimate in the image domain. Finally, the system converts the estimated 2D-motion residual to a 3D-orientation correction for the gyroscope. Behringer [5] presented a tracking approach that combines Global Positioning System (GPS) tracking and vision-based tracking. In their system, GPS tracks the observer location, and the 360° silhouette is computed, based on the tracked location, from a digital elevation map database. The registration is achieved by matching this predicted silhouette and the extracted visual horizontal silhouette from a video image. Azuma et al. [2] developed a tracking system that combines GPS tracking, inertial tracking, and magnetic tracking. In their system, GPS tracking directly provides the position,

and measurements from three gyros, a compass, and a tilt orientation sensor are fused to provide orientation. Thomas et al. [30] presented a tracking system that integrates GPS tracking, magnetic tracking, and vision-based tracking. In their system, vision-based tracking of fiducial markers is primarily used for tracking orientation and position, but if a confidence value for registering a fiducial marker goes below a threshold, GPS and a compass are used for tracking position and orientation, respectively. However, the errors with these outdoor tracking systems are too large to achieve compelling registration.

There also have been studies on scalable tracking systems; for example, Foxlin et al. [12] presented a scalable tracking system, Constellation™, that combines inertial tracking and acoustic tracking. The Constellation™ system uses an integrated inertial sensing instrument, ultrasonic rangefinder modules, and transponder beacons mounted in the environment. The rangefinder modules emit infrared trigger codes that activate the beacons one at a time, and each beacon responds to its own unique code by emitting an ultrasonic pulse. The rangefinder modules count the time-of-flight of the pulse and convert that measurement into a distance, which is used to make adjustments to the position and orientation trajectory updated at a high rate by inertial tracking. Welch et al. [32] developed a scalable electro-optical tracking system, the HiBall Tracker, which uses a sensing unit called the HiBall and infrared light-emitting diodes (LEDs) packaged in square ceiling panels. The extra views provide a benefit for steady-state tracking by effectively increasing the overall HiBall field of view without sacrificing the optical sensor resolution. The position and orientation of the HiBall are computed based on each individual LED sighting made by the HiBall. Wormell et al. [33] presented an inertia-optical tracking system, the IS-1200 VisTracker, which fuses data from inertial and vision sensors, using passive fiducials for its optical tracking. The tracking areas of Constellation™, the HiBall Tracker, and the IS-1200 VisTracker can be expanded by adding transponder beacons, ceiling panels, and passive fiducials, respectively. These scalable tracking systems achieve high accuracy while covering large ranges.

Practically, the design of a tracking system for construction applications should account for the nature of specific construction characteristics. Although the scalable tracking systems mentioned above offer potential for deployment to a construction site, their feasibility for construction applications has not been well studied, which entails the following issues. First, these tracking systems are capable of providing scalable networking within a certain range of a limited domain, but their expandability to a large-scale application domain, such as a construction site, has not been verified. Second, these tracking systems did not originate from a resource- and power-constrained environment, so extensive installation and configuration efforts may be required due to the high unit cost and size of the system. Third, the networking protocol deployed in their specific tracking systems cannot provide a wide range of interoperability among other types of devices. The interoperability characteristic is of utmost importance for heterogeneous communications among the diverse participants involved in a construction project. Thus, seamless information transfer and access should be available throughout the networks configured for a construction site.

3. AR for construction sites

A construction site is characterized as expansive in nature (i.e., broad with respect to size or area) and also as occluded because the views on a site are often obstructed as structures or elements are built up. These characteristics of a construction site indicate the necessity to develop AR tracking systems for large-scale and occluded spaces. The accuracy of work tasks also should be considered on construction sites. Shin et al. [24] investigated the required accuracy of the work tasks in construction. Their study shows that work tasks in construction require high or somewhat high accuracy (0 to approximately



50 mm) depending on the types of work tasks. Another characteristic of a construction site is mobility, generally requiring that workers are mobile in their work areas as well as over the entire construction site. This indicates that AR systems for construction sites should be highly mobile, which means they need to be small, lightweight, and wireless.

Table 1 shows the main tracking systems and their adaptability to the typical characteristics of a construction site, as mentioned in Section 2. The categories that each kind of tracker satisfies for construction AR are shaded gray. Mechanical, magnetic, acoustic, and optical trackers cover a small tracking range and cannot track a user behind occlusions. Mechanical trackers physically connect to a user, so their mobility is very low. Magnetic, acoustic, and optical trackers can provide good mobility as most of their tracking sensors are small and lightweight; however, they still need to be wireless for high mobility. Inertial trackers do not depend on targets in a tracking range, so they are highly mobile and cover a large tracking range without occlusion problems. Their accuracy, however, is usually very low. GPS trackers cover a large tracking range and are very mobile, but they cannot be used behind occlusions and their accuracy is very low. Hybrid trackers for outdoor use have been studied to address these problems of GPS trackers, but their accuracy is still low. Scalable trackers cover a large tracking range while maintaining good accuracy. They also address the occlusion problems by installing tracking targets in a manner that achieves a clear line-of-sight at any location on a construction site. The tracking sensors for scalable trackers are small and lightweight, but still need to be wireless for high mobility.

As shown in Table 1, scalable trackers are the most appropriate for a construction site compared to other types of trackers. The features of scalable trackers that satisfy the characteristics of a construction site, however, are achieved under the prerequisite that tracking targets are installed appropriately. To produce good accuracy, the tracking targets need to be strategically placed at predetermined positions. Placing the tracking targets may not be easy, requiring both cost and time, which leads to the question of how to compensate for this cost and time. Maximizing the benefits from AR for the work tasks and sharing the tracking targets for other purposes may be the answer. The potential benefits of AR in construction have been suggested or demonstrated (e.g., [10,22,25,26]). These other purposes may include automatic supply chain management, labor/safety monitoring, equipment/resource tracking, construction materials management, construction automation, etc.

The idea of tracking workers, materials, or equipment anywhere on a construction site with scalable trackers is similar to the concept of a ubiquitous environment where people can access necessary digital information anywhere through networked sensors. This similarity indicates that AR systems may be constructed based on ubiquitous environments. If a ubiquitous system on a construction site can conduct tracking, it might significantly improve mobility. That is, if a ubiquitous system presents construction information on a mobile display directly from the server without the use of a mobile or wearable computer, it would increase the mobility of workers. Therefore, ubiquitous AR environments are expected to be the most suitable for a construction site. The next question is how to establish

Table 1. Various tracking systems and their adaptability to the typical characteristics of a construction site.

a ubiquitous AR environment. Such a system may consist of three main components: (1) display, (2) tracker, and (3) server. The following sections discuss the components in detail.

4. AR displays

The display devices for AR applications are generally categorized as head-mounted and non-head-mounted displays. Head-mounted displays (HMDs) have been used mainly for AR applications since the 1960s. HMDs are generally grouped into two types of "see-through" systems: optical see-through and video see-through [14].

Optical see-through systems place optical combiners in front of the user's eyes and provide the AR overlay through them [1]. These combiners are partially transmissive and partially reflective so that the user can look directly through them to see the real world and can also see the virtual images reflected in them at the same time. Video see-through systems provide the AR overlay by combining the virtual images generated by the computer with the images of the real world coming from a head-mounted video camera. The AR overlay is produced at the video compositor and then is sent to the monitors, where the user can see the AR overlay. There are strong points and shortcomings for each of the optical and video see-through HMDs. Optical blending in optical see-through HMDs is simpler and cheaper than video blending in video see-through HMDs. Optical approaches need a video stream only for the virtual images, while video approaches need separate video streams for the real and virtual images. Optical approaches provide a full resolution of the real world and a wide peripheral vision. In video approaches, virtual objects can fully obscure physical objects and therefore can be rendered clearly. In optical approaches, however, the overlaid graphics cannot completely obscure the physical objects behind them, so the illusion of reality is compromised. Video blending provides digitized images of the real scene, and these digitized images allow video approaches to employ additional registration strategies unavailable to optical approaches. Optical see-through HMDs have to be recalibrated whenever they are taken off and put on, as well as after each relative movement between the optical display and the user's eyes. However, current video see-through HMDs have critical defects for use on construction sites. Video approaches provide a limited peripheral vision, thus allowing a limited field of view of the environment. In addition, a field of view of the camera that may differ from that of human eyes can cause a distorted sense of space, which might negatively influence the worker's ability to correctly assess the proximity of job site hazards. These limitations of video approaches may endanger workers who wear HMDs while walking around the construction site. Considering all the aspects of HMDs, optical see-through HMDs are currently more appropriate than video see-through HMDs for a construction site.

Non-head-mounted displays do not allow users to use their hands freely because users need to hold or move the displays with their hands. However, there are potential uses for handheld or tripod-mounted displays, particularly for layout or inspection. Handheld or tripod-mounted displays provide the higher resolution of a conventional monitor, and the user does not experience the fatigue and discomfort that can be caused by wearing an HMD. Handheld or tripod-mounted displays also provide full peripheral vision. Motion tracking technology developer InterSense, Inc. (http://www.isense.com), for example, has demonstrated an AR application that incorporates a handheld computer monitor as the display device. It would be straightforward to apply the same type of technology to inspection tasks in construction where position is not the quality criterion. Shin et al. [23] made use of a tripod-mounted display to inspect the positions of steel columns. By using the stationary display, the dynamic error in this high-accuracy task was minimized. Their displays are based on video blending. In their systems, a video camera is mounted on the back of the display to provide a see-through effect.


Fig. 2. Schematic architecture of video-based U-AR.


5. Schematic architectures of the ubiquitous AR environment

The conceptual system of a ubiquitous AR environment that this study suggests is titled "U-AR." For a ubiquitous environment, U-AR is designed to have tracking targets that work through distributed networks. With networked tracking targets, not only are the position and orientation of the sensor tracked, but construction information corresponding to the tracking information is also delivered to the display. Figs. 1 and 2 show the schematic architectures of U-AR. The main components of the proposed architectures are display, tracker, and server. These architectures were derived from the investigation of the characteristics of construction sites in terms of the technology suitability of tracker and display, as explained above. U-AR may be classified into optical-based and video-based according to the type of display. Optical-based U-AR makes use of optical see-through HMDs, while video-based U-AR employs video see-through HMDs or handheld or tripod-mounted displays. The difference in displays makes the difference in the types of files delivered through the networks, as shown in Figs. 1 and 2.

In optical-based U-AR, a user directly sees the real world scene through an optical combiner. The user also sees virtual images reflected on the optical combiner, thus having combined images of the real world scene and virtual objects through the combiner. This approach frees the server from dealing with the real world image. Meanwhile, the server needs to manipulate the virtual images according to the changing viewpoint of the user in real time and send them to the display through the tracking targets. The delivered virtual images are projected onto the optical combiner by a small projector built into the display. To track the user's viewpoint in real time, the sensor mounted on the HMD or the user's head is tracked based on the tracking targets.

In video-based U-AR, a video camera captures the live images of the real world scene and sends them to a server through the tracking targets. The server manipulates the virtual images according to the changing viewpoint of the user and then combines them with the live video stream of the real world scene to produce the AR images, which are sent to the display through the tracking targets. Tracking targets for video-based U-AR need to deliver more information compared to those for optical-based AR, so they require more capacity for information transmission.
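The server-side video blending step can be sketched as a per-pixel alpha composite of the rendered virtual imagery over the captured camera frame. This is a generic illustration rather than the authors' implementation; the function name, array shapes, and alpha-mask convention are assumptions.

```python
import numpy as np

def composite_ar_frame(real_frame, virtual_frame, alpha_mask):
    """Video blending for video-based U-AR: overlay rendered virtual
    imagery on the captured camera frame.  Where alpha is 1 the virtual
    object fully obscures the real scene (something optical combiners
    cannot do); where alpha is 0 the camera image passes through.

    real_frame, virtual_frame: (H, W, 3) arrays; alpha_mask: (H, W) in [0, 1].
    """
    a = alpha_mask[..., None]                     # broadcast alpha over RGB
    blended = a * virtual_frame + (1.0 - a) * real_frame
    return blended.astype(real_frame.dtype)
```

In the architecture of Fig. 2, both the camera frame and the composited result travel over the network, which is why video-based tracking targets need more transmission capacity than optical-based ones, where only the virtual layer is sent.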

The architectures indicate which display, tracker, and server technologies should be the focus of development for a compelling U-AR, which is expected to be the most suitable means of providing AR-based visual information on construction sites.

6. Tracking system for U-AR

Current wireless sensor networks (WSNs) aim at the implementation of distribution and wireless connectivity through a highly dynamic and complex environment, such as a construction site, which calls for a common sensor network interface that eases data aggregation and profiling across a wide range of sensor types. The sensor interface should account for interoperability with various platforms tailored to the application requirements. However, traditional network

Fig. 1. Schematic architecture of optical-based U-AR.

abstractions for wireless systems are generally not suitable for communication interoperability in construction. For example, items such as SmartPhones [7], GPS [11], or laptops with IEEE 802.11 (e.g. [6,19,20]) cost hundreds of dollars, target specialized applications, and rely on the pre-deployment of extensive infrastructure support. Thus, the design of a tracking system should consider a higher level of networking flexibility and expandability in a large-scale construction domain while ensuring low-cost implementation and reliable performance. At the same time, increased interoperability among diverse system platforms should be taken into account, with minimum effort for configuring the infrastructure needed for installation and maintenance.

IEEE 802.15.4 is a recent wireless communication standard that provides both flexibility and interoperability for industrial applications, such as building and construction automation, structural health monitoring, and automated control and operation. The main motivation of this standard is to enhance energy efficiency throughout heterogeneous networking so as to prolong battery lifetime and to lower the maintenance cost of a ubiquitous computing environment [16]. Currently, industrial applications lead the way to market adoption together with home automation; once high mass volume with mature technology and low device cost is achieved, many other market segments will benefit from WSNs. It is expected that IEEE 802.15.4 will start finding significant adoption in many building and construction applications in 2008. With the demand for WSNs in other industries, many vendors have already introduced wireless sensor products based on IEEE 802.15.4, and the overall WSN market could hit the one billion dollar level as soon as 2009–2010 [8].

By taking advantage of the IEEE 802.15.4 standard, preliminary research has been conducted to present a tracking methodology using a combination of ultrasound (US) and radio frequency (RF) to increase positioning accuracy and performance [17]. An ultrasound signal is used as the means of measuring the distance between nodes since it travels slowly enough for a device to detect the first arrival signal in a time-of-flight manner. Detecting the first arrival of a signal prevents possible multipath interference by admitting the earliest timestamp while discarding the subsequent signals arriving at the receiver. Fig. 3 shows the schematic illustration in which signal processes at the physical (PHY) and media access control (MAC) layers implement the timestamping scheme to determine a ranging distance.

Fig. 3. Illustration of timestamping scheme for Time-of-Flight (TOF) approach.

Fig. 5. Measurement errors and standard deviation errors.

1067 D. Shin, W.-S. Jang / Automation in Construction 18 (2009) 1063–1069

In this architecture, the RF signal at a beacon serves as a trigger that forces ultrasound pulses to be emitted from a remote node. The triggered ultrasound then travels back to the beacon node, and the ranging distance is calculated by converting the one-way travel time of the ultrasound using four timestamp measurements (TT1, TT2, TR1, and TR2). Finally, trilateration is employed to calculate the unknown position of a remote node by collecting the ranging distances measured at three beacons.
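The ranging and trilateration steps can be sketched as follows. This is an illustrative Python sketch, not the paper's implementation: the four-timestamp bookkeeping of Fig. 3 is reduced to a single emit/first-arrival pair, and the speed of sound is assumed constant.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C (assumed constant)

def ranging_distance(t_emit, t_first_arrival):
    """Range from the one-way ultrasound travel time. First-arrival
    timestamping keeps the earliest echo and discards later multipath
    arrivals, as in the timestamping scheme of Fig. 3."""
    return SPEED_OF_SOUND * (t_first_arrival - t_emit)

def trilaterate(beacons, distances):
    """2-D position of a remote node from three beacon positions and
    their measured ranging distances, via the standard linearization
    (subtracting the first circle equation from the other two)."""
    (x1, y1), (x2, y2), (x3, y3) = beacons
    d1, d2, d3 = distances
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21  # zero if the beacons are collinear
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# A node at (3, 4) ranged from beacons at three site corners:
beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
dists = [math.dist((3.0, 4.0), b) for b in beacons]
x, y = trilaterate(beacons, dists)
print(round(x, 3), round(y, 3))  # → 3.0 4.0
```

Note that the beacons must not be collinear, or the linear system degenerates; with more than three beacons the same linearization extends naturally to a least-squares solve, which is how measurement redundancy improves accuracy.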

The signal detection scheme is achieved by a sequential order of signal processing in the beacon node, with the RF trigger launching the tasks controlled by the periodic timer operation assigned by the user. This means that the analog-to-digital converter (ADC) operation is handled by an RF trigger schedule: whenever the RF trigger signal is driven by the timer, the ADC is activated and is ready to detect the received ultrasound signal. The ADC operation is then interrupted by a hardware handler right after the ADC registers the reception of the first arrival signal, and the ADC port is powered off by returning a FAIL flag to the ADC controller. The received signal is represented as voltage values converted by the ADC. The ultrasound signal transmitted from a remote node is a 40 kHz sound pulse wave that is amplified by three stages of operational amplifier (op-amp); the ADC then detects the analog pulse wave and converts it to digital format. The values of the converted signal depend on the ADC bit rate, which determines the resolution. The preliminary results showed that the SNR measured at a 1 m distance between a beacon and a remote node was 17 dB, while 1.8 dB was measured at a 14 m distance (Fig. 4). Distance errors measured at each Tx–Rx separation point averaged 5.19 cm (Fig. 5).
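The SNR figures above follow directly from the peak voltages read off the ADC. As a quick sketch (the voltage values below are illustrative, not the paper's raw measurements):

```python
import math

def snr_db(v_signal_peak, v_noise_peak):
    """Signal-to-noise ratio in dB from peak voltages at the ADC."""
    return 20.0 * math.log10(v_signal_peak / v_noise_peak)

# Illustrative values only: a received peak of ~0.71 V against a
# ~0.1 V noise floor gives ~17 dB, the level the preliminary tests
# report at a 1 m Tx-Rx separation.
print(round(snr_db(0.71, 0.1), 1))  # → 17.0
```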

In the trilateration approach, increasing the SNR is especially important for achieving long-range positioning, which in turn contributes to a scalable networking framework. A lower SNR results in a lower detection rate for the received ultrasound signal because the ADC, at a given threshold, cannot reliably distinguish a received signal from noise. One simple solution for increasing the detection rate is to raise the power supply voltage at the transmitter: the resulting higher signal level, and thus increased SNR, can be detected by the receiver at 15 m.

In order to increase the detection rate, a single-tone detection scheme can also be deployed. Detection of a sinusoidal signal, such as a 40 kHz ultrasound wave, from a finite number of noisy discrete-time samples can usually be resolved by a probabilistic approach. The periodogram enables efficient spectrum analysis using the fast Fourier transform (FFT), and many forms of periodogram

Fig. 4. Peak voltage level of the received signal and noise (left axis) and signal-to-noise ratio (SNR, right axis) in decibels.

have been proposed to provide better detection performance across all signal-to-noise ratio conditions [27].
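A minimal sketch of periodogram-style single-tone detection follows. The 200 kHz sampling rate and the detection threshold are assumptions (the paper does not state them), and since the tone frequency is known in advance the periodogram is evaluated at the 40 kHz bin only, rather than over a full FFT:

```python
import cmath
import math
import random

FS = 200_000     # sampling rate in Hz (assumed)
F_TONE = 40_000  # ultrasound carrier frequency per the paper

def periodogram_bin(samples, f, fs):
    """Periodogram power at a single frequency: |DFT(f)|^2 / N.
    A full FFT periodogram evaluates this at every bin; one bin
    suffices when the tone frequency is known a priori."""
    n = len(samples)
    w = -2j * math.pi * f / fs
    dft = sum(x * cmath.exp(w * k) for k, x in enumerate(samples))
    return abs(dft) ** 2 / n

def tone_present(samples, fs=FS, f=F_TONE, threshold=50.0):
    """Declare detection when the 40 kHz bin power exceeds a threshold
    calibrated against the noise floor (threshold is illustrative)."""
    return periodogram_bin(samples, f, fs) > threshold

# A noisy 40 kHz burst versus pure noise:
rng = random.Random(0)
n = 1024
noise = [rng.gauss(0.0, 0.1) for _ in range(n)]
burst = [math.sin(2 * math.pi * F_TONE * k / FS) + v
         for k, v in enumerate(noise)]
print(tone_present(burst), tone_present(noise))  # → True False
```

For a unit-amplitude tone the bin power is near N/4 (here ≈ 256) while the white-noise floor is near the noise variance (here ≈ 0.01), so the two cases separate cleanly even at modest SNR.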

In addition to the signal detection scheme, non-line-of-sight (NLOS) propagation of the ultrasound signal is very likely to be a practical limitation of the proposed motion tracking technique in a typical construction environment, where temporary facilities, building structures, and heavy equipment narrow down the possible applications. As a solution, the redundancy of measurements afforded by multiple active nodes throughout the distributed network can compensate for localization ambiguity and NLOS limitations. In general, redundancy and high node density are the key advantages of distributed wireless communication over traditional networks. A densely deployed network provides a sufficient amount of data redundancy, which directly impacts the localization accuracy and the communication cost. In addition to redundancy, the RF signal can be employed as a reference comparator to add certainty to the localization performance when the ultrasound signals are obstructed. By deploying the received signal strength indicator (RSSI) and link quality indicator (LQI), their correlations with the ranging distance will be analyzed to provide supplemental measures for increased reliability under NLOS circumstances.
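The paper leaves the RSSI-distance correlation analysis to future work; one standard candidate it could draw on is the log-distance path-loss model, sketched here with illustrative calibration constants (the reference RSSI and path-loss exponent would have to be fitted per site):

```python
def distance_from_rssi(rssi_dbm, rssi_at_1m=-45.0, path_loss_exp=2.7):
    """Invert the log-distance path-loss model
        RSSI(d) = RSSI(1 m) - 10 * n * log10(d)
    to get a coarse range estimate from an 802.15.4 radio's RSSI.
    Both calibration constants here are illustrative assumptions."""
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10.0 * path_loss_exp))

# A reading 27 dB below the 1 m reference maps to a 10 m estimate:
print(distance_from_rssi(-72.0))  # → 10.0
```

Such RF-derived ranges are far coarser than ultrasound TOF ranges, which is why the text positions RSSI/LQI as a supplemental fallback for obstructed (NLOS) paths rather than as the primary ranging method.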

As explained thus far, WSNs with IEEE 802.15.4 show high potential as a position tracking system for U-AR. However, orientation tracking based on trilateration with WSNs may not be reliable for an AR system because WSNs are not accurate enough to produce precise orientation. An AR system is usually more sensitive to orientation tracking than to position tracking [26]. To address this issue, an inertial tracker is considered and combined with the position tracking device. Microelectromechanical systems (MEMS) inertial devices, such as the ADIS16365 (www.analog.com), are a potential choice for measuring the orientation of the tracker with low-cost and low-power implementation. A 10-bit ADC with a reference voltage of 2.54 V on an ATmega128 microcontroller has a sensitivity of approximately 2.5 mV, sufficient for external inertial sensors that operate from a 4.75 V to 5.25 V power supply at a consumption rate of 24–49 mA, with a dynamic range of ±300°/s and minimum digital outputs of 400 mV. Orientation data measured at an external inertial sensor are processed at the ATmega128 and then allocated to an RF payload through a CC2420 radio chip, together with the position data measured at the position tracker. As a consequence, a low-cost, low-power position and orientation tracker under an 802.15.4 networking framework could minimize the effort of configuring the infrastructure while maintaining a high level of motion tracking accuracy.
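The ADC arithmetic behind the ~2.5 mV sensitivity, plus the basic rate-integration step any inertial orientation tracker performs before drift correction, can be sketched as follows (the sample rate and rate values are illustrative):

```python
# Resolution of the ATmega128's 10-bit ADC with a 2.54 V reference:
VREF = 2.54
LSB = VREF / 2 ** 10          # volts per ADC count
print(round(LSB * 1000, 2))   # → 2.48 (mV), the ~2.5 mV cited

def integrate_rate(rates_dps, dt):
    """Dead-reckon a heading angle (degrees) by integrating gyro rate
    samples (deg/s) over a fixed sample interval dt (s). Bias drift
    accumulates with time, which is why inertial orientation is
    usually fused with an absolute reference in AR trackers."""
    angle = 0.0
    for r in rates_dps:
        angle += r * dt
    return angle

# 100 samples at 100 Hz of a constant 90 deg/s turn -> ~90 degrees:
print(round(integrate_rate([90.0] * 100, 0.01), 6))
```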

7. Server for U-AR

The system flows of servers for a U-AR may be grouped into two primary concepts according to the type of U-AR (optical-based and


Fig. 6. Conceptual scenario of mesh network for achieving U-AR on a construction site.


video-based). The server for a video-based U-AR identifies each user's role as he selects it and reads information customized for that role from the system's database. The database includes 3D drawings, specifications, each user's ID and role, etc. The server for a video-based U-AR receives the tracking information for each user's viewpoint and the live scene video through the tracking targets, manipulates the customized information according to the tracking information, and combines it with the real-world scene in real time. Then, the server of a video-based U-AR sends the combined image to a display through the tracking targets.

A server for an optical-based U-AR customizes information for each user from the system's database in the same way as a server for a video-based U-AR. However, a server for an optical-based U-AR receives only the tracking information for each user's viewpoint through the tracking targets, because an optical-based U-AR does not make use of live scene video. The server manipulates the customized information according to the tracking information and then sends it to a display through the tracking targets.

U-AR servers also execute a task for data communication through mesh networking. Multiple beacons are placed at predefined locations according to the topological configuration and individual communication ranges on a construction site. A motion tracker equipped with position and orientation sensors transmits data through the network in an ad-hoc manner, and the customized information at the servers is returned to the motion tracker for the subsequent display tasks. Fig. 6 shows a conceptual scenario of a mesh network, where high mobility and expandability of distributed sensor nodes can be achieved by the utilization of IEEE 802.15.4-compliant devices.
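The server flow for both U-AR variants can be sketched as below. Every name here (TrackingPacket, the role table, render_overlay) is hypothetical, since the paper specifies only the data flow, not an API:

```python
from dataclasses import dataclass

@dataclass
class TrackingPacket:
    user_id: str
    pose: tuple   # position + orientation from the motion tracker
    frame: bytes  # live scene video (video-based U-AR only; empty otherwise)

# Role-customized content: 3D drawings, specifications, etc.
DATABASE = {
    "inspector": ["weld-detail.dwg", "tolerance-spec.pdf"],
    "surveyor": ["site-layout.dwg"],
}
USER_ROLES = {"u01": "inspector", "u02": "surveyor"}

def render_overlay(content, pose):
    """Stand-in for projecting the 3D content into the user's viewpoint."""
    return f"overlay({content}@{pose})"

def serve(packet: TrackingPacket):
    """One request cycle: look up the user's role, customize content,
    and render it for the tracked viewpoint. Video-based servers also
    combine the overlay with the live frame; optical-based servers
    return only the overlay, since the real scene is seen directly."""
    role = USER_ROLES[packet.user_id]
    content = DATABASE[role]
    overlay = render_overlay(content, packet.pose)
    return (overlay, packet.frame) if packet.frame else overlay
```

For example, `serve(TrackingPacket("u01", (0, 0, 0), b""))` follows the optical-based path and returns just the overlay, while a packet carrying a video frame returns the overlay paired with that frame for compositing.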

8. Conclusions

Based on the review of the main AR tracking techniques and the characteristics of a construction site, scalable trackers were found to be the most appropriate for construction sites. From the similarity between AR systems with scalable trackers and ubiquitous environments with networked sensors, a ubiquitous AR environment, "U-AR," is presented. U-AR is intended to enhance accessibility to the distributed networks that provide a gateway between a physical construction site and the digital information of AR. The proposed architectures of U-AR indicate which display, tracker, and server technologies should be the focus of development for a compelling U-AR, which is expected to be the technology most suitable for providing AR-based visual information on construction sites.

The three main components (display, tracker, and server) of U-AR were investigated as follows:

• Display: Optical-based HMDs are currently more appropriate than video-based HMDs for construction sites because of safety issues. However, video-based handheld or tripod-mounted displays also have potential for some work tasks, such as layout or inspection.

• Tracker: The potential of a low-cost, low-power tracking system for U-AR was introduced, which combines ultrasonic waves with IEEE 802.15.4-compliant radio signals for position and employs a MEMS-based inertial sensor for orientation.

• Server: The system flows of servers for a U-AR may be grouped into two primary concepts according to the type of U-AR (optical-based and video-based). While the server of an optical-based U-AR deals with virtual images only, the server of a video-based U-AR handles both virtual images and real images.

The implementation framework was also examined as to how the ubiquitous AR technology can reside in the distributed networks configured for a large-scale construction site. This framework includes the combination of ultrasonic devices with the IEEE 802.15.4 protocol, the mesh networks of wireless sensors, and the design architecture of the server and data communication. According to the preliminary experiment, a scalable tracker supported by WSN technology for U-AR provides promising tracking accuracy with increased network expandability. Consequently, the proposed U-AR architecture is expected to be well suited to providing AR-based visual information on construction sites. Future research will cover prototype development and testbed implementation that envision the practical deployment of the ubiquitous AR technology at a real construction site.

Acknowledgement

This research was supported by the Yeungnam University research grants in 2009.

References

[1] R. Azuma, A survey of augmented reality, Presence: Teleoperators and Virtual Environments 6 (4) (1997) 355–385.

[2] R. Azuma, B. Hoff, H. Neely III, R. Sarfaty, A motion stabilized outdoor augmented reality system, Proc. Virtual Reality Annual International Symposium (VR'99), Houston, Texas, U.S.A., March 13–17, 1999, pp. 252–259.

[3] R. Azuma, Y. Baillot, R. Behringer, S. Feiner, S. Julier, B. MacIntyre, Recent advances in augmented reality, IEEE Computer Graphics and Applications 21 (6) (2001) 34–47.

[4] A.H. Behzadan, Z. Aziz, C.J. Anumba, V.R. Kamat, Ubiquitous location tracking for context-specific information delivery on construction sites, Automation in Construction 17 (6) (2008) 737–748.

[5] R. Behringer, Registration for outdoor augmented reality applications using computer vision techniques and hybrid sensors, Proc. Virtual Reality Annual International Symposium (VR'99), Houston, Texas, U.S.A., March 13–17, 1999, pp. 244–251.

[6] A. Behzadan, V.R. Kamat, Visualization of construction graphics in outdoor augmented reality, Proc. 2005 Winter Simulation Conference, Orlando, Florida, U.S.A., 2005, pp. 1914–1920.

[7] S. Bowden, A. Dorr, T. Thorpe, C.J. Anumba, Mobile ICT support for construction process improvement, Automation in Construction 15 (5) (2006) 664–676.

[8] L. Chalard, D. Helal, L. Verbaere, A. Welig, J. Zory, Wireless sensor networks devices: overview, issues, state-of-the-art and promising technologies, ST Journal of Research, STMicroelectronics 4 (1) (2007) 4–18.

[9] P.S. Dunston, X. Wang, M. Billinghurst, B. Hampson, Mixed reality benefits for design perception, Proc. 19th International Symposium on Automation and Robotics in Construction (ISARC 2002), Washington, D.C., U.S.A., 2002, pp. 191–196.

[10] S. El-Tawil, V.R. Kamat, Rapid reconnaissance of post-disaster building damage using augmented situational visualization, Proc. 17th Analysis and Computation Specialty Conference, St. Louis, Missouri, U.S.A., 2006, pp. 1–10.

[11] E. Ergen, B. Akinci, R. Sacks, Tracking and locating components in a precast storage yard utilizing radio frequency identification technology and GPS, Automation in Construction 16 (3) (2007) 354–367.

[12] E. Foxlin, M. Harrington, G. Pfeifer, Constellation™: a wide-range wireless motion-tracking system for augmented reality and virtual set applications, Proc. ACM SIGGRAPH Conference on Computer Graphics, Orlando, Florida, U.S.A., 1998, pp. 371–378.

[13] E. Foxlin, Inertial head-tracker sensor fusion by a complementary separate-bias Kalman filter, Proc. Virtual Reality Annual International Symposium (VRAIS'96), Washington, D.C., U.S.A., 1996, pp. 185–194.

[14] H. Fuchs, J. Ackerman, Displays for augmented reality: historical remarks and future prospects, in: Y. Ohta, H. Tamura (Eds.), Mixed Reality: Merging Real and Virtual Worlds, Chapter 2, Ohmsha Ltd and Springer-Verlag, Heidelberg, Germany, 1999, pp. 31–40.

[15] J. Isdale, Motion tracking, VR News, 1999, available online at http://vr.isdale.com/vrTechReviews/MotionTracker_Oct1999.htm, accessed February 2005.

[16] W.-S. Jang, M.J. Skibniewski, Strategy for applying ubiquitous computing and sensor networks to surveillance of civil infrastructure systems, Górnictwo Odkrywkowe (Journal of Surface Mining) 5–6 (2006) 173–177.

[17] W.-S. Jang, Embedded system for construction material tracking using combination of radio frequency and ultrasound signal, Ph.D. Thesis, Department of Civil and Environmental Engineering, University of Maryland, College Park, MD, U.S.A., 2007.

[18] C. Krautz, Tracking — overview and mathematics, JASS 2004 Course Slides, 2004, available online at http://wwwbruegge.in.tum.de/pub/DWARF/JASS2004Ubitrack-Old/krautz.ppt, accessed February 2005.

[19] S.-W. Leung, S. Mak, B.L.P. Lee, Using a real-time integrated communication system to monitor the progress and quality of construction works, Automation in Construction 17 (6) (2008) 749–757.

[20] S. Nuntasunti, L.E. Bernold, Experimental assessment of wireless construction technology, ASCE Journal of Construction Engineering and Management 132 (9) (2006) 1009–1018.

[21] W. Piekarski, B. Thomas, Tinmith-metro: new outdoor techniques for creating city models with an augmented reality wearable computer, Proc. International Symposium on Wearable Computers (ISWC 2001), Zurich, Switzerland, 2001, pp. 31–38.

[22] G. Roberts, A. Evans, A. Dodson, B. Denby, S. Cooper, R. Hollands, The use of augmented reality, GPS and INS for subsurface data visualisation, Proc. FIG XXII International Congress, TS5.13 Integration of Techniques, Washington, D.C., U.S.A., 2002.

[23] D. Shin, Strategic development of AR systems for industrial construction, Ph.D. Thesis, Department of Civil Engineering, Purdue University, West Lafayette, IN, U.S.A., 2007.

[24] D. Shin, W. Jung, P.S. Dunston, Camera constraint on multi-range calibration of augmented reality systems for construction sites, Journal of Information Technology in Construction 13 (2008) 521–535.

[25] D. Shin, P.S. Dunston, Identification of application areas for augmented reality in industrial construction based on technology suitability, Automation in Construction 17 (7) (2008) 882–894.

[26] D. Shin, P.S. Dunston, Evaluation of augmented reality in steel column inspection, Automation in Construction 18 (2) (2009) 118–129.

[27] H.C. So, Y.T. Chan, Q. Ma, P.C. Ching, Comparison of various periodograms for single tone detection and frequency estimation, in: Proc. IEEE International Symposium on Circuits and Systems, Hong Kong, 1997, pp. 2529–2532.

[28] A. State, G. Hirota, D. Chen, W. Garrett, M. Livingston, Superior augmented reality registration by integrating landmark tracking and magnetic tracking, Proc. SIGGRAPH '96, ACM Press, New Orleans, Louisiana, U.S.A., 1996, pp. 429–438.

[29] B. Thomas, W. Piekarski, B. Gunther, Using augmented reality to visualise architecture design in an outdoor environment, in: Design Computing on the Net, 1999, available online at http://www.tinmith.net/papers/thomas-dcnet-1999.pdf, accessed March 2005.

[30] B. Thomas, B. Close, J. Donoghue, J. Squires, P. Bondi, M. Morris, W. Piekarski, ARQuake: an outdoor/indoor augmented reality first person application, Proc. Fourth International Symposium on Wearable Computers (ISWC'00), IEEE Computer Society, Atlanta, GA, U.S.A., 2000, pp. 139–146.

[31] A. Webster, S. Feiner, B. MacIntyre, W. Massie, T. Krueger, Augmented reality in architectural construction, inspection, and renovation, Proc. ASCE Third Congress on Computing in Civil Engineering, Anaheim, CA, 1996, pp. 17–19.

[32] G. Welch, G. Bishop, L. Vicci, S. Brumback, K. Keller, High-performance wide-area optical tracking: the HiBall tracking system, Presence: Teleoperators and Virtual Environments 10 (1) (2001) 1–21.

[33] D. Wormell, E. Foxlin, P. Katzman, Advanced inertial-optical tracking system for wide area mixed and augmented reality systems, Proc. 10th International Immersive Projection Technologies Workshop (IPT)/13th Eurographics Workshop on Virtual Environments (EGVE), Weimar, Germany, 2007.

[34] S. You, U. Neumann, R. Azuma, Orientation tracking for outdoor augmented reality registration, IEEE Computer Graphics and Applications 19 (6) (1999) 36–42.