
Page 1: Calibration-Free Projector-Camera System for Spatial ...fdesorbi/missbiblio/NdSMS12/NdSMS12.pdf

Calibration-Free Projector-Camera System for Spatial Augmented Reality on Planar Surfaces

Takayuki Nakamura, François de Sorbier, Sandy Martedi, Hideo Saito
Graduate School of Science and Technology, Keio University, Yokohama, Japan

{takayuki,fdesorbi,sandy,saito}@hvrl.ics.keio.ac.jp

Abstract

Spatial augmented reality extends augmented reality by projecting virtual data directly onto a target surface, but requires calibrating a projector-camera system. This paper introduces a calibration-free projector-camera system for spatial augmented reality on a planar surface. A pattern is projected onto the target surface, which can be freely moved. The main difficulty is to make the projected image fit the target surface. This is often achieved by calibrating the projector according to the camera's pose, but the calibration process is performed offline and restricts the configuration of the setup. Our proposed method allows the projector, the camera, or the surface to be moved independently while the projected image continuously fits the target surface. The experimental results demonstrate the efficiency of our calibration-free spatial augmented reality approach when applied to a moving planar surface.

1. Introduction

Augmented Reality (AR) augments the real world with virtual information. This process often mixes data generated on a computer with a live video stream. A target surface is detected and tracked in order to correctly display a virtual object onto it. Thanks to recent research and development advances, it is now possible to add a projector that directly displays the virtual information. This is called Spatial Augmented Reality (SAR) [6] and is practical because users no longer need to carry a visualization device. The main difficulty of this approach is to modify the projected image so that it fits the target surface. This is often achieved by calibrating the projector-camera system, which is an offline process and restricts the configuration of the setup.

The goal of our research is to project a texture onto a moving textured planar surface with an uncalibrated projector-camera system, as in Fig. 1. The target planar surface can be freely moved while the projected image continuously fits the surface, even if the projector or the camera is shifted. This means that both the surface and the projected image have to be tracked. However, the projection hinders the tracking, since it strongly alters the appearance of the surface and prevents the use of standard trackers based on feature detection.

We propose to address this problem by adapting the Lucas-Kanade algorithm, a well-known planar surface tracking method that minimizes the difference between a captured image and a target image. Our algorithm can simultaneously track both a planar surface texture and a projection texture. We warp both textures back to the template coordinates and minimize the sum of the differences between each warped texture and the template. The target surface, the camera, and the projector can then be freely moved, or their parameters modified, since no calibration is required.

Figure 1. Goal of our method.

After reviewing related work in Section 2, we explain our approach in Section 3. Finally, we present our experimental results in Section 4 with an evaluation of the accuracy of our method.


Figure 2. Overview of our setup.

2. Related work

In spatial augmented reality, the projected image needs to be transformed in real time to fit the target surface. Besides tracking the moving target object, we need to modify the projected image. However, common tracking techniques based on feature points [4, 11, 9] or gradients [5, 2] cannot be applied, because the lighting environment induced by the projector hinders the feature points.

Several solutions focus on imperceptibly embedded markers [7, 14] or sensing devices [3, 12], but these reduce the dynamic range of the projector or require sensors to be attached to the object.

Raskar et al. [13] and Gupta and Jaynes [8] proposed adding physical markers to track the surface, but the pattern should not be projected onto these markers. Leung et al. [10] proposed tracking the borders of the planar surface.

Audet et al. [1] introduced a direct alignment method of a projection image with a textured surface that minimizes the error function of a gradient-based tracker.

However, all these methods use a calibrated projector-camera pair, which implies that they might fail if components of the system are moved separately or if parameters such as the focal length are changed.

3. Our Method

In this section, we describe the theory of our projector-based Lucas-Kanade method.

3.1. The geometric model

Our projector-based Lucas-Kanade algorithm defines two homographies Hs and Hp for warping the surface texture and the projector texture. These warping functions are respectively denoted W(x; s) and W(x; p), where s and p are the homography parameters and x = (x, y) is an image coordinate. The geometric model that relates the templates Ts and Tp to the input image I warped back onto the template's coordinate system can be written as follows:

Ts(x) = I(W(x; s)), (1)

Tp(x) = I(W(x; p)). (2)

Also, in terms of coordinates, Eq. (1) and Eq. (2) show that a pixel xc from the camera image is transformed by the homographies Hs and Hp into a pixel xt in the template image.
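In concrete terms, warping a pixel by one of these homographies is a matrix multiplication in homogeneous coordinates followed by a perspective division. A minimal numpy sketch (the function name and matrix values below are illustrative, not from the paper):

```python
import numpy as np

def warp_point(H, x):
    """Apply a 3x3 homography H to a 2D point x = (x, y).

    Lift (x, y) to homogeneous coordinates (x, y, 1), multiply by H,
    then divide by the third component.
    """
    p = H @ np.array([x[0], x[1], 1.0])
    return p[:2] / p[2]

# Identity homography leaves points unchanged.
warp_point(np.eye(3), (10.0, 20.0))

# A translational homography: maps (1, 2) to (6, -1).
Ht = np.array([[1.0, 0.0, 5.0],
               [0.0, 1.0, -3.0],
               [0.0, 0.0, 1.0]])
warp_point(Ht, (1.0, 2.0))
```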

3.2. The energy function

The projection error is minimized using the Lucas-Kanade algorithm, a direct alignment method that is robust against noise. This algorithm minimizes the sum of squared errors between a template T and an input image I warped back into the coordinate frame of the template. Baker and Matthews proposed the inverse compositional Lucas-Kanade algorithm [2], which switches the roles of the template T and the input image I. This method also allows several parts, such as the Hessian, to be pre-computed, which improves the computational time during the incremental warp. Since we need to handle a surface texture and a projector texture, we propose an energy function E that combines the projector's energy function Ep and the surface's energy function Es. E is defined as follows:

E = Es + Ep, (3)

with

Es = ∑x [T(W(x; ∆s)) − I(W(x; s))]², (4)

Ep = ∑x [T(W(x; ∆p)) − I(W(x; p))]². (5)

The template image T is defined as the combination of the template image for the surface, Ts, and the template image for the projector, Tp.
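As a concrete reading of Eqs. (3)-(5), each energy term is a sum of squared differences (SSD) between the template and a warped image, and E simply adds the two terms. A toy numpy sketch (the image sizes and offsets are made-up stand-ins for T, I(W(x; s)), and I(W(x; p))):

```python
import numpy as np

def ssd(template, warped):
    """Sum of squared differences over the region of interest."""
    d = template.astype(np.float64) - warped.astype(np.float64)
    return np.sum(d * d)

# Hypothetical small images: a template and two nearly aligned warps.
rng = np.random.default_rng(0)
T = rng.random((8, 8))
I_ws = T + 0.01   # stand-in for the warped surface view I(W(x; s))
I_wp = T + 0.02   # stand-in for the warped projector view I(W(x; p))

E = ssd(T, I_ws) + ssd(T, I_wp)   # E = Es + Ep, as in Eq. (3)
```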

3.3. Minimization

The goal of our method is to minimize Eq.(3) withrespect to ∆s and ∆p, and to update the warping func-tions as follows:

W(x; s) ← W(x; s) ∘ W(x; ∆s)⁻¹, (6)

W(x; p) ← W(x; p) ∘ W(x; ∆p)⁻¹. (7)
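When the warps are represented as 3×3 homography matrices, the inverse compositional updates of Eqs. (6) and (7) amount to inverting the incremental warp and composing it with the current one. A small sketch with a hypothetical translational increment:

```python
import numpy as np

# Current warp W(x; s) as a 3x3 homography (identity here).
H_s = np.eye(3)

# Hypothetical incremental warp W(x; Δs): a small translation.
dH = np.array([[1.0, 0.0, 0.5],
               [0.0, 1.0, -0.25],
               [0.0, 0.0, 1.0]])

# Eq. (6): compose the current warp with the inverted increment.
H_s = H_s @ np.linalg.inv(dH)
```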

∆s and ∆p can be derived from Eq.(3) as follows:

∆s = Hes⁻¹ ∑x [∇Ts ∂W/∂s]ᵀ [I(W(x; s)) − T(x)], (8)


∆p = Hep⁻¹ ∑x [∇Tp ∂W/∂p]ᵀ [I(W(x; p)) − T(x)], (9)

with the Hessian matrices Hes and Hep:

Hes = ∑x [∇Ts ∂W/∂s]ᵀ [∇Ts ∂W/∂s], (10)

Hep = ∑x [∇Tp ∂W/∂p]ᵀ [∇Tp ∂W/∂p], (11)

and where ∇Ts ∂W/∂s and ∇Tp ∂W/∂p are the steepest descent images. Details about the derivation can be found in [2].

The Hessian matrices Hes and Hep and the steepest descent images ∇Ts ∂W/∂s and ∇Tp ∂W/∂p can be pre-computed because they do not depend on the parameters s and p. ∆s and ∆p are also respectively independent of the parameters p and s in Eq. (8) and Eq. (9), so each parameter can be computed independently. Since the coordinates of the corners of the homographies s and p are directly employed as the warp parameters in our algorithm, described in Alg. 1, the computation of the warp inversion becomes:

s ← s − ∆s,
p ← p − ∆p. (12)
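Because the warp parameters are the corner coordinates themselves, the update of Eq. (12) is a plain vector subtraction; the 3×3 homography is then rebuilt from the four updated corners. A sketch of that rebuild via the direct linear transform, with entirely hypothetical corner values:

```python
import numpy as np

def homography_from_corners(src, dst):
    """Estimate the 3x3 homography mapping four src corners to the
    four dst corners via the direct linear transform (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of A, taken from the SVD.
    _, _, Vt = np.linalg.svd(np.array(A))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

# Corner parameterization: the warp parameters are the corner
# coordinates, so Eq. (12) is an elementwise subtraction.
corners = np.array([[0.0, 0.0], [100.0, 0.0], [100.0, 100.0], [0.0, 100.0]])
delta = np.full((4, 2), 0.5)          # hypothetical increment Δs
corners = corners - delta             # Eq. (12): s <- s - Δs

# Rebuild the homography from template corners to updated corners.
template_corners = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
H = homography_from_corners(template_corners, corners)
```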

Algorithm 1 Our projector-based LK algorithm.
Require: T, s, p
Precompute: ∇Ts ∂W/∂s, ∇Tp ∂W/∂p, Hessians Hes, Hep
repeat
    Warp I with W(x; s) to compute I(W(x; s))
    Compute I(W(x; s)) − T
    Compute ∆s with Eq. (8)
    s ← s − ∆s
    Warp I with W(x; p) to compute I(W(x; p))
    Compute I(W(x; p)) − T
    Compute ∆p with Eq. (9)
    p ← p − ∆p
until residual image < threshold
return s, p
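To make the structure of this loop concrete, here is a deliberately simplified, runnable sketch: a single texture and a translation-only warp (two parameters instead of eight corner coordinates), with the inverse compositional pre-computation and the parameter-subtraction update. The image, the true shift, and the iteration budget are all made up for illustration; the paper's method runs two such updates per iteration, one for the surface texture and one for the projector texture.

```python
import numpy as np

def bilinear(img, ys, xs):
    """Bilinearly sample img at float coordinates (ys, xs)."""
    y0 = np.clip(np.floor(ys).astype(int), 0, img.shape[0] - 2)
    x0 = np.clip(np.floor(xs).astype(int), 0, img.shape[1] - 2)
    dy, dx = ys - y0, xs - x0
    return ((1 - dy) * (1 - dx) * img[y0, x0]
            + (1 - dy) * dx * img[y0, x0 + 1]
            + dy * (1 - dx) * img[y0 + 1, x0]
            + dy * dx * img[y0 + 1, x0 + 1])

# Synthetic template (a smooth bump) and an input image that is the
# template shifted by a known, hypothetical translation.
yy, xx = np.mgrid[0:32, 0:32].astype(float)
T = np.exp(-((yy - 16) ** 2 + (xx - 16) ** 2) / 40.0)
true_t = np.array([1.5, -2.0])  # (dy, dx)
I = np.exp(-((yy - 16 - true_t[0]) ** 2 + (xx - 16 - true_t[1]) ** 2) / 40.0)

# Pre-computation (done once): steepest-descent images and the Hessian
# come from the template only, as in the inverse compositional LK.
gy, gx = np.gradient(T)
G = np.stack([gy.ravel(), gx.ravel()], axis=1)  # steepest descent, N x 2
He = G.T @ G                                    # 2 x 2 Hessian

t = np.zeros(2)  # warp parameters: W(x; t) = x + t
for _ in range(50):
    warped = bilinear(I, yy + t[0], xx + t[1])  # I(W(x; t))
    err = (warped - T).ravel()                  # residual image
    dt = np.linalg.solve(He, G.T @ err)         # Eq. (8), translation case
    t = t - dt                                  # Eq. (12): t <- t - Δt
    if np.linalg.norm(dt) < 1e-8:
        break
```

After the loop, t has converged close to the true shift; the full method replaces the translation with the corner-parameterized homography warp.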

3.4. Initialization and update of the homographies

Before running the Lucas-Kanade algorithm, we need to initialize the homographies Hp and Hs. First, we manually detect the surface texture in the camera image and compute the homography Hs that transforms it to the template space. Similarly, we manually detect the projected texture in the camera image and compute the homography Hp. The projected texture can then be fitted to the surface texture by computing and applying the homography H:

H = Hp × Hs⁻¹. (13)

Our projector-based Lucas-Kanade algorithm estimates the homographies Hs and Hp that respectively transform the surface texture and the projected texture from the camera space to the template space. At the end of this minimization process, the projected texture defined in the template space is transformed by applying the homography H, updated as follows:

H ← Hp × Hs⁻¹ × H. (14)

After updating the homography H, the projected pattern fits the surface texture until the projector, the camera, or the target planar surface is moved.
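As a quick illustration of Eqs. (13) and (14), composing the homographies is just 3×3 matrix multiplication and inversion, with an optional rescaling so the bottom-right entry stays at 1. The matrices below are hypothetical (identity surface warp, translational projector warp):

```python
import numpy as np

def normalize(H):
    """Scale a homography so that its bottom-right entry is 1."""
    return H / H[2, 2]

# Hypothetical current estimates of the two homographies.
Hs = np.eye(3)
Hp = np.array([[1.0, 0.0, 5.0],
               [0.0, 1.0, -3.0],
               [0.0, 0.0, 1.0]])

H = normalize(Hp @ np.linalg.inv(Hs))       # Eq. (13): initial fit
H = normalize(Hp @ np.linalg.inv(Hs) @ H)   # Eq. (14): per-frame update
```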

Figure 3. Results of our experiments.

4. Results

Our system is composed of an EPSON EB-X8 LCD projector and a PGR Flea3 camera with a 640×480 resolution, set up to form an uncalibrated projector-camera system as in Fig. 2. Under this configuration, we achieved 3 frames per second on an Intel Core 2 Duo at 2.80 GHz. Our method has been tested in two situations. First, we translated and rotated the textured target surface in every direction. Second, we moved the projector and the camera. Results are presented in Fig. 3 for three different experiments.

4.1. Evaluation by RMSE

The accuracy of our tracking was evaluated with the Root Mean Square Error (RMSE), computed over the region of interest as follows:

RMSE = √( ∑x [I − Hp⁻¹ Tp − Hs⁻¹ Ts]² / pix ), (15)

where pix is the number of pixels in the region of interest. The plot of the RMSE is presented in Fig. 4. The convergence of the RMSE values, computed for five non-consecutive frames, indicates that the homographies are correctly estimated.
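A direct reading of Eq. (15) in code (the images and mask below are toy values; in the paper, I is the captured frame and Hp⁻¹Tp and Hs⁻¹Ts are the two templates warped back into the camera image):

```python
import numpy as np

def rmse(I, Tp_warped, Ts_warped, roi):
    """Eq. (15): RMSE of the residual image over a region of
    interest given as a boolean mask."""
    residual = (I - Tp_warped - Ts_warped)[roi]
    return np.sqrt(np.sum(residual ** 2) / residual.size)

# Toy check: when the two warped templates exactly explain the
# captured image, the residual (and hence the RMSE) is zero.
Ts = np.full((4, 4), 0.25)
Tp = np.full((4, 4), 0.5)
I = Ts + Tp
roi = np.ones((4, 4), dtype=bool)
err = rmse(I, Tp, Ts, roi)  # -> 0.0
```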

Figure 4. Result of the RMSE for 5 frames.

5. Conclusion

In this paper, we have presented a method to project a pattern onto a moving textured planar surface using an uncalibrated projector-camera system. The target planar surface can be freely moved while the projected image constantly fits the surface, even if the projector or the camera is shifted. Our results have shown that the surface and the projector images can both be tracked, but not yet in real time. We are currently adapting our method to GPGPU to speed up the process.

6. Acknowledgments

We would like to thank Yuji Oyamada, Julien Pilet, Ismael Daribo, and Bruce Thomas for their support.

References

[1] S. Audet, M. Okutomi, and M. Tanaka. Direct image alignment of projector-camera systems with planar surfaces. In Proceedings of the Conference on Computer Vision and Pattern Recognition, pages 303–310, 2010.

[2] S. Baker and I. Matthews. Lucas-Kanade 20 years on: A unifying framework. International Journal of Computer Vision, 56(3):221–255, 2004.

[3] D. Bandyopadhyay, R. Raskar, and H. Fuchs. Dynamic shader lamps: Painting on movable objects. In International Symposium on Augmented Reality, pages 207–216, 2001.

[4] H. Bay, T. Tuytelaars, and L. Van Gool. SURF: Speeded up robust features. In European Conference on Computer Vision, pages 404–417, 2006.

[5] S. Benhimane and E. Malis. Real-time image-based tracking of planes using efficient second-order minimization. In International Conference on Intelligent Robots and Systems, volume 1, pages 943–948, 2004.

[6] O. Bimber and R. Raskar. Spatial Augmented Reality. A K Peters, 2005.

[7] D. Cotting, M. Naef, M. Gross, and H. Fuchs. Embedding imperceptible patterns into projected images for simultaneous acquisition and display. In International Symposium on Mixed and Augmented Reality, pages 100–109, 2004.

[8] S. Gupta and C. Jaynes. Active pursuit tracking in a projector-camera system with application to augmented reality. In Proceedings of the Conference on Computer Vision and Pattern Recognition, page 111, 2005.

[9] V. Lepetit, P. Lagger, and P. Fua. Randomized trees for real-time keypoint recognition. In Proceedings of the Conference on Computer Vision and Pattern Recognition, volume 2, pages 775–781, 2005.

[10] M. Leung, K. Lee, K. Wong, and M. Chang. A projector-based movable hand-held display system. In Proceedings of the Conference on Computer Vision and Pattern Recognition, pages 1109–1114, 2009.

[11] D. Lowe. Distinctive image features from scale-invariant keypoints. International Journal of Computer Vision, 60(2):91–110, 2004.

[12] D. Molyneaux, H. Gellersen, G. Kortuem, and B. Schiele. Cooperative augmentation of smart objects with projector-camera systems. In UbiComp 2007: Ubiquitous Computing, pages 501–518, 2007.

[13] R. Raskar, J. Van Baar, P. Beardsley, T. Willwacher, S. Rao, and C. Forlines. iLamps: Geometrically aware and self-configuring projectors. In ACM SIGGRAPH 2005 Courses, page 5, 2005.

[14] R. Raskar, G. Welch, M. Cutts, A. Lake, L. Stesin, and H. Fuchs. The office of the future: A unified approach to image-based modeling and spatially immersive displays. In Proceedings of the 25th Annual Conference on Computer Graphics and Interactive Techniques, pages 179–188, 1998.