
New technique of three-dimensional imaging through a 3-mm single lens camera

Sam Y. Bae
California Institute of Technology, Jet Propulsion Laboratory, Pasadena, California 90041
and University of California at Los Angeles, Los Angeles, California
E-mail: [email protected]

Ron Korniski
California Institute of Technology, Jet Propulsion Laboratory, Pasadena, California 90041

Allen Ream
Montana State University, Bozeman, Montana

Hrayr Shahinian
Skull Base Institute, Los Angeles, California

Harish M. Manohara
California Institute of Technology, Jet Propulsion Laboratory, Pasadena, California 90041

Abstract. We present a technique for imaging full-color 3-D images with a single camera in this paper. Unlike a typical 3-D-imaging system comprising two independent cameras, each contributing one viewpoint, the technique presented here creates two viewpoints using a single-lens camera with a bipartite filter whose bandpass characteristics are complementary to each other. The bipartite filter divides the camera's limiting aperture into two spatially separated apertures, or viewpoints, that alternately image an object field using filter-passband-matched, time-sequenced illumination. This technique was applied to construct a 3-D camera to image scenes at a working distance of 10 mm. We evaluated the effectiveness of the 3-D camera in generating stereo images using a statistical comparison of the depth resolutions achieved by the 3-D camera and a similar 2D camera arrangement. The comparison showed that the complementary filters produce effective stereopsis at prescribed working distances. © 2012 Society of Photo-Optical Instrumentation Engineers (SPIE). [DOI: 10.1117/1.OE.51.2.021106]

Subject terms: 3-D imaging; stereo endoscope; multispectral imaging; minimally invasive neurosurgery; skull base surgery; spectral illumination band (SIB); complementary multi-bandpass filters.

Paper 110613SS received Jun. 1, 2011; revised manuscript received Sep. 19, 2011; accepted for publication Sep. 23, 2011; published online Mar. 7, 2012.

1 Introduction

Humans can perceive depth because they have two baseline-separated viewpoints, one from the left eye and the other from the right eye. Depth perception in humans occurs because of the dual-eye overlap region of 130 degrees, of which the central angle-of-view of around 40 to 60 degrees has the maximum impact. A conventional "3-D" imaging system mimics this stereoscopic effect by employing at least two cameras that are spatially separated with overlapping fields-of-view (FOVs). However, this arrangement is too bulky for operation within extremely confined spaces (<5-mm-side cube) such as those encountered in minimally invasive neurosurgery (MIN).

The objective of the concept presented here is to deliver the most satisfying stereo images with high resolution using the smallest possible diameter for a 3-D endoscope. Note that this technique requires an observer (for MIN, this would be surgeons and medical professionals) to wear polarized glasses to perceive 3-D images. The 4-mm endoscope capable of 2D imaging is a popular choice for MINs because of its slim design and satisfactory optical resolution. A 4-mm-diameter endoscope offers sufficient room to introduce other surgical tools within the limited surgical entry area, typically a dime-sized opening in the skull, for MINs. Therefore, it is desirable from the user's perspective to produce 3-D endoscopes (stereo endoscopes) with dimensions similar to those of presently used 2D endoscopes.

In an effort to decrease the size of a 3-D imaging system, several past works have proposed the use of a single camera to capture two viewpoints for producing stereopsis.1–4 In order to create the two viewpoints, the limiting aperture of the single objective lens is halved to form dual apertures, and a mechanism is installed to alternately open and close each of the half-apertures. This approach has two main advantages: 1. both of the viewpoints always converge to a point, and 2. the entire image plane is available for each viewpoint because of time-multiplexed operation. The first advantage closely mimics the "vergence" of the human binocular arrangement, i.e., the ability to rotate both eyes toward a point so that the projection of the image is in the center of the retina in both eyes. The second advantage means that the image resolution is not compromised as in the case of a two-camera 3-D system, in which, at best, only half the image plane is used for each viewpoint.

In this regard, several techniques have been proposed in the past. Some methods use liquid crystal and electro-optical polarizers to block and open the half-apertures to form the two viewpoints,1,2 but these suffer from optical crosstalk and light-loss problems. Mechanical actuation using shutters has been reported elsewhere to produce the two viewpoints.4

However, this method is not only challenging to implement in a confined space but also prone to mechanical failure. Other techniques employ bi-prisms and lenticular lenses.5,6

Optical Engineering 51(2), 021106 (February 2012)




However, these techniques require the detector to be partitioned into two to record the stereoscopic pair of images from the two viewpoints simultaneously. This has a drawback in that the partitioned half requires the pixel density to be doubled from that of the whole imager in order to maintain the same resolution. The technique presented here strives to overcome the problems of previously reported methods while adhering to the single-lens, split-aperture approach to generate stereo images.

2 Materials and Methods

2.1 Creating Two Viewpoints with a Single-Lens Camera

The technique reported here can be understood from Figs. 1 and 2. It is based on the use of complementary multi-band bandpass filters (CMBFs) incorporated within a split aperture (two half-apertures), each with triple passbands located in the red, green, and blue (RGB) regions of the full color spectrum.7 This is coupled with spectral illumination bands (SIBs), each of which is matched with one of the passbands of the CMBFs. By sequentially illuminating the object field with SIBs, one or the other viewpoint, depending on the SIB itself, is imaged. The SIBs in relation to the transmission bands of the bipartite filters are shown in Fig. 2 using ideal waveforms. Current multiple-bandpass interference filters can produce waveforms that are very close to the sharp cut-on and cut-off characteristics of the ideal waveforms. This property allows us to spectrally "interleave" the passbands of each filter with negligible crosstalk (spectral overlap) between the complementary filters' passbands. Using a monochromatic camera, the technique generates a total of six time-multiplexed, multispectral images, with each viewpoint generating three spectral RGB images for color rendition.
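In principle, the six time-multiplexed spectral captures assemble into two full-color viewpoint images. The sketch below illustrates that bookkeeping; the frame ordering and the `assemble_viewpoints` helper are illustrative assumptions, not the authors' acquisition code.

```python
import numpy as np

def assemble_viewpoints(frames):
    """Combine six time-multiplexed monochrome frames into two RGB images.

    `frames` is assumed to arrive in SIB illumination order:
    left-R, left-G, left-B, right-R, right-G, right-B.
    Each viewpoint's three spectral frames become its R, G, B channels.
    """
    if len(frames) != 6:
        raise ValueError("expected six spectral frames (2 viewpoints x RGB)")
    left = np.stack(frames[0:3], axis=-1)   # H x W x 3
    right = np.stack(frames[3:6], axis=-1)  # H x W x 3
    return left, right
```

Because each viewpoint's frames occupy the whole sensor in sequence, no image-plane partitioning (and hence no resolution penalty) is needed.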

Typically, a 3-D imaging system arranged with two cameras has a depth resolution, dZc, defined by the following equation8:

dZc = (Z² · ΔT) / (M · b),  (1)

where Z is the working distance, ΔT is the stereo acuity, M is the magnification, and b is the baseline distance between the two viewpoints of a 3-D camera. This equation applies to any stereo imaging system providing two viewpoints for depth interpretation. Therefore, the equation also applies to the stereo technique presented here. The baseline, b, is achieved by the inter-pupil distance created by the bipartite filter. It is well known that an object viewed from two different angles produces two perspective images, left and right. When these two perspective images are seen exclusively by the corresponding human eyes (that is, a left-perspective image seen only by the left eye and a right-perspective image seen only by the right eye), depth perception is reproduced. The baseline of the bipartite filter gives the two angular views of the same object, and the CMBF filters incorporated in the bipartite filter provide mutually exclusive viewpoints when the corresponding SIB illuminates the object. Hence it can be argued that the condition for stereo imaging shown in Eq. (1) is satisfied in the technique presented here.
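Eq. (1) is straightforward to evaluate numerically in both directions (depth resolution from stereo acuity, and stereo acuity from an observed depth resolution). The magnification value in the sketch below is an assumption, since M for this system is not stated here.

```python
import math

ARCSEC_PER_RAD = 180 * 3600 / math.pi  # ~206,265 arc-sec per radian

def depth_resolution(Z, stereo_acuity_arcsec, M, b):
    """Eq. (1): dZc = Z^2 * dT / (M * b); Z and b in mm, dT in arc-sec."""
    dT = stereo_acuity_arcsec / ARCSEC_PER_RAD  # convert to radians
    return Z ** 2 * dT / (M * b)

def stereo_acuity(Z, dZc, M, b):
    """Eq. (1) inverted: dT = dZc * M * b / Z^2, returned in arc-sec."""
    return (dZc * M * b / Z ** 2) * ARCSEC_PER_RAD
```

As an illustrative check with the paper's numbers (b = 0.35 mm, Z = 9 mm, an empirical dZc of roughly 0.5 mm, and an assumed M = 1), `stereo_acuity(9, 0.5, 1.0, 0.35)` gives a few hundred arc-sec, the same order as the ~226 arc-sec average reported in Sec. 3.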

2.2 Validating Stereo Imaging from a Single-Lens, Dual-Aperture System

The technique was verified in two steps. The first step verified the effectiveness of the CMBF pair's blocking or transmitting of SIBs. Two commercial off-the-shelf (COTS) cameras were used to test this. In the second step, we statistically tested the ability of the dual-aperture 3-D camera to generate stereo images. The statistical test protocol involved the ability of human subjects to resolve depths when simultaneously viewing two objects placed at different distances. This depth resolution was compared to that from a 2D camera under the same experimental settings.


Fig. 1 Dual aperture created by a pair of complementary multi-bandpass filters (CMBFs). (a) Schematic drawing showing the CMBF pair opening to spectral illumination bands (SIBs). (b) Ray traces of the CMBF lens system simulated by an optical simulation software package.

Fig. 2 Spectral scheme showing the operating principle of the CMBF-based stereo imaging technique. (a) Transmission characteristics of the CMBFs. (b) Time-multiplexed SIBs synchronized to the transmission bands of the CMBFs.


For the first verification, two 25-mm cameras9 were placed to image the same target, each with one of the CMBF pair10 placed in front of the objective lens. In this arrangement, only one of the two cameras is expected to image each of the SIBs. The images were recorded as pixel intensity values. A tunable filter11 was used to produce specific SIBs from a broadband Xenon lamp. To avoid camera saturation, a light background was placed to reflect and diffuse the visible light illumination, and the two cameras were placed at 45 degrees to the plane of the white background, as shown in Fig. 3.

The results of this experiment are shown in Table 1, displaying the pixel intensity values of the two cameras. The intensities are scaled in red and blue, with red indicating higher intensity. The contrast between the two cameras at each SIB was between 400:1 and 28000:1. This verifies that the interleaved filters let only one camera image each SIB. At this point, it is important to note that the actual transmission and illumination band characteristics of the COTS CMBFs and the SIBs are not as ideal as those depicted in Fig. 2. Their true forms are shown in Fig. 4. The spreading at the bottom of the bell curve of the illumination bands sometimes overlaps an adjacent CMBF band. This caused crosstalk between the two viewpoints, as can be seen in the spotty pixel intensity values at 520 nm in Table 1. Lastly, the fringes in the images in the table are a consequence of non-uniform light emission projected from the tunable filter.
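The contrast ratios in Table 1 are, per its footnote, the greatest-to-least ratio of the average pixel values in the two cameras' center regions. A minimal sketch of that computation; the `frac` window parameter is a hypothetical choice, since the actual center-region size is not stated.

```python
import numpy as np

def center_mean(img, frac=0.25):
    """Mean pixel value of a central window covering `frac` of each dimension."""
    h, w = img.shape
    dh, dw = int(h * frac), int(w * frac)
    return img[(h - dh) // 2:(h + dh) // 2, (w - dw) // 2:(w + dw) // 2].mean()

def contrast_ratio(img_a, img_b, frac=0.25):
    """Greatest-to-least ratio of the center-region means (Table 1 footnote)."""
    a, b = center_mean(img_a, frac), center_mean(img_b, frac)
    return max(a, b) / min(a, b)
```

A large ratio indicates that one camera saw the SIB while the other's CMBF blocked it almost completely.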

In the second step, to verify the stereo imaging capability of the dual-aperture system, a statistical test involving human subjects was set up (Fig. 5). The statistical test protocol involved the ability of human subjects to resolve depths when simultaneously viewing two paper clips placed at different distances from the dual-aperture camera. This depth resolution was compared to that from a single-aperture 2D camera under the same experimental settings. The 2D environment was kept the same as the 3-D by simply removing the bipartite filter from the lens system. One paperclip was sequentially placed at fixed distances of 3, 6, 9, 12, 15, 18, and 21 mm from the camera, while the second paperclip was moved closer to and away from the fixed paperclip in increments of 50 microns. That is, if we designate any given position of the first paperclip as "zero," then the second paperclip is moved relative to the first paperclip from −2 mm to +2 mm, with the negative sign indicating the front side of the first paperclip (closer to the camera). A total of 16 human subjects participated in the test and were asked to select the best set of images from both the 3-D and 2D arrangements that showed both paperclips to be at the same distance from the camera. The residual distance between the two paperclips became the depth resolution. The acquired stereo images were displayed on a special 3-D display that superimposes the stereo images using orthogonally polarized light. An observer wore a pair of 3-D glasses containing polarizers matching the images on the display for evaluating the depth perception. The results of this experiment are discussed below.

Fig. 3 A setup to test the effectiveness of the CMBFs as a shutter for the SIBs. (a) A schematic arrangement of the setup. (b) A picture of the actual setup.

Table 1 Intensity maps captured by the two individual cameras. The intensity values are scaled from red being the most intense to blue being the least. [The left- and right-camera intensity-map images are not reproduced here; only the contrast ratios are tabulated.]

WL (nm)   Contrast Ratio(a)
450       12000:1
480       23000:1
520         400:1
560       28000:1
600        2400:1
640         700:1

(a) The ratios are calculated by dividing the greatest by the least average pixel value of the center regions.

Fig. 4 Spectral transmissions of a pair of actual CMBFs. The bell curves superimposed on the transmissions are the SIBs.

2.3 Miniaturized 3-mm CMBF Stereo Imaging System

To demonstrate the feasibility of this technique for developing a smaller, MIN-conducive stereo endoscope system, a first-generation 3-mm lens system was designed from available COTS lens elements and a custom-fabricated bipartite CMBF filter, and packaged inside a rapid-prototyped plastic housing. This was to demonstrate the possibility of miniaturizing a stereo camera into a familiar 4-mm endoscope [Fig. 6(a)]. Individual filters for the miniature lens system were produced by lithographically patterning apertures on larger triple-bandpass filter disks and dicing them into 3 × 1.5 mm rectangular elements. The bipartite CMBF pair was then created by joining complementary halves along the edge, as shown in Fig. 6(b). The apertures were approximately 800 μm in diameter and were separated by 1.2 mm center-to-center to achieve the maximum separation (hence the best disparity) for the given dimension.

An objective lens system was designed from COTS lens elements to image scenes approximately 10 mm away. The lens arrangement was designed to minimize the transmission loss and passband shift suffered by the CMBFs due to steeply angled incident light. The ray-trace diagram on the right of Fig. 1 shows the placement order of the lens elements. The negative lens12 provided a 52-degree FOV. The collimator13 parallelized the angled light exiting from the negative lens, followed by the bipartite CMBF filter. The last element, the achromatic doublet,14 focused different wavelengths onto the same focal plane. All of these elements were assembled inside a rapid-prototyped plastic housing with a rectangular cross section. This cross section was chosen to match the shape of the bipartite CMBF filters. The housing had built-in spacers to position each lens element with respect to the others. The translucent housing was wrapped in aluminum foil to block ambient light from entering [Fig. 6(c)].

A ring light15 that fit tightly around the lens system provided the frontal illumination. A light guide was used to deliver the SIBs to the scene from the tunable filter used in the earlier setup. The proof-of-concept system was designed to include an image sensor at its focal plane. For demonstration purposes, a COTS digital camera16 was modified such that the camera's own lens was replaced by the lens design of the COTS lens elements described above. This allowed the use of the camera's image sensor to capture stereo images generated by the custom CMBF stereo lens system.

A 3-D display17 was selected to display the 3-D images generated from the 3-mm lens system. The 3-D display was configured with two displays, each projecting one of the 3-D images simultaneously onto a beam splitter with oppositely polarized light. An observer wearing a pair of polarized glasses with polarization matched to that of the display was able to see stereo images with depth perception.

3 Results and Discussion

Figure 7 plots the mean values of the depth resolutions (DRs) resolved by the human subjects observing the paper clips through the dual-aperture 3-D camera and through the single-aperture 2D camera. Note in the figure the overall upward trend in the DRs against the working distances (WDs). It is evident that the subjects can resolve DRs better through the dual-aperture 3-D camera than they can through the single-aperture 2D camera up to the intended WD of the cameras, which is 10 mm. It also shows that, beyond 12 mm, the two DRs become close, indicating that the object is out of the WD range of the cameras. One-way analysis of variance (ANOVA) was applied to the data from each imaging setup to confirm the apparent correlation between DRs and WDs (3D: P = 1.06E-7; 2D: P = 5.16E-4; Table 2).
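This kind of one-way ANOVA across working-distance groups can be reproduced with `scipy.stats.f_oneway`. The per-subject values below are synthetic stand-ins (the real data are only summarized in Fig. 7 and Table 2), so the exact statistics will differ from those reported.

```python
import numpy as np
from scipy.stats import f_oneway

# Synthetic stand-in for 16-subject depth-resolution data (mm) at the
# seven working distances used in the test; a deliberate upward trend
# with WD mimics the behavior described in the text.
rng = np.random.default_rng(0)
working_distances = (3, 6, 9, 12, 15, 18, 21)
groups = [0.05 * wd + rng.normal(0.0, 0.05, 16) for wd in working_distances]

# One-way ANOVA: do the DR means differ across working distances?
F, p = f_oneway(*groups)
# A small p-value supports the DR-vs-WD correlation seen in Fig. 7.
```

With real data, each `groups` entry would hold the 16 subjects' DRs at one WD, matching the df = 6 (between) and df = 105 (within) structure of Table 2.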

Fig. 5 An actual setup for the statistical test. The testing environments for the 3-D and 2D are the same except that the CMBF is removed in the 2D case.

Fig. 6 A miniaturized CMBF stereo endoscope arrangement showing COTS 3-mm lens elements inside a rapid-prototyped housing. (a) An opened rapid-prototyped housing with arranged 3-mm lenses on a US quarter. (b) Diced CMBF halves with lithographically patterned apertures joined at the flat edge. (c) An assembled 4 × 4 × 12 mm rapid-prototyped housing with lenses and CMBFs.

Next, paired two-tail T-tests (α = 0.05) were applied to corresponding DRs of the dual- and single-aperture cameras at each WD. The mean values of the two DRs at the 6-mm and 9-mm positions are significantly different (P = 0.00592 and P = 0.0171, respectively; Table 3), but those at all other distances are statistically the same (P > 0.05). Note that the DR at 12 mm is marginally different (P = 0.0610 > 0.05). In conclusion, the statistical analysis shows that the subjects can resolve DRs better through the 3-D camera than through the 2D camera at 6 mm and 9 mm, and marginally better at 12 mm.
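The paired comparison at each WD can be sketched with `scipy.stats.ttest_rel`. The per-subject values below are synthetic stand-ins constructed so that the dual-aperture camera resolves finer depth, mirroring the 6-mm and 9-mm outcomes in Table 3; they are not the measured data.

```python
import numpy as np
from scipy.stats import ttest_rel

# Synthetic paired DRs (mm) for 16 subjects at one working distance.
# The 2D values are offset upward, i.e., the 2D camera resolves depth
# more coarsely than the dual-aperture 3-D camera.
rng = np.random.default_rng(1)
dr_3d = rng.normal(0.8, 0.10, 16)
dr_2d = dr_3d + 0.30 + rng.normal(0.0, 0.05, 16)

# Paired two-tail t-test on per-subject differences (single minus dual),
# matching the mean-difference convention of Table 3.
t, p = ttest_rel(dr_2d, dr_3d)
```

A positive t with p < 0.05 corresponds to the "significantly different" rows of Table 3 (6-mm and 9-mm WDs).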

The dual-aperture 3-D camera can yield better DRs when it can match human stereo acuity (SA). The table in Fig. 7 shows the SAs calculated from the empirical DRs of the dual-aperture camera using Eq. (1). They average to 226 arc-sec (the SA at the 3-mm WD was omitted from the average as an outlier), which is about 11 times larger than an average person's SA of 20 arc-sec.8 The large SA is caused by several factors, such as the small baseline, the less-than-optimum imagery of the COTS lens design, and the performance of the lens outside the camera's intended WDs. The baseline distance between the centers of the two viewpoints used in the calculations was 0.35 mm, which is too small to yield two distinct stereoscopic viewpoints. Secondly, just as human visual acuity can affect the SA, the imaging definition of each of the stereoscopic viewpoints plays a role in determining the SA. We used a coarsely cut and joined bipartite filter at the limiting aperture and a lens mount that does not yield precise gaps among the lens elements. Lastly, the object was out of focus when it was placed nearer to or farther from the camera's intended WD. This was noticeable when the paper clips were placed at the 3-mm position, as is evident in its DR. However, all of these can be improved by custom-designed optics.

Another result verifying the CMBFs' ability to generate stereo images is shown in Fig. 8. The imaging subject is a printed circuit board with discrete electronic components. The left and the right images generated by the two viewpoints were each captured by the entire image plane because of the time-multiplexing technique adopted here. To best depict the 3-D effect of the image generated by the concept presented here, a depth-map program was used to show perceived depth, as would be experienced by an observer, in a 2D projection format. While this approach is prone to erroneous data interpretation along the periphery of the image, it gives credible evidence of providing stereo images. The program generates proportional intensity values corresponding to perceived depth distances [Fig. 8(a)]. The intensity of each pixel was determined by calculating the disparity, or the difference between the two corresponding points in each of the image fields captured by the right and left viewpoints. Once the two points were found, the distance from the image plane to the point in the object field was found by applying a method of triangulation. This depth map provides objective confirmation of the existence of the two viewpoints in a single objective lens. It can be deduced that if the two viewpoints in the 3-D camera were identical, without any depth variation, the depth map would show uniform intensity values, or a plain white depiction.
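The triangulation step described above, converting per-pixel disparity into distance, can be sketched with the standard pinhole-stereo relation Z = f·b/d. This helper is an illustration of that relation, not the authors' depth-map program.

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_mm):
    """Pinhole-stereo triangulation: Z = f * b / d for each pixel.

    `disparity_px` is the per-pixel disparity (in pixels) between the two
    viewpoint images, `focal_px` the focal length in pixels, and
    `baseline_mm` the aperture separation (0.35 mm in the calculations
    above). Zero disparity maps to infinite depth.
    """
    d = np.asarray(disparity_px, dtype=float)
    with np.errstate(divide="ignore"):
        return focal_px * baseline_mm / d
```

Rendering the resulting depth values as gray-scale intensities (darker for farther points) yields a map like Fig. 8(a); the 100-pixel focal length in the usage below is an arbitrary illustrative value.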

Fig. 7 A bar graph comparing the means of the depth resolutions as perceived by human subjects viewing images of a stationary paper clip with respect to a paper clip in motion, from a dual-aperture 3-D camera and from a single-aperture 2D camera.

Table 2 One-way ANOVA applied to each of the dual- and single-aperture data.

ANOVA of the dual-aperture 3-D camera
Source        SS     df    MS      F      P-value   F critical
Btw. Grp.     11.8     6   1.97    8.71   1.06E-7   2.19
Within Grp.   23.8   105   0.227

ANOVA of the single-aperture 2D camera
Source        SS     df    MS      F      P-value   F critical
Btw. Grp.      7.37    6   1.23    4.41   5.16E-4   2.19
Within Grp.   29.2   105   0.278

In this proof-of-concept demonstration, it was noticed that the colors produced in the image were not ideal representations of the colors of the real object. There is a difference in corresponding colors between the left and the right images [Figs. 8(b) and 8(c)]. This was caused by the passbands of the CMBFs not covering the entire visible spectrum. For example, as seen in Fig. 4, the bandpasses of one of the CMBFs omit a large red band while adequately covering the green and the blue bands. This omission leads to partial color domination in the RGB color-rendering scheme, yielding an image with biased colors. While this color bias is somewhat corrected by the human brain when the left and right images are overlapped, higher fidelity in color reproduction can be achieved by carefully selecting the passband ranges of the CMBFs.18,19 This approach is currently under development.

4 Conclusions

We have successfully demonstrated the proof of concept for a no-moving-part, compact stereo imaging camera intended for operation at working distances of approximately 10 mm. The enabling technology is the use of CMBFs in a split-aperture configuration. Statistical analysis of stereo images generated by our camera showed it to have superior performance over the 2D camera arranged with the same setup for working distances in the range of 6 to 12 mm.

Implementation of this method in a 3-mm objective lens was presented as a step towards miniaturization of this stereo imaging system for minimally invasive neurosurgery applications. In this paper, we showed miniaturization of a stereo camera's lens using CMBFs as a first-generation prototype. But a clear pathway exists to achieve complete miniaturization of the system. For example, the illumination can be integrated into the housing, as currently practiced in 2D endoscopes. The image sensors can be obtained as a sensor chip and mounted into the lens housing at a precisely designed location. Accomplishing these two tasks would significantly advance the technology towards complete miniaturization.

Acknowledgments

This research was carried out under funding from the Skull Base Institute of Los Angeles, California. We thank Mr. Victor White and Dr. Kirill Scheglov, who inspired us to conduct this research. We are grateful to Dr. Pantazis Mouroulis at JPL, whose constructive comments helped shape this work. We also thank Mr. Robert Kowalczyk of JPL for his assistance in the laboratory. Mr. Sam Bae gives special thanks to Professor Harold Monbouquette of UCLA for his support and guidance of the project for his PhD work at UCLA. This research was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration. Government sponsorship acknowledged.

References

1. J. I. Shipp, "Single lens stereoscopic video camera," U.S. Patent 5471237 (1995).

2. R. A. Lia, "Endoscope or borescope stereo viewing system," U.S. Patent 5222477 (1993).

3. M. Weissman et al., "Single-axis stereoscopic video imaging system with centering capability," U.S. Patent 6624935 B2 (2003).

4. A. B. Greening and T. N. Mitchell, "Stereoscopic viewing system using a two dimensional lens system," U.S. Patent 5828487 (1998).

5. D. Lee and I. Kweon, "A novel stereo camera system by a biprism," IEEE Trans. Robot. Autom. 16(5), 528–541 (2000).

6. A. Yaron, "Blur spot limitations in distal endoscope sensors," Proc. SPIE 6055, 605509 (2006).

7. H. Shahinian et al., "Stereo imaging miniature endoscope with single chip and conjugated multi-bandpass filters," U.S. Patent Application 2011/0115882 A1 (2011).

Table 3 Two-tail T-tests applied to corresponding DRs of the dual- and single-aperture data.

              3-mm     6-mm     9-mm     12-mm    15-mm    18-mm    21-mm
Mean (M)(a)   0.0531   0.325    0.525    0.331   −0.0281  −0.0531  −0.175
Std. Error    0.114    0.101    0.196    0.163    0.205    0.191    0.255
Median        0.05     0.275    0.350    0.325   −0.150   −0.225   −0.100
Std. Dev.     0.456    0.406    0.783    0.654    0.818    0.764    1.02
C.I.          0.243    0.216    0.417    0.348    0.435    0.407    0.544
Two-tail T    0.466    3.20     2.68     2.03    −0.138   −0.278   −0.685
P-value       0.648    5.92E-3  0.0171   0.0609   0.892    0.785    0.504

(a) Mean difference: an average of the differences obtained by subtracting individual dual-aperture data points from the corresponding single-aperture points.

Fig. 8 (a) A 2D projection depth map of a stereo image captured by the dual-aperture 3-D camera. The scale on the right shows the depth: the darker the gray scale, the farther the point from the camera. (b) Left image. (c) Right image.


8. M. Kytö, M. Nuutinen, and P. Oittinen, "Method for measuring stereo camera depth accuracy based on stereoscopic vision," Proc. SPIE 7864, 78640I (2011).

9. Monochromatic cameras, Model No. DMB21BU04.H, Imaging Source, Inc., Charlotte, NC.

10. Quadruple-band bandpass filters, Model Nos. FF01-390/482/563/640-25 and FF01-440/521/607/700-25, Semrock Inc., Rochester, NY.

11. Verispec, CRI Inc., Woburn, MA.

12. Plano-concave lens, G314-000-000, Qioptiq, Fairport, NY.

13. Plano-convex lens, 3.0 mm Dia × 6.0 mm FL, NT32-953, Edmund Optics Inc.

14. Achromatic doublet lens, 3.0 mm Dia × 6.0 mm FL, NT45-089, Edmund Optics Inc.

15. 0.83" ID ring light guide, NT54-176, Edmund Optics Inc.

16. USB 2.0 0.36 M mono MINI camera, CMOS 640 × 480 VGA pixel resolution with 6.5 μm square pixels, Artray Co. Ltd., Tokyo, Japan.

17. SD1710 Stereo/3-D Display, Planar System Inc., Beaverton, OR.

18. J. K. Hovis, "Review of dichoptic color mixing," Optom. Vis. Sci. 66(3), 181–190 (1989).

19. Y. J. Jung et al., "Quantitative measurement of binocular color fusion limit for non-spectral colors," Opt. Express 19(8), 7325–7338 (2011).

Sam Bae joined the engineering staff at JPL in 2000. He has a BS in engineering physics from the University of California, Berkeley; an MS in mechanical engineering from Purdue University; and an MS in biomedical engineering from the University of California, Los Angeles.

Ron Korniski joined the engineering staff at JPL in 2008. He has a BS in mathematics and physics from Western Michigan University, Kalamazoo, MI; an MS in optical sciences from the University of Arizona, Tucson, AZ; and an MBA from California State University, Pomona, CA. He previously held positions with ITEK, Rockwell International, Optical Research Associates, OPTICS 1, and Science Applications International Corporation. He has been a member of SPIE since the late 1970s.

Allen Ream grew up in Alaska and graduated from high school in 2007. He studied mechanical and aerospace engineering and mathematics at Montana State University. While still an undergraduate, he was granted an internship at NASA JPL through the Space Grant program.

Hrayr K. Shahinian, MD, is director of the Skull Base Institute. Prior to creating the Skull Base Institute on the West Coast in 1996, Dr. Shahinian served as co-director of the Skull Base Institute at the State University of New York at Stony Brook and as an assistant professor of surgery and neurosurgery. Dr. Shahinian earned his BS in biology and chemistry, and his MD, from the American University of Beirut and the University of Chicago. He completed his residencies at Vanderbilt University Medical Center and New York University Medical Center. He then completed two fellowships: the first in skull base surgery at the University of Zurich in Switzerland, and the second in craniofacial surgery at New York University Medical Center. He is board certified by the American Board of Surgery and is a fellow of the American College of Surgeons.

Harish M. Manohara received the Bachelor of Engineering degree in instrumentation technology from Bangalore University, India, in 1989, followed by an MS degree in nuclear engineering in 1992 and a PhD degree in engineering science in 1997 from Louisiana State University, Baton Rouge, Louisiana. After serving as an assistant professor of research at the Center for Advanced Microstructures and Devices (CAMD) at LSU for three years, he joined JPL in 2000 to develop advanced components for THz technology. In 2005 he became the technical group supervisor of the Nano and Micro Systems (NAMS) group at JPL. Dr. Manohara has developed multiple new devices and technologies including carbon nanotube field emitters, nanoelectronic devices, miniature spectroscopic instruments, and MEMS for space, defense, medical, and commercial applications.
