
Beyond Point Clouds - 3D Mapping and Field Parameter Measurements using UAVs

Raghav Khanna, Martin Möller, Johannes Pfeifer, Frank Liebisch, Achim Walter and Roland Siegwart

Abstract—Recent developments in Unmanned Aerial Vehicles (UAVs) have made them ideal tools for remotely monitoring agricultural fields. Complementary advancements in computer vision have enabled automated post-processing of images to generate dense 3D reconstructions in the form of point clouds. In this paper we present a monitoring pipeline that uses a readily available, low-cost UAV and camera to quickly survey a winter wheat field and generate a 3D point cloud from the collected imagery. We present methods for automated crop height estimation from the extracted point cloud and compare our estimates with those obtained using standardized techniques.

Index Terms—Precision agriculture, Micro-Aerial Vehicles, UAVs, crop height, point cloud processing

I. INTRODUCTION

Using UAVs for remote and non-invasive monitoring of agricultural fields has garnered significant interest in recent times. Considerable effort has been directed towards obtaining accurate, dense 3D reconstructions, and commercial tools [1] have become available that provide automated pipelines to produce orthomosaics, digital elevation models and 3D point clouds. However, in order to use these techniques in everyday farm management, crop scientists and farmers need the data in readily digestible formats such as average crop height, canopy cover and above-ground biomass within a given area. Crop phenotyping, the application of automated, high-throughput methods to characterise plant architecture and performance, is currently a focus in crop research and breeding programmes [2], [3]. Automated methods for extracting the above-mentioned and other field indicators from UAV-based measurements will enable breeders and agronomists to frequently collect real-world data about their crops, offering much greater insight into current crop status than has been possible before.

In light of these use cases, the primary contribution of this work is the automated estimation of average crop height for a user-defined partition of a crop field, given its 3D reconstruction based on images taken with a UAV.

The research leading to these results has received funding from the European Community's framework programme Horizon 2020 under grant agreement no. 644227-Flourish. Raghav Khanna is currently a PhD student, Martin Möller is an MSc student, and Roland Siegwart is full professor and head of the Autonomous Systems Lab at ETH Zürich. Johannes Pfeifer is a postdoc, Frank Liebisch is a senior researcher, and Achim Walter is full professor at ETH Zürich and head of the Crop Science Group (email: [email protected], [email protected], [email protected], {johannes.pfeifer, frank.liebisch, achim.walter}@usys.ethz.ch).

Fig. 1. The system used for this work. A DJI Phantom UAV with a GoPro® camera flying over a winter wheat field.

II. BACKGROUND

Crop height estimation using UAVs has previously been attempted using laser rangefinders [4] and sensors mounted on ground vehicles [5]. Laser rangefinders have the advantage of being very accurate within their specified range; however, the lack of corresponding calibrated imagery prevents the information from being integrated into a holistic 3D model of the environment. Ground vehicles can carry a larger payload, but their intrusive nature limits the frequency of measurements, has an adverse impact on the soil, and renders them useful only during the initial growth phase of crops that close the inter-row gap during the season. In light of these considerations, we utilize a system consisting of a high-resolution monocular camera and a low-cost UAV, which can monitor large areas rapidly and, thanks to the rich visual information, generate user-readable maps of the quantities of interest.

III. HEIGHT ESTIMATION PIPELINE

A. Data Collection and 3D reconstruction

The data was collected using a high-resolution (11 megapixel) GoPro Hero® 2 camera (GoPro, Inc., USA) with a wide-angle (fisheye) lens mounted on a low-cost UAV platform (DJI Phantom). The combination of a wide-angle lens with a high-resolution sensor gives a small ground sampling distance (1–5 cm) while maintaining a high overlap (∼80%) between consecutive images, both of which are essential in order to obtain satisfactory 3D reconstructions of the plants. Ground control points were placed at known locations on the field in order to geolocate the reconstruction and derive scale information.
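For intuition about these numbers, the ground sampling distance follows directly from the camera geometry and flight altitude. The snippet below is a back-of-the-envelope sketch in Python; the sensor parameters are illustrative assumptions, not the exact GoPro Hero 2 specification.

```python
# Back-of-the-envelope sketch: ground sampling distance (GSD) and the exposure
# spacing needed for a given forward overlap. Sensor values below are
# illustrative assumptions, not the exact GoPro Hero 2 specification.

def gsd_cm(sensor_width_mm: float, focal_mm: float,
           altitude_m: float, image_width_px: int) -> float:
    """Ground size of one pixel in cm: sensor_width * altitude / (focal * pixels)."""
    return 100.0 * sensor_width_mm * altitude_m / (focal_mm * image_width_px)

def trigger_spacing_m(footprint_m: float, overlap: float = 0.8) -> float:
    """Distance between consecutive exposures for the desired forward overlap."""
    return (1.0 - overlap) * footprint_m

gsd = gsd_cm(sensor_width_mm=6.2, focal_mm=2.8, altitude_m=20.0, image_width_px=3840)
footprint = gsd / 100.0 * 3840  # image footprint on the ground in metres
print(f"GSD ~{gsd:.1f} cm, trigger every ~{trigger_spacing_m(footprint):.1f} m")
```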


Fig. 2. Dense point cloud of a winter wheat field used for genotype trials, generated from aerial images using Pix4Dmapper by Pix4D. The numbers indicate the plot indices used for referencing.

The images were post-processed using the commercially available photogrammetry software Pix4Dmapper [1] to obtain dense 3D point clouds such as the one shown in Figure 2.

For this purpose, the software searches for points that are recognized in several images in order to estimate their 3D coordinates. It further takes into account the positions of these points in the individual images in order to estimate the calibration of the camera. The camera model in turn is used to optimize the 3D map; in particular, it corrects for the radial distortion introduced by the wide-angle lens, so that this effect does not degrade the quality of the reconstruction.

B. Soil and Plant Segmentation

In order to extract the plant height, the soil points must be distinguished from those corresponding to the plants. We base the segmentation on RGB data, using the excess green index introduced in [6] to determine an intensity value $I$ for each point $i$ in the point cloud:

$$I(i) = 2G(i) - R(i) - B(i)$$

In the resulting intensity point cloud, green plants have a high intensity value, in contrast to the low value of the background, which includes the soil surface, shadows, stones and other debris. Once this intensity value, corresponding to the likelihood of a point being a plant point, is determined, we use Otsu's method [7] to determine a global threshold and extract a binary point cloud from the coloured one. This method has the advantage of being fully automated while giving accurate segmentation for images and point clouds collected under varying illumination conditions and with different cameras, as observed in Figure 4. An example of such a segmented point cloud is shown in Figure 3.
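As a concrete illustration of this step, the following is a minimal sketch of the ExG-plus-Otsu segmentation, assuming NumPy arrays for the cloud's coordinates and RGB colours and scikit-image's `threshold_otsu`; the function name and array layout are assumptions of this sketch, not the authors' code.

```python
# Illustrative sketch: excess-green segmentation of an RGB point cloud followed
# by Otsu thresholding. `points` is an (N, 3) array of xyz coordinates and
# `colors` an (N, 3) array of RGB values.
import numpy as np
from skimage.filters import threshold_otsu

def segment_plants(points: np.ndarray, colors: np.ndarray):
    """Split a coloured point cloud into (plant_points, soil_points)."""
    r, g, b = (colors[:, c].astype(float) for c in range(3))
    exg = 2.0 * g - r - b            # excess green index I(i) = 2G - R - B
    t = threshold_otsu(exg)          # global threshold from the ExG histogram
    plant_mask = exg > t             # high ExG -> vegetation, low -> soil/debris
    return points[plant_mask], points[~plant_mask]
```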

C. Ground level and Plant Height Estimation

Our pipeline can handle a 3D reconstruction generated purely from image data, which usually lacks information about the global orientation of the scene. This can be the case if no GPS or other geolocation data was recorded during data acquisition. In particular, the global orientation of the scene does not need to be known a priori.

The first step, then, is to determine the global orientation of the scene with respect to the local gravity vector.

Fig. 3. Segmented point cloud generated using automated thresholding based on the excess green index and Otsu's method. Green points represent winter wheat and pale brown represents the soil (ground).

Fig. 4. Image overlaying the automatically segmented vegetation (bright green) onto the original image.

In order to recover this information, a linear regression surface is fitted through all vertices corresponding to the ground points. This surface is further used as an approximation of the mean ground level. A point on this surface with coordinates $(x_A \; y_A \; z_A)^T$ in the original coordinate frame $A$ is described via the affine model

$$z_A = \beta_0 + \beta_1 x_A + \beta_2 y_A.$$

Here $\vec{\beta} = (\beta_0 \; \beta_1 \; \beta_2)^T$ is constant and minimizes the sum of squared residuals between the vertices and the regression surface, while $\beta_0$ describes the offset from the origin. The resulting approximation is shown in Figure 5. All vertices of the point cloud are expressed in a new coordinate frame $B$ via the transformation

$$\begin{pmatrix} x_B \\ y_B \\ z_B \end{pmatrix} = R_{AB} \begin{pmatrix} x_A \\ y_A \\ z_A - \beta_0 \end{pmatrix},$$

where $R_{AB}$ is the following rotation matrix:

$$R_{AB} = \begin{pmatrix}
\sqrt{\dfrac{1+\beta_2^2}{1+\beta_1^2+\beta_2^2}} & \dfrac{-\beta_1 \beta_2}{\sqrt{(1+\beta_1^2+\beta_2^2)(1+\beta_2^2)}} & \dfrac{\beta_1}{\sqrt{(1+\beta_1^2+\beta_2^2)(1+\beta_2^2)}} \\
0 & \dfrac{1}{\sqrt{1+\beta_2^2}} & \dfrac{\beta_2}{\sqrt{1+\beta_2^2}} \\
\dfrac{-\beta_1}{\sqrt{1+\beta_1^2+\beta_2^2}} & \dfrac{-\beta_2}{\sqrt{1+\beta_1^2+\beta_2^2}} & \dfrac{1}{\sqrt{1+\beta_1^2+\beta_2^2}}
\end{pmatrix}$$
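The rotation and translation above can be assembled directly from the fitted plane coefficients. A minimal sketch, assuming NumPy, (N, 3) point arrays and a function name of my own choosing; this is a reconstruction of the described procedure, not the authors' code.

```python
# Illustrative sketch: fit the affine ground model z = b0 + b1*x + b2*y to the
# soil points by least squares, then express all points in the gravity-aligned
# frame B whose x-y plane is the mean ground level.
import numpy as np

def align_to_ground(ground_pts: np.ndarray, all_pts: np.ndarray) -> np.ndarray:
    x, y, z = ground_pts[:, 0], ground_pts[:, 1], ground_pts[:, 2]
    A = np.column_stack([np.ones_like(x), x, y])
    b0, b1, b2 = np.linalg.lstsq(A, z, rcond=None)[0]  # beta minimizing residuals

    n = np.sqrt(1.0 + b1**2 + b2**2)  # norm of the plane normal (-b1, -b2, 1)
    m = np.sqrt(1.0 + b2**2)
    R = np.array([                    # rows match R_AB given above
        [m / n,   -b1 * b2 / (n * m), b1 / (n * m)],
        [0.0,      1.0 / m,           b2 / m],
        [-b1 / n, -b2 / n,            1.0 / n],
    ])
    shifted = all_pts - np.array([0.0, 0.0, b0])  # subtract the offset beta_0
    return shifted @ R.T                           # coordinates (x_B, y_B, z_B)
```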

The new coordinate frame is chosen such that the $x_B$–$y_B$ plane corresponds to the mean ground level. It is a priori unknown whether the $z_B$-axis points upwards or downwards.

On the test field the winter wheat was arranged within rectangular plots of approximately 1.50 m × 1.00 m in size, each of which contained one particular genotype. In order to analyze separate plots, a coarse tessellation is first conducted, i.e. the point cloud is subdivided into smaller clouds, each of which contains the vertices corresponding to one plot.


Fig. 5. Linear approximation of the ground surface to determine a scene's global orientation.

Fig. 6. Plot of the height distribution of the plant points. The percentiles from 0 to 100 are plotted versus their corresponding height values. 99% of the plant points are below the height marked by the red line. The top 1% of the points are excluded to account for the possibility of outliers.

A second, finer tessellation is then performed to subdivide each plot into smaller rectangles of size 10 cm × 10 cm. These dimensions are chosen to approximate a ground area on the order of individual plants while still containing enough green points to be representative of the real-world geometry. The plant height can then be calculated as the distance between the upper vertices within a small tile and the ground.

Since all plants are potentially surrounded by outlier points, it is not desirable to consider the point with the maximal $z_B$ coordinate. An analysis of different cumulative histograms of the point heights showed that typically 1% of the points could be considered outliers. Thus, in order to decrease the influence of outliers, the 99th height percentile was chosen to represent the top leaf. The cumulative height distribution for one arbitrarily chosen plot is shown in Figure 6.
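A minimal sketch of the fine tessellation and the per-tile 99th-percentile extraction, under the same NumPy assumptions as above; the helper and its dictionary return type are illustrative choices, not the authors' implementation.

```python
# Illustrative sketch: tessellate a plot's plant points into 10 cm x 10 cm
# tiles in the x_B-y_B plane and take the 99th height percentile per tile as a
# robust "top leaf" height, discarding the top 1% as potential outliers.
import numpy as np

def tile_heights(plant_pts: np.ndarray, tile: float = 0.10) -> dict:
    """Map (i, j) tile indices to the 99th-percentile z_B value of each tile."""
    ij = np.floor(plant_pts[:, :2] / tile).astype(int)  # tile index per point
    heights = {}
    for key in set(map(tuple, ij)):
        mask = np.all(ij == key, axis=1)
        heights[key] = np.percentile(plant_pts[mask, 2], 99)
    return heights
```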

Additionally, since the soil is not perfectly flat, a more detailed representation of the soil geometry is needed, rather than simply calculating the distance between the top leaf and the $x_B$–$y_B$ plane. Ideally, this representation would be based on the soil vertices directly below the plants. Unfortunately, these are typically very sparse, since the plants above occlude the camera's view from the UAV.

Fig. 7. Local regression surface along with the corresponding ground points from the point cloud for one plot. One can observe the large variation in z coordinates of the ground points from the reconstruction, creating the need for local regression surfaces for accurate height estimation.

In order to model the ground surface based on the vertices within a plant's neighbourhood, a polynomial regression over the ground points within each plot is carried out. As a first approach, a simple second-order two-dimensional polynomial of the form

$$p(x, y) = \beta_0 + \beta_1 x + \beta_2 y + \beta_3 x^2 + \beta_4 x y + \beta_5 y^2$$

is fitted to the vertices. For this purpose, given the coordinates $(x_{B,i} \; y_{B,i} \; z_{B,i})^T$ for all points $i$, a matrix of the following form is generated:

$$X = \begin{pmatrix}
1 & x_{B,1} & y_{B,1} & x_{B,1}^2 & x_{B,1} y_{B,1} & y_{B,1}^2 \\
1 & x_{B,2} & y_{B,2} & x_{B,2}^2 & x_{B,2} y_{B,2} & y_{B,2}^2 \\
\vdots & & & & & \vdots \\
1 & x_{B,n} & y_{B,n} & x_{B,n}^2 & x_{B,n} y_{B,n} & y_{B,n}^2
\end{pmatrix}$$

Here $n$ denotes the number of points within the observed plot. The least-squares solution to the problem is found as

$$\vec{\beta} = (\beta_0, \ldots, \beta_5)^T = (X^T X)^{-1} X^T \vec{z}_B,$$

where $\vec{z}_B = (z_{B,1} \ldots z_{B,n})^T$ contains the height values to be approximated. Figure 7 shows one obtained regression surface. The method used above can easily be extended to other functions (such as higher-order polynomials) by including the corresponding columns in the matrix $X$. The plant height is finally calculated as $h = z_{B,99} - p(x_{B,99}, y_{B,99})$ using the coordinates of the 99th-percentile point.

IV. ASSESSMENT OF HEIGHT ESTIMATES

The height estimates from the pipeline described above were compared with yardstick measurements taken right after the UAV flight. Ten plots were considered for a first evaluation. Five yardstick measurements were taken per plot: four near the corners and one in the middle. The mean and standard deviation were calculated for each plot based on these five values.

These were then contrasted with the mean heights and standard deviations for the same ten plots computed with the method described above. However, in order to increase the accuracy of this comparison, not all plants were taken into account when computing the means and standard deviations.


[Fig. 8 plot area: Height [cm] (0–25) versus plot index (1–10); series: Estimate, Ground truth.]

Fig. 8. Comparison of the average height estimates per plot with yardstick measurements for ten different plots. The error bars represent a deviation of one standard error from the mean values.

Instead, just as for the yardstick measurements, five plant height estimates per plot were collected: four from the corners and one from the middle. These were chosen to be the height maxima within certain neighbourhoods around the corners and the middle, respectively. This setup aims at comparing the yardstick measurements with their actual estimated counterparts. Naturally, these estimates depend on the size of the reference neighbourhood. Taking the maximum height out of nine plants, i.e. considering a 3 × 3 neighbourhood, led to an underestimation of the plant height. However, for 5 × 5 plant neighbourhoods, in eight out of ten plots the ground truth means were contained in the confidence intervals of the estimates (and vice versa), as shown in Figure 8.
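For concreteness, below is a sketch of how such neighbourhood maxima might be sampled from a per-tile height grid; the grid layout, the NaN convention for empty tiles and the function name are assumptions of this sketch, not the authors' evaluation code.

```python
# Illustrative sketch: emulate the five yardstick sampling positions by taking,
# at each position, the maximum tile height within a k x k tile neighbourhood,
# then compare per-plot statistics against the yardstick values.
import numpy as np

def sampled_heights(tile_grid: np.ndarray, positions, k: int = 5) -> np.ndarray:
    """tile_grid: 2D array of per-tile heights for one plot (NaN = empty tile);
    positions: (row, col) tile indices, e.g. four corners plus the centre."""
    r = k // 2
    samples = []
    for i, j in positions:
        window = tile_grid[max(i - r, 0):i + r + 1, max(j - r, 0):j + r + 1]
        samples.append(np.nanmax(window))  # tallest plant in the neighbourhood
    return np.array(samples)

# Per-plot mean and standard deviation, mirroring the yardstick statistics:
# samples.mean(), samples.std(ddof=1)
```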

V. CONCLUSION AND FUTURE WORK

This paper outlines a framework for estimating crop height using data collected with a setup consisting of a readily available monocular camera and UAV. We present algorithms for automatically post-processing the point clouds produced by commercially available photogrammetry software into crop height estimates. The segmentation algorithms which distinguish plants from the soil are inherently index-agnostic, and future work could examine the efficacy of different indices for soil-plant segmentation. After segmentation and alignment, average crop heights for customizable plot sizes are automatically estimated and presented in the form of user-readable maps, such as the one in Figure 9, useful for agronomists, farmers and crop scientists. The results are shown to compare favourably with manual measurements.

Future work will focus on the estimation of other essential crop parameters such as canopy cover and above-ground biomass, and will investigate the use of multispectral indices such as the Normalised Difference Vegetation Index for soil-plant segmentation and crop health indication.

REFERENCES

[1] Point clouds generated using Pix4Dmapper by Pix4D. [Online]. Available: http://www.pix4d.com/

[2] A. Walter, B. Studer, and R. Kölliker, “Advanced phenotyping offers opportunities for improved breeding of forage and turf species,” Annals of Botany, p. mcs026, 2012.

[3] A. Walter, F. Liebisch, and A. Hund, “Plant phenotyping: from bean weighing to image analysis,” Plant Methods, vol. 11, no. 1, p. 14, 2015.

Fig. 9. User-readable height map generated using our pipeline, making it easier for agronomists and farmers to take informed decisions during farm management or when rating breeds during experiments.

[4] D. Anthony, S. Elbaum, A. Lorenz, and C. Detweiler, “On crop height estimation with UAVs,” in Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2014, pp. 4805–4812.

[5] J. Dong, L. Carlone, G. C. Rains, T. Fitzgerald, and F. Dellaert, “4D mapping of fields using autonomous ground and aerial vehicles,” in CSBE/ASABE Joint Meeting Presentation, 2014.

[6] A. A. Gitelson, “Wide dynamic range vegetation index for remote quantification of biophysical characteristics of vegetation,” Journal of Plant Physiology, vol. 161, no. 2, pp. 165–173, 2004.

[7] N. Otsu, “A threshold selection method from gray-level histograms,” IEEE Transactions on Systems, Man, and Cybernetics, vol. 9, no. 1, pp. 62–66, 1979.