
Extrinsic and Temporal Calibration of Automotive Radar and 3D LiDAR

Chia-Le Lee*, Yu-Han Hsueh*, Chieh-Chih Wang, Wen-Chieh Lin

Abstract— While automotive radars are widely used in most assisted and autonomous driving systems, only a few works have been proposed to tackle the calibration problems of automotive radars with other perception sensors. One of the key challenges in calibrating automotive planar radars with other sensors is the missing elevation angle in 3D space. In this paper, extrinsic calibration is accomplished based on the observation that radar cross section (RCS) measurements have different value distributions across the radar's vertical field of view. An approach to accurately and efficiently estimate the time delay between radars and LiDARs based on the spatial-temporal relationships of calibration target positions is proposed. In addition, a localization method for calibration target detection and localization in pre-built maps is proposed to tackle insufficient LiDAR measurements on calibration targets. The experimental results show the feasibility and effectiveness of the proposed radar-LiDAR extrinsic and temporal calibration approaches.

Index Terms— Radar, LiDAR, Extrinsic Calibration, Temporal Calibration.

I. INTRODUCTION

Sensor fusion is essential to enhance and ensure the performance and reliability of the machine perception modules in assisted and autonomous driving, and both offline and online calibration are among the most critical procedures for sensor fusion. There have been a number of approaches and systems for accomplishing calibration between cameras, LiDARs, and IMUs [1]–[7]. While automotive mmWave radars are widely used in assisted and autonomous driving, only a few works address calibration between radars and other perception sensors [8]–[13]. In this paper, we focus on both extrinsic and temporal calibration between a planar automotive radar and a 3D LiDAR. Fig. 1 shows the test vehicle with the sensors used in this work.

One of the most challenging issues in this extrinsic calibration setting is that planar radars only offer two-dimensional geometric measurements and one-dimensional radial velocity measurements, yet a rigid transformation between the coordinate systems of the planar radar and the 3D LiDAR must be estimated. To solve this issue, Peršić et al. [12],

*Authors have contributed equally and names are in alphabetical order.

Chia-Le Lee is with the Institute of Electrical and Computer Engineering, National Chiao Tung University, Hsinchu, Taiwan. E-mail: [email protected]

Yu-Han Hsueh is with the Institute of Electrical and Control Engineering, National Chiao Tung University, Hsinchu, Taiwan. E-mail: [email protected]

Chieh-Chih Wang is with the Department of Electrical and Computer Engineering, National Chiao Tung University, and with the Mechanical and Mechatronics Systems Research Laboratories, Industrial Technology Research Institute, Hsinchu, Taiwan. E-mail: [email protected]

Wen-Chieh Lin is with the Department of Computer Science, National Chiao Tung University, Hsinchu, Taiwan. E-mail: [email protected]

Fig. 1. Sensors on the test vehicle. One Velodyne HDL-32E LiDAR and three Delphi ESR radars are used in this work.

Fig. 2. The mismatch problem between radar measurements (red spheres) and LiDAR measurements (black points) due to the time delay.

[13] proposed to use radar echo intensities, i.e., radar cross section (RCS) values, to infer the missing dimension of the planar radar, with which the transformations between a radar, a LiDAR, and a camera could be estimated. As their data collection process proceeds by driving a robot in front of the targets and acquiring data only when the sensors and the target are both stationary, they are able to avoid the influence of the time delay between radar and LiDAR. However, temporal calibration is also essential for sensor fusion. Fig. 2 shows a mismatch between the radar measurements and the LiDAR measurements due to the time delay between the two sensors despite good extrinsic calibration.

Most of the recent works on temporal calibration between cameras, LiDARs, or IMUs are trajectory-alignment based approaches [14]–[18]. Following the same trajectory-alignment principles, we present an approach to accomplish temporal calibration between an automotive radar and a 3D LiDAR. Because motion and trajectory estimation using radar are less accurate than with cameras and IMUs, a target-based approach is proposed. Due to the uncertainty of radar, rapid


Fig. 3. The targets are placed in an open area for (a) extrinsic calibration and (b) temporal calibration.

motions are preferred during data collection to ensure the observability of the time delay. Since the effect of the time delay is magnified for long-range objects, the targets are placed at both short range and long range as shown in Fig. 3(b). While the radar can perceive the target at long range, the LiDAR may be unable to detect it owing to sparse LiDAR measurements. Hence, a target localization approach is proposed to deal with this issue.

The contributions of this paper are: (1) a temporal calibration procedure that estimates the time delay accurately and efficiently, together with an analysis of the required sensor motion; (2) calibration target detection and localization in pre-built maps; (3) to the best of our knowledge, this is the first work to address both the extrinsic and temporal calibration problems of automotive radar and LiDAR.

II. ESTABLISHING CORRESPONDENCE

To calibrate radars and LiDARs, a target that induces reliable, distinguishable, and sensitive measurements from both sensors is desirable. We adopt the calibration target designed in [12], [13], which includes a corner reflector and a triangular styrofoam board as shown in Fig. 4. For radars, the corner reflector's reflective characteristics ensure that it is easily detected; for LiDARs, the triangular styrofoam board is used to estimate the pose of the target. Since the styrofoam material is invisible to the radar, the corner reflector is placed behind the triangular styrofoam board as shown in Fig. 4(b), so the radar and the LiDAR can perceive the target without mutual interference. The centers of the corner reflector and the triangular styrofoam board are well aligned and assumed to be at the same location. Accordingly, the target centers from radar and LiDAR measurements can be estimated and used to establish correspondences.

Fig. 4. Calibration target used in this paper: a triangular styrofoam board fixed in front of a corner reflector, as the styrofoam is invisible to radar. (a) Front view. (b) Back view.

A. Estimating the target center from the radar measurements

A radar measurement includes the range to the center of the object, $^r r_r$, and its azimuth angle, $^r\phi_r$, where the left superscript indicates that the quantity resides in the radar frame $F_r$, and the right subscript indicates that the measurement is derived from the radar. In addition, the RCS value $\delta_r$, which represents the reflection intensity of the object, is also provided by the radar. Since automotive radars do not provide vertical information about the object (i.e., the elevation angle is missing), each radar measurement corresponds to an arc in 3D space on which the object can possibly reside. When using radars for object detection and tracking, the projected point on the radar plane $P_r$ (of zero elevation angle) is often used as the position of an object. When calibrating an automotive radar with a 3D LiDAR, the missing elevation angle causes a non-negligible estimation error in the vertical components of the 6-DoF transformation. Nevertheless, the vertical components, namely the z-translation, pitch, and roll, can still be refined by taking RCS values into account; the details are discussed in Section III-B. In our calibration setting, targets are stationary and tracked during the data collection process. At time $t_i$, the measurement of the $k$-th target, $^r s_{r_k}(t_i) = [\,^r r_{r_k}(t_i)\;\; ^r\phi_{r_k}(t_i)\,]$, together with the RCS value $\delta_{r_k}(t_i)$, is collected for extrinsic and temporal calibration.
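To make this geometry concrete, the following minimal Python sketch (not from the paper; the function names are hypothetical, and the ±4.5° vertical half-FoV is taken from the Delphi ESR specification cited later in [23]) projects a planar radar return onto the radar plane and samples the 3D arc of candidate positions consistent with a single measurement:

```python
import numpy as np

def radar_point_on_plane(r, phi):
    """Project a planar radar return (range r [m], azimuth phi [rad])
    onto the radar plane P_r (elevation = 0), as commonly done when
    the elevation angle is unobserved."""
    return np.array([r * np.cos(phi), r * np.sin(phi)])

def radar_arc_points(r, phi, fov_elev=np.deg2rad(4.5), n=50):
    """Sample the arc of 3D positions consistent with one planar radar
    measurement, over the assumed vertical FoV [-fov_elev, +fov_elev]."""
    psi = np.linspace(-fov_elev, fov_elev, n)   # unknown elevation angle
    x = r * np.cos(psi) * np.cos(phi)
    y = r * np.cos(psi) * np.sin(phi)
    z = r * np.sin(psi)
    return np.stack([x, y, z], axis=1)
```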

B. Estimating the target center from the LiDAR measurements

In contrast to the radar's sparse measurements, LiDAR provides dense 3D point measurements. However, the LiDAR measurements can still become sparse for objects located at long range, decreasing the accuracy of target detection and pose estimation. In this work, the target center is estimated via LiDAR localization given the target pose with respect to a pre-built map, rather than by detecting the target center directly. By doing so, the target center can be estimated even when it is far away from the LiDAR, and overlap between the radar and LiDAR fields of view is not necessary. In the data collection process, it is only necessary to ensure that the targets can be perceived by the radar. The approach is divided into two steps:

1) Mapping and localization: By using a scan matching algorithm, e.g., Iterative Closest Point (ICP) [19] or the Normal Distributions Transform (NDT) [20], a point cloud map of the environment with multiple calibration targets is built. After localization is done, the transformation $^l T_m(t_j)$ from the map frame $F_m$ to the LiDAR frame $F_l$ at time $t_j$ can be computed.

2) Estimation of the target center: The target centers at each LiDAR scan are estimated by transforming the target centers from the map frame $F_m$ to the LiDAR frame $F_l$. To obtain the target centers in the map, a triangle model is defined and optimized to fit the edge points of the triangular board, which yields the pose of the target center. We denote the position of the $k$-th target in the map as $^m p_{l_k}$; the target center in the LiDAR frame at time $t_j$ is then estimated by:

$$^l p_{l_k}(t_j) = {}^l T_m(t_j) \cdot {}^m p_{l_k} \qquad (1)$$
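As a minimal sketch of Eq. (1), assuming a 4x4 homogeneous-transform representation (the representation and function name are illustrative, not the paper's implementation):

```python
import numpy as np

def target_center_in_lidar(T_l_m, p_m):
    """Eq. (1): map the k-th target center from the map frame F_m into
    the LiDAR frame F_l using the localization result l_T_m(t_j).
    T_l_m: 4x4 homogeneous transform; p_m: 3-vector in the map frame."""
    return (T_l_m @ np.append(p_m, 1.0))[:3]
```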

C. Correspondence establishment

In order to associate the data from the two sensors, the target centers from the LiDAR measurements are transformed into the radar frame by an initial approximate transformation $^r T_l$. Since different parameterization choices would cause different uncertainty in the refined parameters, as elaborated in [13], the recommended parameterization is adopted. The transformation $^r T_l$ from $F_l$ to $F_r$ consists of two parts, $[\,^r t_l\;\; ^l\Theta_r\,]$, where $^r t_l = [t_x\; t_y\; t_z]$ denotes the translation from $F_l$ to $F_r$ and $^l\Theta_r = [\theta_z\; \theta_y\; \theta_x]$ denotes the rotation from $F_r$ to $F_l$. The Euler angle representation is used for the rotation part $^l\Theta_r$, where $\theta_z$, $\theta_y$, $\theta_x$ are yaw, pitch, and roll, and the rotation matrix is given by:

$$R(^l\Theta_r) = R_z(\theta_z)\,R_y(\theta_y)\,R_x(\theta_x) \qquad (2)$$

The target center $^l p_{l_k}(t_j)$ estimated from the LiDAR measurements is transformed into the radar frame as $^r p_{l_k}(t_j)$:

$$^r p_{l_k}(t_j) = {}^r T_l \cdot {}^l p_{l_k}(t_j) \qquad (3)$$
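A minimal Python sketch of Eqs. (2)-(3) follows. The 4x4 assembly and the frame-direction handling (since $R(^l\Theta_r)$ rotates $F_r$ into $F_l$, its transpose is assumed here to map LiDAR coordinates into the radar frame) are illustrative interpretations, not the paper's code:

```python
import numpy as np

def rot_zyx(theta_z, theta_y, theta_x):
    """Eq. (2): rotation matrix from ZYX Euler angles (yaw, pitch, roll)."""
    cz, sz = np.cos(theta_z), np.sin(theta_z)
    cy, sy = np.cos(theta_y), np.sin(theta_y)
    cx, sx = np.cos(theta_x), np.sin(theta_x)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    return Rz @ Ry @ Rx

def transform_r_T_l(t_r_l, Theta_l_r):
    """Assemble r_T_l from the mixed parameterization [r_t_l, l_Theta_r].
    The transpose of R(l_Theta_r) is used for the l -> r direction
    (an assumed convention for this sketch)."""
    T = np.eye(4)
    T[:3, :3] = rot_zyx(*Theta_l_r).T
    T[:3, 3] = t_r_l
    return T

def to_radar_frame(T_r_l, p_l):
    """Eq. (3): transform a LiDAR-frame target center into the radar frame."""
    return (T_r_l @ np.append(p_l, 1.0))[:3]
```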

The radar measurements within a predefined distance threshold of $^r p_{l_k}(t_j)$ are checked, and the closest one is chosen as the target center estimate from the radar; the radar-LiDAR correspondence is thereby established. As the target centers from the radar measurements are described in a spherical coordinate system, $^r p_{l_k}(t_j)$ is converted to its spherical representation $^r s_{l_k}(t_j) = [\,^r r_{l_k}(t_j)\;\; ^r\phi_{l_k}(t_j)\;\; ^r\psi_{l_k}(t_j)\,]$ for the reprojection process in Section III-A and the temporal calibration in Section IV. In addition, since the timestamps of radar measurements and LiDAR measurements are discrete and different, linear interpolation [21] is used so that each correspondence is built at the same timestamp. It should be noted that uncertainties in the map and the localization module could degrade the performance of target localization; the accuracy of the map and the localization approaches should be analyzed and could be further improved to obtain better calibration results. Once the correspondences between radar and LiDAR measurements are established, extrinsic and temporal calibration can be accomplished using these correspondences.
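The correspondence step can be sketched as follows (a simplified illustration: the gating threshold, array layouts, and per-axis linear interpolation are assumptions, not the paper's implementation):

```python
import numpy as np

def to_spherical(p):
    """Convert a radar-frame point to (range, azimuth, elevation)."""
    r = np.linalg.norm(p)
    return np.array([r, np.arctan2(p[1], p[0]), np.arcsin(p[2] / r)])

def associate(radar_xy, p_target, max_dist=1.0):
    """Choose the radar return (on the radar plane) closest to the predicted
    target center, gated by a hypothetical threshold [m]; None if no match."""
    d = np.linalg.norm(radar_xy - p_target[:2], axis=1)
    i = int(np.argmin(d))
    return i if d[i] < max_dist else None

def interp_centers(t_query, t_lidar, centers):
    """Per-axis linear interpolation of LiDAR target centers so that both
    sides of a correspondence share the same timestamp (cf. [21])."""
    return np.stack([np.interp(t_query, t_lidar, centers[:, d])
                     for d in range(centers.shape[1])], axis=-1)
```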

III. EXTRINSIC CALIBRATION

In this work, it is assumed that the intrinsic calibrations of the radar and the LiDAR are provided by the manufacturers, so we focus only on the extrinsic and temporal calibration between radar and LiDAR. Following [12], [13], extrinsic calibration is formulated as a two-stage optimization problem consisting of reprojection error optimization and RCS error optimization. In the reprojection error optimization, our method follows [12], [13]. For the RCS error optimization, [13] represents the RCS-elevation dependence as a second-order polynomial and optimizes the squared distance between the measured and expected RCS.

Our method instead uses subsets of the correspondences, selected by RCS value, that should lie along the central and boundary planes of the elevation FoV, and optimizes the squared distance between the transformed LiDAR data and the corresponding plane. The details are described in this section.

A. Reprojection Error Optimization

For readability, the reprojection error optimization is briefly described. In this stage, the current estimated extrinsic parameters are represented as a six degree-of-freedom (DoF) parameter set $c_p = [\,^r t_l\;\; ^l\Theta_r\,]$. Accordingly, the LiDAR data $^r s_{l_k}(t_i) = [\,^r r_{l_k}(t_i)\;\; ^r\phi_{l_k}(t_i)\;\; ^r\psi_{l_k}(t_i)\,]$ for each correspondence can be obtained and projected onto the radar plane by ignoring the elevation angle $^r\psi_{l_k}(t_i)$. The reprojection error term is then defined as the distance between the projected LiDAR data and the radar data on the radar plane, on which $^r\psi_{l_k}(t_i) = {}^r\psi_{r_k}(t_i) = 0°$:

$$\varepsilon_{ik}(c_p) = \|p_r - p_l\| \qquad (4)$$

where

$$p_r = \begin{bmatrix} {}^r r_{r_k}(t_i)\cos({}^r\phi_{r_k}(t_i)) \\ {}^r r_{r_k}(t_i)\sin({}^r\phi_{r_k}(t_i)) \end{bmatrix}, \quad p_l = \begin{bmatrix} {}^r r_{l_k}(t_i)\cos({}^r\phi_{l_k}(t_i)) \\ {}^r r_{l_k}(t_i)\sin({}^r\phi_{l_k}(t_i)) \end{bmatrix}$$

The calibration parameters $c_p$ are obtained by minimizing the cost function using the Levenberg-Marquardt (LM) algorithm implemented in the Ceres Solver [22]:

$$\hat{c}_p = \arg\min_{c_p} \sum_{i=0}^{N-1}\sum_{k=0}^{K-1} \varepsilon_{ik}^2, \qquad (5)$$

where the cost function is the sum of squared reprojection errors over the $N \times K$ correspondences.
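Concretely, this first stage can be prototyped as a small nonlinear least-squares problem. The sketch below is an assumed Python illustration: the paper itself uses the Ceres Solver in C++, so SciPy's LM routine stands in for it, and the array layouts and names are hypothetical. It evaluates the residuals of Eq. (4) and minimizes Eq. (5):

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def reprojection_residuals(cp, radar_obs, lidar_pts):
    """Residuals of Eqs. (4)-(5). cp = [tx, ty, tz, yaw, pitch, roll];
    radar_obs: (M, 2) radar (range, azimuth) per correspondence;
    lidar_pts: (M, 3) target centers in the LiDAR frame. Assumes the
    rotation maps LiDAR coordinates into the radar frame."""
    t = cp[:3]
    R = Rotation.from_euler("ZYX", cp[3:]).as_matrix()
    p = lidar_pts @ R.T + t                      # centers in the radar frame
    pl = p[:, :2]                                # drop elevation: project onto P_r
    pr = np.column_stack([radar_obs[:, 0] * np.cos(radar_obs[:, 1]),
                          radar_obs[:, 0] * np.sin(radar_obs[:, 1])])
    return (pr - pl).ravel()

# LM optimization from an initial guess cp0 (e.g., tape measurements):
# cp_hat = least_squares(reprojection_residuals, cp0, method="lm",
#                        args=(radar_obs, lidar_pts)).x
```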

B. RCS Error Optimization

Once the reprojection error optimization is completed, all six calibration parameters are estimated. However, $t_z$, $\theta_y$, and $\theta_x$ have higher uncertainty than the other parameters, since the elevation angle is neglected in the projection process. Consequently, these three parameters are refined using RCS measurements.

The distribution of the RCS measurements $\delta_r$ across 3D space has the following characteristics:

1) Radar measurements that are closer to the radar plane (i.e., elevation angle equal to 0 degrees) usually have higher RCS values.

2) As the distance between the target and the radar plane grows, the RCS value decreases, usually reaching its minimum at the boundary of the radar's vertical FoV.

Based on these characteristics of the RCS measurements, two key assumptions are made in this work:

1) Correspondences with higher $\delta_r$ should be closer to the radar plane $P_r$ (of elevation angle $\psi = 0$).

2) Correspondences with lower $\delta_r$ should be closer to the boundary of the radar's vertical FoV, i.e., the upper boundary plane $P_U$ and the lower boundary plane $P_D$, which are determined by the specification of the radar hardware.

Following these assumptions, the correspondences with the higher and lower $\delta_r$ values are chosen to refine the 3-DoF parameter set $c_{rcs} = [t_z\; \theta_y\; \theta_x]$. Let $\delta_V$ and $\delta_W$ be the subsets of all correspondences with the highest $V\%$ and lowest $W\%$ RCS measurement values, respectively. The RCS error term is defined as follows:

$$\varepsilon_{ik} = \begin{cases} d_1, & \text{if } \delta_{r_k}(t_i) \in \delta_V \\ d_2, & \text{if } \delta_{r_k}(t_i) \in \delta_W \\ 0, & \text{otherwise} \end{cases} \qquad (6)$$

where

$$d_1 = \mathrm{dist}({}^r p_{l_k}(t_i), P_r), \qquad d_2 = \min\{\mathrm{dist}({}^r p_{l_k}(t_i), P_U),\; \mathrm{dist}({}^r p_{l_k}(t_i), P_D)\}.$$

The cost function is defined as the sum of squared RCS errors over the $N \times K$ correspondences, and the 3-DoF parameter set is obtained by minimizing it with the Ceres Solver [22]:

$$\hat{c}_{rcs} = \arg\min_{c_{rcs}} \sum_{i=0}^{N-1}\sum_{k=0}^{K-1} \varepsilon_{ik}^2 \qquad (7)$$
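A minimal sketch of the RCS error term of Eqs. (6)-(7) follows. The percentile-based selection of $\delta_V$/$\delta_W$, the approximation of $P_U$/$P_D$ as planes through the radar origin containing the lateral axis, and the ±4.5° half-FoV (from the Delphi ESR specification [23]) are all assumptions for illustration:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def rcs_residuals(c_rcs, fixed, lidar_pts, rcs,
                  fov=np.deg2rad(4.5), v_pct=10.0, w_pct=10.0):
    """Residuals of Eq. (6). c_rcs = [tz, pitch, roll] is refined while
    fixed = [tx, ty, yaw] keeps the stage-one estimates."""
    tz, pitch, roll = c_rcs
    tx, ty, yaw = fixed
    R = Rotation.from_euler("ZYX", [yaw, pitch, roll]).as_matrix()
    p = lidar_pts @ R.T + np.array([tx, ty, tz])      # targets in radar frame
    d_center = np.abs(p[:, 2])                        # distance to P_r (z = 0)
    n_u = np.array([-np.sin(fov), 0.0, np.cos(fov)])  # assumed upper plane normal
    n_d = np.array([np.sin(fov), 0.0, np.cos(fov)])   # assumed lower plane normal
    d_bound = np.minimum(np.abs(p @ n_u), np.abs(p @ n_d))
    hi = rcs >= np.percentile(rcs, 100.0 - v_pct)     # delta_V: should lie on P_r
    lo = rcs <= np.percentile(rcs, w_pct)             # delta_W: near P_U or P_D
    res = np.zeros(len(rcs))
    res[hi] = d_center[hi]
    res[lo] = d_bound[lo]
    return res
```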

IV. TEMPORAL CALIBRATION

In this work, the time delay between radar and LiDAR is assumed to be a constant $\Delta t$, so temporal calibration can be formulated as the estimation of this time delay. We estimate the delay using the spatial-temporal relationships of target measurements. As the uncertainty of the radar measurements affects the time delay estimate, we also show in this section that sufficient motion of the sensor system is required.

A. Time Delay Estimation

Since the time delay is a one-dimensional parameter, features in a single dimension suffice for the optimization. Thus, aligning the azimuth angles of the target measurements from the radar and the LiDAR is sufficient for estimating the time delay. Specifically, the azimuth error term is defined as follows:

$$\varepsilon_k(t_i) = |{}^r\phi_{r_k}(t_i) - {}^r\phi_{l_k}(t_i - \delta t)|, \qquad (8)$$

where $^r\phi_{r_k}(t_i)$ is the azimuth angle of the target from the radar measurements, $^r\phi_{l_k}(t_i - \delta t)$ is the azimuth angle of the target from the LiDAR measurements transformed into the radar frame by Eq. (3), and $\delta t$ is the estimate of $\Delta t$. Since $^r\phi_{l_k}(t)$ is discrete, the intermediate poses are computed using spherical linear interpolation (SLERP).

The time delay is estimated by minimizing the cost function using the Levenberg-Marquardt (LM) algorithm implemented in the Ceres Solver [22]:

$$\hat{\delta t} = \arg\min_{\delta t} \sum_{i=0}^{N-1}\sum_{k=0}^{K-1} \varepsilon_k^2(t_i), \qquad (9)$$

where the cost function is defined as the sum of squared azimuth errors over the $N \times K$ correspondences.
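The delay estimate reduces to a one-dimensional search, sketched below. This is a simplified illustration: SciPy stands in for the LM/Ceres setup of the paper, plain linear interpolation of the scalar azimuth replaces SLERP, and the search window is hypothetical:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def azimuth_cost(dt, t_radar, phi_radar, t_lidar, phi_lidar):
    """Eqs. (8)-(9): sum of squared differences between radar azimuths and
    time-shifted, interpolated LiDAR azimuths (in the radar frame)."""
    phi_l = np.interp(t_radar - dt, t_lidar, phi_lidar)
    return np.sum((phi_radar - phi_l) ** 2)

# Bounded 1-D minimization over an assumed +-0.5 s window:
# dt_hat = minimize_scalar(azimuth_cost, bounds=(-0.5, 0.5), method="bounded",
#                          args=(t_radar, phi_radar, t_lidar, phi_lidar)).x
```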

Fig. 5. Comparison of LiDAR measurements before and after LiDAR measurement rectification. (a) The LiDAR scans the same wall twice; there are mismatched wall segments. (b) After rectification, the mismatch problem is resolved.

B. Motion Generation and LiDAR Measurement Rectification

To assess the sensor motion required for temporal calibration, we analyze how the motion influences the time delay estimation error. The azimuth error term is rewritten as follows:

$$\begin{aligned} \varepsilon_k(t_i) &= |{}^r\phi_{r_k}(t_i) - {}^r\phi_{l_k}(t_i - \delta t)| &(10)\\ &= |{}^r\bar\phi_{r_k}(t_i) + {}^r\varepsilon_{r_k} - {}^r\bar\phi_{l_k}(t_i - \delta t)| &(11)\\ &= |{}^r\bar\phi_{r_k}(t_i) + {}^r\varepsilon_{r_k} - {}^r\bar\phi_{l_k}(t_i - \Delta t) - {}^r\omega_{l_k}(t_i - \Delta t)\cdot(\Delta t - \delta t)| &(12)\\ &= |{}^r\omega_{l_k}(t_i - \Delta t)\cdot(\delta t - \Delta t) + {}^r\varepsilon_{r_k}|, &(13) \end{aligned}$$

where $^r\bar\phi_{r_k}(t_i)$ denotes the radar azimuth measurement without uncertainty, $^r\varepsilon_{r_k}$ is the uncertainty of the radar azimuth measurement, and the motion $^r\omega_{l_k}(t_i)$ (i.e., the angular velocity) equals the derivative of $^r\phi_{l_k}(t_i)$. Compared to the radar uncertainty, the LiDAR uncertainty is assumed small enough to be neglected; hence Eq. (11) follows, with $^r\bar\phi_{l_k}(t_i)$ defined as the LiDAR azimuth measurement without uncertainty, i.e., $^r\phi_{l_k}(t_i) = {}^r\bar\phi_{l_k}(t_i)$. Supposing $\delta t$ is close to $\Delta t$, the linear approximation of $^r\bar\phi_{l_k}(t_i - \delta t)$ around $t_i - \Delta t$ gives Eq. (12). By definition, $^r\bar\phi_{r_k}(t_i)$ equals $^r\bar\phi_{l_k}(t_i - \Delta t)$, so the two terms cancel, yielding Eq. (13). According to Eq. (13), the time delay estimation error $(\delta t - \Delta t)$ can be written as $\frac{\pm\varepsilon_k(t_i) - {}^r\varepsilon_{r_k}}{{}^r\omega_{l_k}(t_i - \Delta t)}$, which shows that $(\delta t - \Delta t)$ is inversely proportional to $^r\omega_{l_k}(t_i - \Delta t)$: as $^r\omega_{l_k}(t_i - \Delta t)$ becomes larger, the time delay estimation error is reduced.
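As an illustrative numerical example (assuming the residual $\varepsilon_k(t_i)$ is driven to nearly zero by the optimizer, and using the ±1° azimuth accuracy of the Delphi ESR reported in [23]):

$$|\delta t - \Delta t| \approx \frac{|{}^r\varepsilon_{r_k}|}{|{}^r\omega_{l_k}(t_i - \Delta t)|} \;\Rightarrow\; \frac{0.0175\ \text{rad}}{0.3\ \text{rad/s}} \approx 58\ \text{ms}, \qquad \frac{0.0175\ \text{rad}}{1.0\ \text{rad/s}} \approx 17.5\ \text{ms},$$

which is consistent with keeping the angular velocity above one radian per second during data collection, as reported in Section V.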

According to the analysis above, rapid motion reduces the time delay estimation error. Thus, we propose to generate motion by moving the sensor system, including the radar and the LiDAR, while the targets remain stationary. However, LiDAR rectification becomes necessary if the LiDAR is moved rapidly. A rectification approach is applied in which the ego-motion is estimated by pairwise scan matching and the transformation of every LiDAR point is computed by interpolation, under the assumption that the motion between two scans is constant and stable. Fig. 5 shows that the LiDAR measurement errors are significantly reduced and the accuracy of localization is also improved.
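A minimal sketch of such per-point rectification follows. The paper does not specify the reference time, frame conventions, or interpolation details, so this is an assumed illustration of constant-motion de-skewing between two poses obtained from pairwise scan matching:

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def rectify_scan(points, point_times, T0, T1, t0, t1):
    """De-skew one LiDAR scan assuming constant motion between the poses
    T0 (at time t0) and T1 (at time t1). Each point is re-expressed in the
    reference frame using its interpolated capture-time pose.
    points: (N, 3); point_times: (N,); T0, T1: 4x4 homogeneous poses."""
    alphas = np.clip((point_times - t0) / (t1 - t0), 0.0, 1.0)
    slerp = Slerp([0.0, 1.0],
                  Rotation.from_matrix(np.stack([T0[:3, :3], T1[:3, :3]])))
    out = np.empty_like(points)
    for i, (p, a) in enumerate(zip(points, alphas)):
        R = slerp([a]).as_matrix()[0]                 # interpolated rotation
        t = (1.0 - a) * T0[:3, 3] + a * T1[:3, 3]     # interpolated translation
        out[i] = R @ p + t
    return out
```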

Once temporal calibration is accomplished, the timestamps of the radar measurements in Section III are revised, and the errors caused by the time delay are reduced by iteratively performing extrinsic and temporal calibration.

V. EXPERIMENTS

In this section, the experimental setup and the results of extrinsic and temporal calibration are described in detail.

A. Experiment Setups

A sensor rack equipped with a Velodyne HDL-32E 3D LiDAR and three Delphi ESR V9.21.15 radars is placed on top of the vehicle as shown in Fig. 1. The LiDAR frequency is 10 Hz and the radar frequency is 20 Hz. The experiments were conducted in open space as shown in Fig. 3. As a Velodyne LiDAR is used, it is possible to configure the cut angle that splits the continuous data stream into point clouds; the cut angle is set to 360 degrees since LiDAR measurements from all directions are preferred for LiDAR localization. The calibration target is a corner reflector with a triangular styrofoam board fixed in front, with the centers of the reflector and the styrofoam board aligned as shown in Fig. 4. A plastic tube and a plastic base are used as the target stand, since plastic materials reflect signals with much weaker radar echo intensity than the corner reflector; RCS thresholding is therefore used to avoid mistaking the target stand for the actual target.

In the data collection process, the software runs on the Robot Operating System (ROS), and the timestamps of all sensor measurements are generated by the computer's system clock. Additional delays from communication, the operating system, and ROS could occur; we assume these delays sum to a constant. Using timestamps from the sensor clocks to further improve temporal calibration performance is left as future work.

1) Extrinsic Calibration Data Collection Procedures: To ensure the accuracy of extrinsic calibration, the targets are placed at 7 different heights and initially adjusted to point towards the radar. To reduce the data collection effort, the data were collected continuously by driving the vehicle away from the targets, up to 10 m, in different moving directions as shown in Fig. 3(a). As the radar's azimuth accuracy would induce larger reprojection errors at long range and cause more uncertainty in the x- and y-translations, the calibration targets were not placed more than 10 m away. While the calibration performance depends on the quality of the RCS measurements, the extrinsic calibration results following the proposed data collection procedure show its effectiveness in Section V-B, as the collected data still follow the RCS distribution rules. In addition, the vehicle was driven slowly to reduce the influence of the time delay between the radars and the LiDAR; the error caused by the time delay can be further compensated via the proposed temporal calibration procedure.

Fig. 6. The box plot shows the time delay estimation errors for different angular velocities in simulation. The error median decreases and the distribution becomes more concentrated with higher angular velocities.

Fig. 7. The box plot shows the time delay estimation errors for different time periods in simulation. The error median decreases and the distribution becomes more concentrated with a longer measurement period.

2) Temporal Calibration Data Collection Procedures: Two calibration targets were placed 9.44 m and 20.12 m away from the sensor system as shown in Fig. 3(b). As the radar and LiDAR were mounted firmly on the sensor rack, sufficient motion can easily be generated by moving the sensor rack directly.

To assess the motion required for temporal calibration, two simulations were performed. The data were generated based on the specifications of the radar and LiDAR and on our experimental setup, and both simulations were run 1000 times. In the first simulation, the target trajectories were generated with different angular velocities of the sensor rack over a fixed time period of 30 seconds. In the second simulation, the target trajectories were generated over different time periods with a fixed angular velocity of 0.3 rad/s. Fig. 6 and Fig. 7 show that a higher angular velocity and a longer time period reduce the time delay estimation error, confirming our inference in Section IV-B. According to the simulation results, we kept the angular velocity over one radian per second, and the data collection time was around 30 seconds. The motion of the sensor rack was generated by two people holding each side of the rack and rotating it as stably and rapidly as possible. For obtaining motion ground truth and improving motion generation, a mechanical device such as a vehicle turntable would be a more desirable choice.

Fig. 8. RCS distribution across transformed LiDAR data in the radar's FoV. The LiDAR data are colored based on the RCS value of the paired radar measurement. Solid line: radar plane; dashed lines: boundary of the radar's vertical FoV.

B. Results

1) Extrinsic Calibration Results: In order to verify the performance of the estimated extrinsic calibration parameters, three analyses were conducted.

First, the two-stage optimization is performed; the calibration results for the reprojection error optimization $\hat{c}_p$, the RCS error optimization $\hat{c}_{rcs}$, and the carefully tape-measured translation $^r t_l$ are listed below:

• $\hat{c}_p = [0.16\ \text{m},\ 0.11\ \text{m},\ 0.07\ \text{m},\ 35.1°,\ -0.4°,\ -5.0°]$
• $\hat{c}_{rcs} = [0.26\ \text{m},\ 2.6°,\ -0.6°]$
• $^r t_l = [0.15\ \text{m},\ 0.1\ \text{m},\ 0.27\ \text{m}]$

We can observe slight differences between the estimated translation parameters and the tape measurements. The differences could originate from imprecision in the tape measurements and from the accuracy limits of the radar measurements (±0.25 m range accuracy; ±1° azimuth accuracy [23]). After the reprojection error optimization, the average reprojection error, i.e., the average Euclidean distance between the radar data and the projected LiDAR data, is about 0.074 m, which is below the range accuracy of the Delphi radar.

Second, to examine the RCS distribution in the radar's vertical FoV, the LiDAR measurements are transformed into the radar frame by applying the estimated extrinsic parameters. The spatial distribution of the LiDAR data in the lateral view is plotted in Fig. 8, where the LiDAR measurements are colored according to the RCS values of the paired radar measurements. We can observe that the RCS value decreases as the distance between the target measurement and the radar plane grows, and reaches its minimum at the boundary planes. The boundary planes are the planes at +4.5° and −4.5° elevation, determined by the specification of the Delphi radar [23]. Fig. 9 shows the relation between the RCS value and the elevation angle of the transformed LiDAR data. After the reprojection error optimization, plenty of target measurements are still located outside the radar's vertical FoV; the distribution is significantly improved after the RCS error optimization.

Fig. 9. RCS distribution with respect to the pitch angle. Correspondences with higher RCS values (above the red line) and with lower RCS values (below the purple line) are used for optimization. Green points: after reprojection error optimization; blue points: after RCS error optimization; black dashed lines: the boundary of the vertical FoV.

TABLE I
CALIBRATION RESULTS OF THE TWO-STAGE OPTIMIZATION

Parameter | Reprojection Error Optimization | RCS Error Optimization
$t_x$ [m] | $N(0.158,\ 2.417 \times 10^{-4})$ | —
$t_y$ [m] | $N(0.110,\ 4.917 \times 10^{-4})$ | —
$t_z$ [m] | $N(0.116,\ 1.141 \times 10^{-1})$ | $N(0.262,\ 4.198 \times 10^{-5})$
$\theta_z$ [°] | $N(35.699,\ 4.208 \times 10^{-2})$ | —
$\theta_y$ [°] | $N(-0.744,\ 7.590 \times 10^{0})$ | $N(2.577,\ 2.424 \times 10^{-3})$
$\theta_x$ [°] | $N(-4.282,\ 1.319 \times 10^{0})$ | $N(-0.580,\ 1.265 \times 10^{-4})$

Finally, we performed Monte Carlo experiments with 1000 runs, randomly sub-sampling the correspondences to half of the original number. Table I shows that $t_z$, $\theta_y$, and $\theta_x$ have larger variances than the other parameters and need to be refined in the RCS error optimization. Fig. 10 presents the distributions of $t_z$, $\theta_y$, and $\theta_x$ after the two optimizations: after the RCS error optimization, the variances of the three parameters are significantly reduced and their mean values change as well, with the mean of the $t_z$ estimate becoming closer to the tape-measured value.

For comparison, the method suggested in [13], which represents the RCS-elevation relation as a single parabola, has been implemented and is labeled Method RCS. The results of our proposed method and Method RCS are shown in Table II. While there may be no significant differences between the two methods, we argue that the RCS distribution may not be well represented by a parabola, based on the observations of the collected radar measurements.

Fig. 10. Monte Carlo analysis: the distributions of the estimated extrinsic parameters. Blue: after reprojection error optimization; red: after RCS error optimization.

TABLE II
MONTE CARLO ANALYSIS RESULTS FOR RCS ERROR OPTIMIZATION

Parameter | Proposed method | RCS [13]
$t_z$ [m] | $N(0.262,\ 4.198 \times 10^{-5})$ | $N(0.237,\ 7.711 \times 10^{-7})$
$\theta_y$ [°] | $N(2.577,\ 2.424 \times 10^{-3})$ | $N(2.703,\ 2.176 \times 10^{-5})$
$\theta_x$ [°] | $N(-0.580,\ 1.265 \times 10^{-4})$ | $N(-0.833,\ 1.345 \times 10^{-4})$

Fig. 11. Results of the error calculation for the three radars. Red line: the left radar; blue line: the middle radar; green line: the right radar.

2) Temporal Calibration Results: Fig. 11 shows that the time delays of the three radars are all around 0.16 seconds. After temporal calibration, the mean-square error of the azimuth angle decreases by 60% to 90%. Around 2000 radar measurements were used in total, validating the reliability of the approach.

After the time delay is estimated, the timestamps of the radar measurements can be revised and examined as shown in Fig. 13, where the initial and corrected alignments of the estimated target azimuth angles from the radars and the LiDAR demonstrate the temporal calibration performance. In addition, we compare radar and LiDAR measurements in the same frame to verify the effectiveness of temporal calibration: the average distance between the radar data and the LiDAR data decreases from 1.78 m to 0.94 m for the near target, and from 3.8 m to 2.01 m for the farther target. Fig. 12 illustrates that the radar measurements are close to the LiDAR measurements after temporal calibration.

Fig. 12. Comparison before and after temporal calibration. Red points: before temporal calibration; green points: after temporal calibration. (a) Top view. (b) Side view.

VI. CONCLUSION AND FUTURE WORK

Approaches to estimate the extrinsic parameters and the time delay between planar radars and a 3D LiDAR have been presented. Based on the characteristics of the planar radar's RCS distribution in 3D space, the extrinsic parameters are estimated and then used for temporal calibration. A target-based approach to estimate the time delay between a planar radar and a 3D LiDAR has been demonstrated and verified. To the best of our knowledge, our approach is the first to accomplish both extrinsic and temporal calibration of automotive radars and LiDARs.

The experimental results show a small time delay drift across the three radars; we plan to perform more experiments to analyze the factors that influence time delay estimation. Sufficient motion during data collection is required for temporal calibration, but such motion could make extrinsic calibration less reliable; in this work, extrinsic and temporal calibration are therefore not performed simultaneously. Further improvements on the optimization and the integration of both calibration tasks are left as future work.

ACKNOWLEDGEMENT

This work was supported in part by the Industrial Technology Research Institute (ITRI) and the Ministry of Economic Affairs (MOEA) in Taiwan, and by grants from the Ministry of Science and Technology (MOST) in Taiwan (#109-2221-E-009-119-MY3, #109-2221-E-009-120-MY3).


Fig. 13. Comparison of target azimuth angle alignments. The black curve is estimated by the LiDAR; the red, green, and blue points are estimated by the three on-board automotive radars, respectively. The top row (a, b) is from the near target and the bottom row (c, d) from the farther target. The left column shows the mismatched alignments before temporal calibration; the right column shows the azimuth angle measurements from each sensor matching after temporal calibration.

REFERENCES

[1] C. Le Gentil, T. Vidal-Calleja, and S. Huang, "3D Lidar-IMU Calibration based on Upsampled Preintegrated Measurements for Motion Distortion Correction," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 2018.

[2] J. Castorena, U. S. Kamilov, and P. T. Boufounos, "Auto calibration of LiDAR and optical cameras via edge alignment," in Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2016, pp. 2862–2866.

[3] J. Levinson and S. Thrun, "Automatic Online Calibration of Cameras and Lasers," in Proceedings of Robotics: Science and Systems, 2013.

[4] J. Jeong, L. Y. Cho, and A. Kim, "Road is Enough! Extrinsic Calibration of Non-overlapping Stereo Camera and LiDAR using Road Information," arXiv preprint arXiv:1902.10586, 2019.

[5] Y. Park, S. Yun, C. S. Won, K. Cho, K. Um, and S. Sim, "Calibration between Color Camera and 3D LIDAR Instruments with a Polygonal Planar Board," Sensors, vol. 14, no. 3, pp. 5333–5353, 2014.

[6] A. Dhall, K. Chelani, V. Radhakrishnan, and K. M. Krishna, "LiDAR-Camera Calibration using 3D-3D Point Correspondences," arXiv preprint arXiv:1705.09785, 2017.

[7] C. Guindel, J. Beltrán, D. Martín, and F. García, "Automatic Extrinsic Calibration for Lidar-Stereo Vehicle Sensor Setups," in Proceedings of the IEEE International Conference on Intelligent Transportation Systems (ITSC), 2017.

[8] T. Wang, N. Zheng, J. Xin, and Z. Ma, "Integrating millimeter wave radar with a monocular vision sensor for on-road obstacle detection applications," Sensors, vol. 11, no. 9, pp. 8992–9008, 2011.

[9] J. Oh, K.-S. Kim, M. Park, and S. Kim, "A Comparative Study on Camera-Radar Calibration Methods," in Proceedings of the International Conference on Control, Automation, Robotics and Vision (ICARCV), 2018.

[10] C. Schöller, M. Schnettler, A. Krammer, G. Hinz, M. Bakovic, M. Guzet, and A. Knoll, "Targetless Rotational Auto-Calibration of Radar and Camera for Intelligent Transportation Systems," arXiv preprint arXiv:1904.08743, 2019.

[11] S. Sugimoto, H. Tateda, H. Takahashi, and M. Okutomi, "Obstacle Detection Using Millimeter-wave Radar and Its Visualization on Image Sequence," in Proceedings of the 17th International Conference on Pattern Recognition (ICPR), 2004, pp. 342–345.

[12] J. Peršić, I. Marković, and I. Petrović, "Extrinsic 6DoF Calibration of 3D LiDAR and Radar," in Proceedings of the European Conference on Mobile Robots (ECMR), 2017, pp. 165–170.

[13] J. Peršić, I. Marković, and I. Petrović, "Extrinsic 6DoF calibration of a radar-lidar-camera system enhanced by radar cross section estimates evaluation," Robotics and Autonomous Systems, vol. 114, pp. 217–230, 2019.

[14] F. M. Mirzaei and S. I. Roumeliotis, "A Kalman Filter-Based Algorithm for IMU-Camera Calibration: Observability Analysis and Performance Evaluation," IEEE Transactions on Robotics, vol. 24, no. 5, pp. 1143–1156, 2008.

[15] M. Li and A. I. Mourikis, "3-D motion estimation and online temporal calibration for camera-IMU systems," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 2013, pp. 5709–5716.

[16] M. Fleps, E. Mair, O. Ruepp, M. Suppa, and D. Burschka, "Optimization based IMU camera calibration," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2011, pp. 3297–3304.

[17] J. Kelly and G. S. Sukhatme, "A General Framework for Temporal Calibration of Multiple Proprioceptive and Exteroceptive Sensors," in Experimental Robotics, 2014, pp. 195–209.

[18] J. Kelly, N. Roy, and G. S. Sukhatme, "Determining the Time Delay Between Inertial and Visual Sensor Measurements," IEEE Transactions on Robotics, vol. 30, no. 6, pp. 1514–1523, 2014.

[19] A. Segal, D. Haehnel, and S. Thrun, "Generalized-ICP," in Robotics: Science and Systems, 2009, p. 435.

[20] P. Biber and W. Straßer, "The normal distributions transform: A new approach to laser scan matching," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), vol. 3, 2003, pp. 2743–2748.

[21] E. Meijering, "A chronology of interpolation: from ancient astronomy to modern signal and image processing," Proceedings of the IEEE, vol. 90, no. 3, pp. 319–342, 2002.

[22] S. Agarwal, K. Mierle, and others, "Ceres Solver," http://ceres-solver.org, 2016.

[23] L. Stanislas and T. Peynot, "Characterisation of the Delphi Electronically Scanning Radar for Robotics Applications," in Proceedings of the Australasian Conference on Robotics and Automation, 2015.
