arXiv:2003.04699v1 [cs.RO] 10 Mar 2020


Look Around You: Sequence-based Radar Place Recognition with Learned Rotational Invariance

Matthew Gadd, Daniele De Martini, and Paul Newman
Oxford Robotics Institute, Dept. Engineering Science, University of Oxford, UK.

{mattgadd,daniele,pnewman}@robots.ox.ac.uk

©2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.

Abstract—This paper details an application which yields significant improvements to the adeptness of place recognition with Frequency-Modulated Continuous-Wave radar – a commercially promising sensor poised for exploitation in mobile autonomy. We show how a rotationally-invariant metric embedding for radar scans can be integrated into sequence-based trajectory matching systems typically applied to videos taken by visual sensors. Due to the complete horizontal field of view inherent to the radar scan formation process, we show how this off-the-shelf sequence-based trajectory matching system can be manipulated to detect place matches when the vehicle is travelling down a previously visited stretch of road in the opposite direction. We demonstrate the efficacy of the approach on 26 km of challenging urban driving taken from the largest radar-focused urban autonomy dataset released to date – showing a boost of 30% in recall at high levels of precision over a nearest neighbour approach.

Index Terms—radar, localisation, place recognition, deep learning, metric learning

I. INTRODUCTION

In order for autonomous vehicles to travel safely at higher speeds or operate in wide-open spaces where there is a dearth of distinct features, a new level of robust sensing is required. Frequency-Modulated Continuous-Wave (FMCW) radar satisfies these requirements, thriving in all environmental conditions (rain, snow, dust, fog, or direct sunlight), providing a 360° view of the scene, and detecting targets at ranges of up to hundreds of metres with centimetre-scale precision. Indeed, there is a burgeoning interest in exploiting FMCW radar to enable robust mobile autonomy, including ego-motion estimation [1]–[6], localisation [7], [8], and scene understanding [9].

Figure 1 shows an overview of the pipeline proposed in this paper, which extends our recent work in extremely robust radar-only place recognition [7], in which a metric space for embedding polar radar scans was learned, facilitating topological localisation using nearest neighbour (NN) matching. We show that this learned metric space can be leveraged within a sequence-based topological localisation framework to bolster matching performance, mitigating both visual similarities caused by the planarity of the sensor and failures due to sudden obstructions in dynamic environments. Due to the complete horizontal field of view (FoV) of the radar scan formation process, we show how the off-the-shelf sequence-based trajectory matching system can be manipulated to allow us to detect place matches when the vehicle is travelling down a previously visited stretch of road in the opposite direction.
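The NN baseline being extended here reduces to a closest-embedding query over the mapped trajectory. A minimal sketch, assuming Euclidean distance on list-of-floats embeddings (the function name `nearest_neighbour_match` is illustrative, not from the paper):

```python
def nearest_neighbour_match(map_embeddings, query_embedding):
    """Baseline topological localisation: return the index of the map
    embedding closest (in Euclidean distance) to the query embedding."""
    def dist2(a, b):
        # Squared distance suffices for argmin; avoids the sqrt.
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(range(len(map_embeddings)),
               key=lambda i: dist2(map_embeddings[i], query_embedding))
```

The sequence-based search described later replaces this single global argmin with a search for a coherent run of matches.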

Figure 1: An overview of our pipeline. The offline stages include enforcing a metric space by training a Fully Convolutional Neural Network (FCNN) which takes polar radar scans as input, and encoding a trajectory of scans (the map) by forward passes through this network (c.f. Section III-B). The online stages involve inference to represent the place the robot currently finds itself within in terms of the learned knowledge and querying the space (c.f. Section III-A) which – in contrast to our prior work – involves a search for coherent sequences of matches rather than a globally closest frame in the embedding space.

This paper proceeds by reviewing related literature in Section II. Section III describes our approach for a more canny use of a metric space in which polar radar scans are embedded. Section IV describes details of implementation, evaluation, and our dataset. Section V discusses results from this evaluation. Sections VI and VII summarise the findings and suggest further avenues for investigation.

II. RELATED WORK

Recent work has shown the promise of FMCW radar for robust place recognition [7], [10] and metric localisation [6]. None of these methods account for temporal effects in the radar measurement stream.

arXiv:2003.04699v1 [cs.RO] 10 Mar 2020

SeqSLAM [11] and its variants have been extremely successful at tackling large-scale, robust place recognition with video imagery in the last decade. Progress along these lines has included automatic scaling for viewpoint invariance [12], probabilistic adjustments to the search technique [13], and dealing with challenging visual appearance change using Generative Adversarial Networks (GANs) [14].

The work presented in this paper is most closely influenced by the use of feature embeddings learned by training Convolutional Neural Networks (CNNs) [15], omnidirectional cameras [16], and Light Detection and Ranging (LiDAR) [17] within the SeqSLAM framework.

III. METHODOLOGY

Broadly, our method can be summarised as leveraging very recent results in Deep Learning (DL) techniques which provide good metric embeddings for the global location of radar scans within a robust sequence-based trajectory matching system. We begin the discussion with a brief overview of the baseline SeqSLAM algorithm, follow with a light description of the learned metric space, and conclude with an application which unifies these systems – the main contribution of this paper.

A. Overview of SeqSLAM

Our implementation of the proposed system is based on an open-source, publicly available port of the original algorithm1.

Incoming images are preprocessed by downsampling (to thumbnail resolution) and patch normalisation. A difference matrix is constructed storing the Euclidean distance between all image pairs. This difference matrix is then contrast enhanced. Examples of these matrices can be seen in Figure 4. For more detail, a good summary is available in [18].
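The preprocessing and difference-matrix construction described above can be sketched as follows. This is a minimal NumPy illustration; the function names, the patch size, and the vectorised distance computation are ours, not the reference implementation's:

```python
import numpy as np

def patch_normalise(img, patch=8):
    """Zero-mean, unit-variance normalisation of non-overlapping patches,
    in the style of SeqSLAM's image preprocessing step."""
    out = img.astype(float).copy()
    h, w = img.shape
    for r in range(0, h, patch):
        for c in range(0, w, patch):
            p = out[r:r + patch, c:c + patch]
            std = p.std()
            out[r:r + patch, c:c + patch] = (p - p.mean()) / std if std > 0 else 0.0
    return out

def difference_matrix(ref, live):
    """D[i, j] = Euclidean distance between reference frame i and live frame j."""
    ref = np.asarray(ref)    # shape (M, d): M flattened, normalised frames
    live = np.asarray(live)  # shape (N, d)
    # (a - b)^2 = a^2 + b^2 - 2ab, computed for all pairs at once
    sq = (ref ** 2).sum(1)[:, None] + (live ** 2).sum(1)[None, :] - 2.0 * ref @ live.T
    return np.sqrt(np.maximum(sq, 0.0))
```

The all-pairs expansion avoids an explicit double loop over frames, which matters once trajectories grow to thousands of scans.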

When looking for a match to a query image, SeqSLAM sweeps through the contrast-enhanced difference matrix to find the best matching sequence of adjacent frames.

In the experiments (c.f. Sections IV and V) we refer to this baseline search as SEQSLAM.
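The sweep itself can be illustrated as follows. This is a simplified sketch: the reference implementation searches a denser set of velocities and normalises scores, but the core idea is summing differences along candidate straight-line trajectories through the difference matrix; the function name and default parameters are illustrative:

```python
import numpy as np

def seqslam_score(D, j, ds=10, vels=(0.8, 1.0, 1.2)):
    """Score each candidate reference row i for a query ending at live
    column j by summing differences along straight-line trajectories of
    slope v through the contrast-enhanced difference matrix D
    (rows: reference trajectory, columns: live trajectory).
    Returns the best (lowest) trajectory sum for every reference row."""
    M, N = D.shape
    cols = np.arange(j - ds + 1, j + 1)            # the last ds live frames
    best = np.full(M, np.inf)
    for i in range(M):
        for v in vels:                              # candidate velocity ratios
            rows = np.round(i + v * (cols - j)).astype(int)
            if rows.min() < 0 or rows.max() >= M:
                continue
            best[i] = min(best[i], D[rows, cols].sum())
    return best
```

The matched reference frame is then simply the row with the minimum trajectory sum, e.g. `i_star = int(np.argmin(seqslam_score(D, j)))`.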

B. Overview of Kidnapped Radar

To learn filters and cluster centres which help distinguish polar radar images for place recognition we use NetVLAD [19] with VGG-16 [20] as a front-end feature extractor – both popularly applied to the place recognition problem. Importantly, we make alterations such that the network is invariant to the orientation of input radar scans, including: circular padding [21], anti-aliasing blurring [22], and azimuth-wise max-pooling.
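To see why pooling over the azimuth dimension yields rotational invariance, note that rotating the sensor cyclically shifts the rows of a polar scan, and a maximum taken over that axis is unaffected by such a shift. A toy NumPy illustration (not the network itself, which pools learned feature maps rather than raw scans):

```python
import numpy as np

def azimuth_invariant_descriptor(polar_scan):
    """Toy azimuth-wise max-pooling: a polar scan is an (azimuth x range)
    array, and a rotation of the sensor is a cyclic shift along the azimuth
    axis. Pooling each range bin over all azimuths removes that shift, so
    the descriptor is unchanged by rotation."""
    return polar_scan.max(axis=0)  # one value per range bin

scan = np.random.rand(400, 163)       # e.g. 400 azimuths, 163 range bins
rotated = np.roll(scan, 57, axis=0)   # same place, rotated sensor
assert np.allclose(azimuth_invariant_descriptor(scan),
                   azimuth_invariant_descriptor(rotated))
```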

To enforce the metric space, we perform online triplet mining and apply the triplet loss described in [23]. Loop closure labels are taken from a ground truth dataset (c.f. Section IV).
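The triplet loss of [23] penalises an anchor-positive distance that is not at least a margin smaller than the anchor-negative distance. A minimal sketch on embedding vectors (the margin value here is illustrative, not the paper's):

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.5):
    """Hinge-style triplet loss on embedding vectors: pull the positive
    (same place) closer to the anchor than the negative (different place)
    by at least `margin`. Zero once the constraint is satisfied."""
    d_ap = np.linalg.norm(anchor - positive)
    d_an = np.linalg.norm(anchor - negative)
    return max(0.0, d_ap - d_an + margin)
```

Online mining selects, within each training batch, the triplets that currently violate this constraint most, which keeps the gradient signal informative.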

The interested reader is referred to [7] for more detail. In the experiments (c.f. Sections IV and V) we refer to representations obtained in this manner as KRADAR.

C. Sequence-based Radar Place Recognition

We replace the image preprocessing step with inference on the network described in Section III-B, resulting in radar scan

1https://github.com/tmadl/pySeqSLAM

Figure 2: The off-the-shelf sequence matching SeqSLAM system is manipulated in this paper to facilitate backwards Loop Closure Detection (LCD). This is achieved by mirroring the set, vmin < v < vmax (blue), of trajectories considered – also considering −vmax < v < −vmin (red). Importantly, this is not a useful adjustment under a naïve application of SeqSLAM to radar images and is only beneficial if a rotationally invariant representation is used to construct the difference matrices.

descriptors of size 4096.

The difference matrix is obtained by calculating the Euclidean distance between every pair of embeddings taken from places along the reference and live trajectories in a window of length W. This distance matrix is then locally contrast enhanced in sections of length R.
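The local enhancement step can be sketched as follows (a simplified version of SeqSLAM's local normalisation of the difference matrix; the exact windowing of the reference implementation may differ):

```python
import numpy as np

def local_contrast_enhance(D, R):
    """Locally normalise the difference matrix D over neighbourhoods of R
    rows: each entry is standardised against the mean and standard deviation
    of its local section, so locally distinctive (low) differences stand out
    regardless of the absolute scale of distances in that region."""
    M = D.shape[0]
    E = np.zeros_like(D, dtype=float)
    for i in range(M):
        lo, hi = max(0, i - R // 2), min(M, i + R // 2 + 1)
        mu = D[lo:hi].mean(axis=0)
        sd = D[lo:hi].std(axis=0)
        E[i] = (D[i] - mu) / (sd + 1e-12)  # epsilon guards flat sections
    return E
```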

When searching for a match to a query image, we perform a sweep through this contrast-enhanced difference matrix to find the best matching sequence of frames based on the sum of sequence differences. In order to be able to detect matches in reverse, this procedure is repeated with a time-reversed live trajectory – this method would not be applicable to narrow-FoV cameras but is appropriate here as the radar has a 360° FoV.
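A minimal sketch of the mirrored search, under the same simplified straight-line trajectory model as above. The hypothetical `best_match` searches both the usual positive slopes and their negations, flagging matches found in reverse:

```python
import numpy as np

def best_match(D, j, ds=10, vels=(0.8, 1.0, 1.2)):
    """Sweep straight-line trajectories through the difference matrix D for
    the query column j, over both positive slopes (same direction of travel)
    and their mirror image, negative slopes (reverse traversal). Returns
    (score, reference row, matched_in_reverse)."""
    M, N = D.shape
    cols = np.arange(j - ds + 1, j + 1)
    best = (np.inf, -1, False)
    for i in range(M):
        for v in list(vels) + [-v for v in vels]:  # the "Look Around You" mirror
            rows = np.round(i + v * (cols - j)).astype(int)
            if rows.min() < 0 or rows.max() >= M:
                continue
            s = float(D[rows, cols].sum())
            if s < best[0]:
                best = (s, i, v < 0)
    return best
```

A backwards loop closure appears in the difference matrix as a low-difference band perpendicular to the main diagonal (c.f. Figure 3), which only the negative-slope sweep can follow.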

A simple visualisation of the process is shown in Figure 2. In this case, the forward search (blue lines) is mirrored to perform a backwards search (red lines), which results in the selection of the best match (solid black line). In the experiments (c.f. Sections IV and V) we refer to this modified search as LAY (“Look Around You”).

This procedure is performed for each template, on which a threshold is applied to select the best matches. Section V

Figure 3: Visualisation of a ground truth SE(2) matrix showing the Euclidean distance between the global positions that pairs of radar scans were captured at. Each trajectory pair is associated with such a matrix. Values in these matrices are scaled from distant (white) to nearby (black). In this region of the dataset, the vehicle revisits the same stretch of the route in the opposite direction – visible as the contours perpendicular to the main diagonal.

discusses the application of the threshold and reports the results in comparison to the original SeqSLAM approach; in particular, Figure 4 shows visual examples of the discussed methodology.

IV. EXPERIMENTAL SETUP

This section details our experimental design in obtaining the results to follow in Section V.

A. Vehicle and radar specifications

Data was collected using the Oxford RobotCar platform [24]. The vehicle, as described in the Oxford Radar RobotCar Dataset [25], is fitted with a CTS350-X Navtech FMCW scanning radar.

B. Ground truth database

The ground truth database is curated offline to capture the sets of nodes that are at a maximum distance (15 m) from a query frame, creating a graph-structured database that yields triplets of nodes for training the representation discussed in Section III-B.
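A sketch of how such a database could be assembled from ground truth positions. This is illustrative only: the actual database is graph-structured and curated offline, whereas this brute-force version simply thresholds pairwise distances; the function name and sampling strategy are ours:

```python
import numpy as np

def build_triplets(positions, radius=15.0):
    """For every frame, any frame whose global position lies within `radius`
    metres is a positive (same place); anything outside is a negative.
    Yields (anchor, positive, negative) index triplets for metric learning."""
    positions = np.asarray(positions)  # (N, 2) x-y positions in metres
    dists = np.linalg.norm(positions[:, None] - positions[None, :], axis=-1)
    triplets = []
    for a in range(len(positions)):
        pos = np.flatnonzero((dists[a] <= radius) & (np.arange(len(positions)) != a))
        neg = np.flatnonzero(dists[a] > radius)
        for p in pos:
            for n in neg:
                triplets.append((a, int(p), int(n)))
    return triplets
```

In practice one would sample from the positive and negative sets rather than enumerate every combination, and mine hard triplets online as described in Section III-B.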

To this end, we adjust the accompanying ground truth odometry described in [25] in order to build a database of ground truth locations. We manually selected a moment during which the vehicle was stationary at a common point and trimmed each ground trace accordingly. We also aligned the ground traces by introducing a small rotational offset to account for differing attitudes.

C. Trajectory reservation

Each approximately 9 km trajectory in the Oxford city centre was divided into three distinct portions: train, valid, and test.

The network is trained with ground truth topological localisations between two reserved trajectories in the train split.

The test split, upon which the results presented in Section V are based, was specifically selected to feature vehicle traversals over portions of the route in the opposite direction; data from this split are not seen by the network during training.

The results focus on a teach-and-repeat (TR) scenario, in which all remaining trajectories in the dataset are localised against a map built from the first trajectory that we did not use for learning, totalling 27 trajectory pairs (and 26 km of driving) with the same map but a different localisation run.

D. Measuring performance

In the ground truth SE(2) database, all locations within a 15 m radius of a ground truth location are considered true positives whereas those outside are considered true negatives, a more strictly imposed boundary than in [7].

Evaluation of Precision-Recall (PR) is different for the sequence- and NN-based approaches. For the NN-based approach of [7] we perform a ball search of the discretised metric space out to a varying embedding distance threshold. For the sequence-based approach advocated in this paper, we vary the minimum match score.

As useful summaries of PR performance, we analyse area-under-curve (AUC) as well as several F-scores, including F1, F2, and Fβ with β = 0.5 [26].
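For reference, the general F-measure used here weights recall β times as much as precision, so F2 favours recall and F0.5 favours precision:

```python
def f_beta(precision, recall, beta):
    """F-measure with recall weighted beta times as much as precision:
    F_beta = (1 + beta^2) * P * R / (beta^2 * P + R)."""
    if precision == 0 and recall == 0:
        return 0.0
    return (1 + beta ** 2) * precision * recall / (beta ** 2 * precision + recall)
```

F1 is the balanced special case (beta = 1), the harmonic mean of precision and recall.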

E. Hyperparameter tuning

To produce a fair comparison of the different configurations we can utilise to solve the topological localisation problem, we performed hyperparameter tuning on the various algorithms; we selected two random trials and excluded them from the final evaluation. The window width for the trajectory evaluation W and the enhancement window R were chosen through a grid search procedure. The final values are those which produced precision-recall curves with the highest value of precision at 80% recall.

V. RESULTS

This section presents instrumentation of the metrics discussed in Section IV-D.

The hyperparameter optimisation (c.f. Section IV-E) results in the parameterisation of the systems for comparison as enumerated in Table I.

Figure 6 shows a family of PR curves for the various ways in which SeqSLAM can be performed with radar data. Here, only a single trajectory pair is considered (one as the map trajectory, the other as the live trajectory). From Figure 6 it is evident that:

1) Performance when using the learned metric embeddings is superior to either polar or cartesian radar scans,

2) Sequence-based matching of trajectories outperforms NN-based searches,


Figure 4: Difference matrices upon which the SeqSLAM variants perform trajectory searches (top row) and relative match-score matrices (bottom row). These are constructed by matching the representation of radar scans in two trajectories (rows-versus-columns for each matrix) – VGG-16/NETVLAD on the left side (a, b, e and f) and KRADAR on the right side (c, d, g and h). (a) and (c) are the difference matrices before enhancement – directly used by the NN search in [7] – and (b) and (d) are the respective enhanced forms – on which SeqSLAM performs its searches. (e) and (f) are constructed using embeddings inferred by VGG-16/NETVLAD in the enhanced form (b), while (g) and (h) use embeddings inferred by KRADAR in the enhanced form (d). (e) and (g) are computed using the forward-style match score method employed by standard SeqSLAM; in contrast, (f) and (h) employ the proposed backward-style match score method. No match-score matrix is defined for the first window of columns as SeqSLAM must fill a buffer of frames before any matching is possible.

Representation     Search    R   W
VGG-16/NETVLAD     SEQSLAM   34  50
KRADAR             SEQSLAM   37  60
VGG-16/NETVLAD     LAY       31  60
KRADAR             LAY       37  60

Table I: Hyperparameter summary as results of the hyperparameter grid-search optimisation.

3) Performance when using the baseline architecture is outstripped by the rotationally-invariant modifications, and

4) Performance when using the modified search algorithm is boosted.

Observation 1 can be attributed to the fact that the learned representation is designed to encode only knowledge concerning place, whereas the imagery is subject to sensor artefacts. Observation 2 can be attributed to the mitigation of perceptual aliasing along straight, canyon-like sections of an urban trajectory. Observation 3 can be attributed to the rotationally-invariant architecture itself. Observation 4 can be attributed to the ability of the adjusted search to detect loop closures in reverse.

Table II provides further evidence for these findings by aggregating PR-related statistics over the entirety of the dataset discussed in Section IV-C – the map trajectory is kept constant and the live trajectory varies over forays spanning a month of urban driving.

While it is clear that we outperform the NN techniques presented in [7], the F-scores in Table II present a mixed result when comparing the standard SeqSLAM search and the modified search discussed in Section III-C. However, consider Figure 5. Here, the structure of the backwards loop closures is discovered more readily by the backwards search.

It is important to remember when inspecting the results shown in Table II and Figure 6 that the data in this part of the route (c.f. Section IV-C) is unseen by the network during training, and particularly challenging. This is a necessary analysis of the generalisation of learned place recognition methods but is not a requirement when deploying the learned


Figure 5: Binarised match score matrices for the (a) baseline and (b) mirrored SeqSLAM variants. The threshold applied for binarisation is higher for (a) (KRADAR, SEQSLAM) than for (b) (KRADAR, LAY). This is in order to qualitatively show that even when increasing numbers of potential matches are allowed in SEQSLAM (high recall), the true backwards loop closures are not featured. For LAY (right), they are. From these it is evident that the tailored changes to the fundamental SeqSLAM search strategy are better suited to discovering loop closures as the vehicle revisits the same route section with opposing orientation – a common scenario in structured, urban driving.

[Figure 6 plot: Precision versus Recall, both on [0, 1]; curves for vgg-16/netvlad,NN; kRadar,NN; kRadar,LAY; vgg-16/netvlad,SeqSLAM; vgg-16/netvlad,LAY; kRadar,SeqSLAM]

Figure 6: PR curves showing the benefit of, firstly, using learned metric embeddings as opposed to radar scans directly and, secondly, the tailored changes to the baseline SeqSLAM search algorithm when performing sequence-based radar place recognition.

knowledge in TR modes of autonomy. The takeaway message is that we have improved the recall at good precision levels by about 30% by applying sequence-based place recognition techniques to our learned metric space.

VI. CONCLUSION

We have presented an application of recent advances in learning representations for imagery obtained by radar scan formation to sequence-based place recognition. The proposed system is based on a manipulation of off-the-shelf SeqSLAM with prudent adjustments made taking into account the complete sweep made by scanning radar sensors. We have further proven the utility of our rotationally invariant architecture – a crucial enabling factor of our SeqSLAM variant. Crucially, we achieve a boost of 30% in recall at high levels of precision over our previously published nearest neighbour approach.

VII. FUTURE WORK

In the future we plan to retrain and test the system on the all-weather platform described in [27], a significant factor in the development of which was to explore applications of FMCW radar to mobile autonomy in challenging, unstructured environments. We also plan to integrate the system presented in this paper with our mapping and localisation pipeline, which is built atop the scan-matching algorithm of [28], [29].

ACKNOWLEDGMENT

This project is supported by the Assuring Autonomy International Programme, a partnership between Lloyd's Register Foundation and the University of York, as well as UK EPSRC programme grant EP/M019918/1. We would also like to thank our partners at Navtech Radar.

REFERENCES

[1] S. H. Cen and P. Newman, “Precise ego-motion estimation with millimeter-wave radar under diverse and challenging conditions,” in 2018 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2018, pp. 1–8.

[2] S. H. Cen and P. Newman, “Radar-only ego-motion estimation in difficult settings via graph matching,” in 2019 International Conference

Representation     Search    AUC   max F1  max F2  max F0.5  R(P=60%)  R(P=80%)
VGG-16/NETVLAD     NN        0.26  0.34    0.37    0.29      0.03      0.00
KRADAR             NN        0.37  0.41    0.40    0.41      0.16      0.06
VGG-16/NETVLAD     SEQSLAM   0.47  0.49    0.37    0.60      0.40      0.31
KRADAR             SEQSLAM   0.52  0.53    0.41    0.64      0.45      0.37
VGG-16/NETVLAD     LAY       0.46  0.48    0.36    0.60      0.39      0.32
KRADAR             LAY       0.53  0.53    0.42    0.62      0.46      0.36

Table II: Summary statistics for various radar-only SeqSLAM techniques (including representation of the radar frame and style of search) as aggregated over a month of urban driving. All quantities are expressed as a mean value. As discussed in Section IV-D, the requirement imposed on matches (as true/false positives/negatives) is more strict than that presented in [7], with consequentially worse performance than previously published for the VGG-16/NETVLAD, KRADAR, and NN systems. The key message of this paper is that sequence-based exploitation of these learned metric embeddings (middle and bottom rows) is beneficial in comparison to NN matching in a discretised search space (top two rows).

on Robotics and Automation (ICRA). IEEE, 2019, pp. 298–304.

[3] R. Aldera, D. De Martini, M. Gadd, and P. Newman, “Fast Radar Motion Estimation with a Learnt Focus of Attention using Weak Supervision,” in IEEE International Conference on Robotics and Automation (ICRA), May 2019, pp. 1190–1196.

[4] R. Aldera, D. De Martini, M. Gadd, and P. Newman, “What Could Go Wrong? Introspective Radar Odometry in Challenging Environments,” in IEEE Intelligent Transportation Systems Conference (ITSC), Auckland, New Zealand, October 2019.

[5] D. Barnes, R. Weston, and I. Posner, “Masking by Moving: Learning Distraction-Free Radar Odometry from Pose Information,” in Conference on Robot Learning (CoRL), 2019.

[6] D. Barnes and I. Posner, “Under the Radar: Learning to Predict Robust Keypoints for Odometry Estimation and Metric Localisation in Radar,” arXiv preprint arXiv:2001.10789, 2020. [Online]. Available: https://arxiv.org/abs/2001.10789

[7] Ş. Săftescu, M. Gadd, D. De Martini, D. Barnes, and P. Newman, “Kidnapped Radar: Topological Radar Localisation using Rotationally-Invariant Metric Learning,” arXiv preprint arXiv:2001.09438, 2020. [Online]. Available: https://arxiv.org/abs/2001.09438

[8] T. Y. Tang, D. De Martini, D. Barnes, and P. Newman, “RSL-Net: Localising in Satellite Images From a Radar on the Ground,” arXiv preprint arXiv:2001.03233, 2020.

[9] R. Weston, S. Cen, P. Newman, and I. Posner, “Probably Unknown: Deep Inverse Sensor Modelling Radar,” in 2019 International Conference on Robotics and Automation (ICRA). IEEE, 2019, pp. 5446–5452.

[10] G. Kim, Y. S. Park, Y. Cho, J. Jeong, and A. Kim, “MulRan: Multimodal Range Dataset for Urban Place Recognition,” in IEEE International Conference on Robotics and Automation (ICRA), 2020.

[11] M. J. Milford and G. F. Wyeth, “SeqSLAM: Visual route-based navigation for sunny summer days and stormy winter nights,” in 2012 IEEE International Conference on Robotics and Automation. IEEE, 2012, pp. 1643–1649.

[12] E. Pepperell, P. I. Corke, and M. J. Milford, “Automatic image scaling for place recognition in changing environments,” in 2015 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2015, pp. 1118–1124.

[13] P. Hansen and B. Browning, “Visual place recognition using HMM sequence matching,” in 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE, 2014, pp. 4549–4555.

[14] Y. Latif, R. Garg, M. Milford, and I. Reid, “Addressing Challenging Place Recognition Tasks using Generative Adversarial Networks,” in 2018 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2018, pp. 2349–2355.

[15] B. Dongdong, W. Chaoqun, B. Zhang, Y. Xiaodong, Y. Xuejun et al., “CNN feature boosted SeqSLAM for real-time loop closure detection,” Chinese Journal of Electronics, vol. 27, no. 3, pp. 488–499, 2018.

[16] R. Cheng, K. Wang, S. Lin, W. Hu, K. Yang, X. Huang, H. Li, D. Sun, and J. Bai, “Panoramic annular localizer: Tackling the variation challenges of outdoor localization using panoramic annular images and active deep descriptors,” in 2019 IEEE Intelligent Transportation Systems Conference (ITSC). IEEE, 2019, pp. 920–925.

[17] P. Yin, Y. He, L. Xu, Y. Peng, J. Han, and W. Xu, “Synchronous adversarial feature learning for LiDAR based loop closure detection,” in 2018 Annual American Control Conference (ACC). IEEE, 2018, pp. 234–239.

[18] N. Sünderhauf, P. Neubert, and P. Protzel, “Are we there yet? Challenging SeqSLAM on a 3000 km journey across all four seasons,” in Proc. of Workshop on Long-Term Autonomy, IEEE International Conference on Robotics and Automation (ICRA), 2013.

[19] R. Arandjelovic, P. Gronat, A. Torii, T. Pajdla, and J. Sivic, “NetVLAD: CNN architecture for weakly supervised place recognition,” in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2016, pp. 5297–5307.

[20] K. Simonyan and A. Zisserman, “Very deep convolutional networks for large-scale image recognition,” arXiv preprint arXiv:1409.1556, 2014.

[21] T.-H. Wang, H.-J. Huang, J.-T. Lin, C.-W. Hu, K.-H. Zeng, and M. Sun, “Omnidirectional CNN for Visual Place Recognition and Navigation,” in 2018 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2018, pp. 2341–2348.

[22] R. Zhang, “Making Convolutional Networks Shift-Invariant Again,” in International Conference on Machine Learning (ICML), 2019, pp. 7324–7334.

[23] F. Schroff, D. Kalenichenko, and J. Philbin, “FaceNet: A unified embedding for face recognition and clustering,” in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2015, pp. 815–823.

[24] W. Maddern, G. Pascoe, C. Linegar, and P. Newman, “1 Year, 1000km: The Oxford RobotCar Dataset,” The International Journal of Robotics Research (IJRR), vol. 36, no. 1, pp. 3–15, 2017.

[25] D. Barnes, M. Gadd, P. Murcutt, P. Newman, and I. Posner, “The Oxford Radar RobotCar Dataset: A Radar Extension to the Oxford RobotCar Dataset,” arXiv preprint arXiv:1909.01300, 2019. [Online]. Available: https://arxiv.org/pdf/1909.01300

[26] R. Baeza-Yates and B. Ribeiro-Neto, Modern Information Retrieval. Harlow, England: Addison Wesley, 1999.

[27] S. Kyberd, J. Attias, P. Get, P. Murcutt, C. Prahacs, M. Towlson, S. Venn, A. Vasconcelos, M. Gadd, D. De Martini, and P. Newman, “The Hulk: Design and Development of a Weather-proof Vehicle for Long-term Autonomy in Outdoor Environments,” in International Conference on Field and Service Robotics (FSR), Tokyo, Japan, August 2019.

[28] S. H. Cen and P. Newman, “Precise ego-motion estimation with millimeter-wave radar under diverse and challenging conditions,” in Proceedings of the 2018 IEEE International Conference on Robotics and Automation, 2018.

[29] S. Cen and P. Newman, “Radar-only ego-motion estimation in difficult settings via graph matching,” in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Montreal, Canada, 2019.