Visual Odometry / SLAM Evaluation 2012


The odometry benchmark consists of 22 stereo sequences, saved in lossless PNG format: we provide 11 sequences (00-10) with ground truth trajectories for training and 11 sequences (11-21) without ground truth for evaluation. For this benchmark you may submit results from monocular or stereo visual odometry, laser-based SLAM, or algorithms that combine visual and LIDAR information. The only restrictions we impose are that your method is fully automatic (e.g., no manual loop-closure tagging is allowed) and that the same parameter set is used for all sequences. A development kit provides details about the data format.
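The ground truth trajectories are stored one pose per line as 12 space-separated floats, the row-major 3x4 rigid transform from the camera frame of that scan to the frame of the first pose. A minimal loader for this format might look as follows; this is an illustrative sketch, not the official development kit code, and the function name `load_poses` is our own:

```python
import numpy as np

def load_poses(path):
    """Read a KITTI odometry pose file: one pose per line as 12
    space-separated floats, the row-major 3x4 transform from the
    camera frame of that scan to the frame of the first pose.
    Returns a list of 4x4 homogeneous matrices."""
    poses = []
    with open(path) as f:
        for line in f:
            if not line.strip():
                continue  # skip blank lines
            vals = np.array(line.split(), dtype=np.float64)
            T = np.eye(4)
            T[:3, :] = vals.reshape(3, 4)  # fill the top 3x4 block
            poses.append(T)
    return poses
```

Promoting each pose to a full 4x4 homogeneous matrix makes composing and inverting poses a plain matrix product later on.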

From all test sequences, our evaluation computes translational and rotational errors for all possible subsequences of length (100, ..., 800) meters. The evaluation table below ranks methods according to the average of those values, where errors are measured in percent (for translation) and in degrees per meter (for rotation). A more detailed comparison for different trajectory lengths and driving speeds can be found in the plots underneath. Note: On 03.10.2013 we changed the evaluated sequence lengths from (5, 10, 50, 100, ..., 400) to (100, 200, ..., 800) because the GPS/OXTS ground truth error for very small subsequences was large and hence biased the evaluation results. The averages below now take longer sequences into account and provide a better indication of the true performance. Please report these numbers for all future submissions. The last leaderboard right before the change can be found here.
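The metric described above can be sketched in a few lines: for every start frame, find the frames 100 to 800 meters further along the driven path, compare the relative motion of the estimate against the ground truth over that span, and normalize by the span length. The following is a simplified illustration of that logic, not the official evaluation code; the 10-frame start-frame step and the list-of-4x4-matrices pose representation are assumptions:

```python
import numpy as np

LENGTHS = [100, 200, 300, 400, 500, 600, 700, 800]  # subsequence lengths [m]

def trajectory_distances(poses):
    """Cumulative distance driven up to each frame."""
    d = [0.0]
    for a, b in zip(poses[:-1], poses[1:]):
        d.append(d[-1] + np.linalg.norm(b[:3, 3] - a[:3, 3]))
    return d

def rotation_angle(R):
    """Angle of a rotation matrix, via its trace (clipped for safety)."""
    return np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))

def evaluate(gt, est, step=10):
    """Average relative translational / rotational error over all
    subsequences of the lengths in LENGTHS, sampled every `step` frames."""
    dist = trajectory_distances(gt)
    t_errs, r_errs = [], []
    for first in range(0, len(gt), step):
        for length in LENGTHS:
            # first frame at least `length` metres past the start frame
            last = next((i for i in range(first, len(gt))
                         if dist[i] > dist[first] + length), None)
            if last is None:
                continue  # trajectory ends before this length is reached
            gt_rel = np.linalg.inv(gt[first]) @ gt[last]
            est_rel = np.linalg.inv(est[first]) @ est[last]
            err = np.linalg.inv(est_rel) @ gt_rel  # residual relative motion
            # fraction of driven distance; x100 gives the percent in the table
            t_errs.append(np.linalg.norm(err[:3, 3]) / length)
            # radians per metre; the table reports degrees per metre
            r_errs.append(rotation_angle(err[:3, :3]) / length)
    return np.mean(t_errs), np.mean(r_errs)
```

An estimate identical to the ground truth yields zero error for both terms, since the residual relative motion is the identity for every subsequence.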

Important Policy Update: As more and more unpublished work and re-implementations of existing work are submitted to KITTI, we have established a new policy: from now on, only submissions with significant novelty that lead to a peer-reviewed paper in a conference or journal are allowed. Minor modifications of existing algorithms and student research projects are not allowed; such work must be evaluated on a split of the training set. To ensure that our policy is adopted, new users must state their status, describe their work and specify the targeted venue during registration. Furthermore, we will regularly delete all entries that are six months old but are still anonymous or do not have an associated paper. For conferences, six months are enough to determine whether a paper has been accepted and to add the bibliographic information. For longer review cycles, you need to resubmit your results.
Additional information used by the methods
  • Stereo: Method uses left and right (stereo) images
  • Laser Points: Method uses point clouds from Velodyne laser scanner
  • Loop Closure Detection: This method is a SLAM method that detects loop closures
  • Additional training data: Use of additional data sources for training (see details)
Method Setting Code Translation Rotation Runtime Environment
1 SOFT2
This method uses stereo information.
0.53 % 0.0009 [deg/m] 0.1 s 4 cores @ 2.5 Ghz (C/C++)
I. Cvišić, I. Marković and I. Petrović: Recalibrating the KITTI Dataset Camera Setup for Improved Odometry Accuracy. European Conference on Mobile Robots (ECMR) 2021.
2 V-LOAM
This method makes use of Velodyne laser scans.
0.54 % 0.0013 [deg/m] 0.1 s 2 cores @ 2.5 Ghz (C/C++)
J. Zhang and S. Singh: Visual-lidar Odometry and Mapping: Low drift, Robust, and Fast. IEEE International Conference on Robotics and Automation (ICRA) 2015.
3 LOAM
This method makes use of Velodyne laser scans.
0.55 % 0.0013 [deg/m] 0.1 s 2 cores @ 2.5 Ghz (C/C++)
J. Zhang and S. Singh: LOAM: Lidar Odometry and Mapping in Real-time. Robotics: Science and Systems Conference (RSS) 2014.
4 TVL-SLAM+
This method uses stereo information.
This method makes use of Velodyne laser scans.
0.56 % 0.0015 [deg/m] 0.3 s 1 core @ 3.0 Ghz (C/C++)
C. Chou and C. Chou: Efficient and Accurate Tightly-Coupled Visual-Lidar SLAM. IEEE Transactions on Intelligent Transportation Systems (to appear).
5 GLIM
This method makes use of Velodyne laser scans.
0.59 % 0.0015 [deg/m] 0.1 s GPU @ 2.5 Ghz (C/C++)
K. Koide, M. Yokozuka, S. Oishi and A. Banno: Globally Consistent 3D LiDAR Mapping with GPU-accelerated GICP Matching Cost Factors. IEEE Robotics and Automation Letters 2021.
6 CT-ICP
This method makes use of Velodyne laser scans.
code 0.59 % 0.0014 [deg/m] 0.06 s 1 core @ 3.5 Ghz (C/C++)
P. Dellenbach, J. Deschaud, B. Jacquet and F. Goulette: CT-ICP: Real-time Elastic LiDAR Odometry with Loop Closure. arXiv e-prints 2021.
7 DSV-LOAM
This method makes use of Velodyne laser scans.
0.60 % 0.0015 [deg/m] 0.1 s 4 cores @ 3.0 Ghz (C/C++)
8 HELO
This method makes use of Velodyne laser scans.
0.61 % 0.0018 [deg/m] 0.1 s 8 cores @ 2.5 Ghz (C/C++)
9 CT-ICP-test 0.62 % 0.0015 [deg/m] 0.06 s 1 core @ 2.5 Ghz (C/C++)
10 wPICP
This method makes use of Velodyne laser scans.
0.62 % 0.0015 [deg/m] 0.1 s 1 core @ 2.5 Ghz (C/C++)
11 zPICP
This method makes use of Velodyne laser scans.
0.62 % 0.0016 [deg/m] 0.1 s 1 core @ 2.5 Ghz (C/C++)
12 HMLO
This method makes use of Velodyne laser scans.
0.62 % 0.0014 [deg/m] 0.2 s 1 core @ 2.5 Ghz (Matlab)
13 HMLO-whu
This method makes use of Velodyne laser scans.
0.63 % 0.0014 [deg/m] 0.2 s 1 core @ 2.5 Ghz (Matlab)
14 filter-reg 0.65 % 0.0016 [deg/m] 0.1 s 1 core @ 2.5 Ghz (C/C++)
15 SOFT-SLAM
This method uses stereo information.
0.65 % 0.0014 [deg/m] 0.1 s 2 cores @ 2.5 Ghz (C/C++)
I. Cvišić, J. Ćesić, I. Marković and I. Petrović: SOFT-SLAM: Computationally Efficient Stereo Visual SLAM for Autonomous UAVs. Journal of Field Robotics 2017.
16 MULLS
This method makes use of Velodyne laser scans.
code 0.65 % 0.0019 [deg/m] 0.08 s 4 cores @ 2.2 Ghz (C/C++)
Y. Pan, P. Xiao, Y. He, Z. Shao and Z. Li: MULLS: Versatile LiDAR SLAM via Multi-metric Linear Least Square. IEEE International Conference on Robotics and Automation (ICRA) 2021.
17 PICP
This method makes use of Velodyne laser scans.
0.67 % 0.0018 [deg/m] 0.05 s 1 core @ 2.5 Ghz (C/C++)
18 ELO
This method makes use of Velodyne laser scans.
0.68 % 0.0021 [deg/m] 0.005 s GPU @ 2.6 Ghz (C/C++) (0.027 s on Jetson AGX)
X. Zheng and J. Zhu: Efficient LiDAR Odometry for Autonomous Driving. IEEE Robotics and Automation Letters (RA-L) 2021.
19 IMLS-SLAM
This method makes use of Velodyne laser scans.
0.69 % 0.0018 [deg/m] 1.25 s 1 core @ >3.5 Ghz (C/C++)
J. Deschaud: IMLS-SLAM: Scan-to-Model Matching Based on 3D Data. 2018 IEEE International Conference on Robotics and Automation (ICRA) 2018.
20 MC2SLAM
This method makes use of Velodyne laser scans.
0.69 % 0.0016 [deg/m] 0.1 s 4 cores @ 2.5 Ghz (C/C++)
F. Neuhaus, T. Koss, R. Kohnen and D. Paulus: MC2SLAM: Real-Time Inertial Lidar Odometry using Two-Scan Motion Compensation. German Conference on Pattern Recognition 2018.
21 FLOAM
This method makes use of Velodyne laser scans.
code 0.71 % 0.0022 [deg/m] 0.05 s 4 cores @ 3.0 Ghz (C/C++)
22 ICD 0.71 % 0.0018 [deg/m] 0.2 s 8 cores @ >3.5 Ghz (C/C++)
23 ISC-LOAM
This method makes use of Velodyne laser scans.
code 0.72 % 0.0022 [deg/m] 0.1 s 4 cores @ 3.0 Ghz (C/C++)
H. Wang, C. Wang and L. Xie: Intensity scan context: Coding intensity and geometry relations for loop closure detection. 2020 IEEE International Conference on Robotics and Automation (ICRA) 2020.
24 GLO 0.73 % 0.0022 [deg/m] 0.1 s 1 core @ 2.5 Ghz (C/C++)
25 TBD 0.77 % 0.0022 [deg/m] 0.1 s 1 core @ 2.5 Ghz (C/C++)
26 PSF-LO
This method makes use of Velodyne laser scans.
0.82 % 0.0032 [deg/m] 0.2 s 4 cores @ 3.2 GHz
G. Chen, B. Wang, X. Wang, H. Deng, B. Wang and S. Zhang: PSF-LO: Parameterized Semantic Features Based Lidar Odometry. 2021 IEEE International Conference on Robotics and Automation (ICRA) 2021.
27 RADVO
This method uses stereo information.
0.82 % 0.0018 [deg/m] 0.07 s 1 core @ 3.0 Ghz (C/C++)
P. Bénet and A. Guinamard: Robust and Accurate Deterministic Visual Odometry. Proceedings of the 33rd International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2020) 2020.
28 LG-SLAM
This method uses stereo information.
0.82 % 0.0020 [deg/m] 0.2 s 4 cores @ 2.5 Ghz (C/C++)
K. Lenac, J. Ćesić, I. Marković and I. Petrović: Exactly sparse delayed state filter on Lie groups for long-term pose graph SLAM. The International Journal of Robotics Research 2018.
29 S4-SLAM2
This method makes use of Velodyne laser scans.
0.83 % 0.0097 [deg/m] 0.1 s 1 core @ 2.5 Ghz (C/C++)
30 F-pose 0.83 % 0.0024 [deg/m] 0.02 s 1 core @ 2.5 Ghz (Python)
31 RotRocc+
This method uses stereo information.
0.83 % 0.0026 [deg/m] 0.25 s 2 cores @ 2.0 Ghz (C/C++)
M. Buczko and V. Willert: Flow-Decoupled Normalized Reprojection Error for Visual Odometry. 19th IEEE Intelligent Transportation Systems Conference (ITSC) 2016.
M. Buczko, V. Willert, J. Schwehr and J. Adamy: Self-Validation for Automotive Visual Odometry. IEEE Intelligent Vehicles Symposium (IV) 2018.
M. Buczko: Automotive Visual Odometry. 2018.
32 LIMO2_GP
This method makes use of Velodyne laser scans.
code 0.84 % 0.0022 [deg/m] 0.2 s 2 cores @ 2.5 Ghz (C/C++)
J. Graeter, A. Wilczynski and M. Lauer: LIMO: Lidar-Monocular Visual Odometry. arXiv preprint arXiv:1807.07524 2018.
33 CAE-LO
This method makes use of Velodyne laser scans.
code 0.86 % 0.0025 [deg/m] 2 s 8 cores @ 3.5 Ghz (Python)
D. Yin, Q. Zhang, J. Liu, X. Liang, Y. Wang, J. Maanpää, H. Ma, J. Hyyppä and R. Chen: CAE-LO: LiDAR Odometry Leveraging Fully Unsupervised Convolutional Auto-Encoder for Interest Point Detection and Feature Description. 2020.
34 GDVO
This method uses stereo information.
0.86 % 0.0031 [deg/m] 0.09 s 1 core @ >3.5 Ghz (C/C++)
J. Zhu: Image Gradient-based Joint Direct Visual Odometry for Stereo Camera. International Joint Conference on Artificial Intelligence, IJCAI 2017.
35 LIMO2
This method makes use of Velodyne laser scans.
code 0.86 % 0.0022 [deg/m] 0.2 s 2 cores @ 2.5 Ghz (C/C++)
J. Graeter, A. Wilczynski and M. Lauer: LIMO: Lidar-Monocular Visual Odometry. arXiv preprint arXiv:1807.07524 2018.
36 CPFG-slam
This method makes use of Velodyne laser scans.
0.87 % 0.0025 [deg/m] 0.03 s 4 cores @ 2.5 Ghz (C/C++)
K. Ji and T. Huiyan Chen: CPFG-SLAM: a robust Simultaneous Localization and Mapping based on LIDAR in off-road environment. IEEE Intelligent Vehicles Symposium (IV) 2018.
37 SOFT
This method uses stereo information.
0.88 % 0.0022 [deg/m] 0.1 s 2 cores @ 2.5 Ghz (C/C++)
I. Cvišić and I. Petrović: Stereo odometry based on careful feature selection and tracking. European Conference on Mobile Robots (ECMR) 2015.
38 RotRocc
This method uses stereo information.
0.88 % 0.0025 [deg/m] 0.3 s 2 cores @ 2.0 Ghz (C/C++)
M. Buczko and V. Willert: Flow-Decoupled Normalized Reprojection Error for Visual Odometry. 19th IEEE Intelligent Transportation Systems Conference (ITSC) 2016.
39 D3VO 0.88 % 0.0021 [deg/m] 0.1 s 1 core @ 2.5 Ghz (C/C++)
N. Yang, L. Stumberg, R. Wang and D. Cremers: D3VO: Deep Depth, Deep Pose and Deep Uncertainty for Monocular Visual Odometry. The IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 2020.
40 SD-DEVO
This method makes use of Velodyne laser scans.
0.88 % 0.0028 [deg/m] 0.06 s 1 core @ 3.6 Ghz (C/C++)
41 PNDT LO
This method makes use of Velodyne laser scans.
0.89 % 0.0030 [deg/m] 0.2 s 8 cores @ 3.5 Ghz (C/C++)
H. Hong and B. Lee: Probabilistic normal distributions transform representation for accurate 3d point cloud registration. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2017.
42 DVSO 0.90 % 0.0021 [deg/m] 0.1 s GPU @ 2.5 Ghz (C/C++)
N. Yang, R. Wang, J. Stueckler and D. Cremers: Deep Virtual Stereo Odometry: Leveraging Deep Depth Prediction for Monocular Direct Sparse Odometry. European Conference on Computer Vision (ECCV) 2018.
43 MULLS-test 0.92 % 0.0023 [deg/m] 0.1 s 1 core @ 2.5 Ghz (C/C++)
44 MULLS-test2 0.92 % 0.0023 [deg/m] 0.1 s 1 core @ 2.5 Ghz (C/C++)
45 LIMO
This method makes use of Velodyne laser scans.
code 0.93 % 0.0026 [deg/m] 0.2 s 2 cores @ 2.5 Ghz (C/C++)
J. Graeter, A. Wilczynski and M. Lauer: LIMO: Lidar-Monocular Visual Odometry. ArXiv e-prints 2018.
46 Stereo DSO
This method uses stereo information.
0.93 % 0.0020 [deg/m] 0.1 s 1 core @ 3.4 Ghz (C/C++)
R. Wang, M. Schwörer and D. Cremers: Stereo DSO: Large-scale direct sparse visual odometry with stereo cameras. International Conference on Computer Vision (ICCV), Venice, Italy 2017.
47 IsaacElbrusGPUSLAM
This method uses stereo information.
0.94 % 0.0019 [deg/m] 0.007 s Jetson AGX
A. Korovko, D. Robustov, D. Slepichev, E. Vendrovsky and S. Volodarskiy: Realtime Stereo Visual Odometry.
48 OV2SLAM
This method uses stereo information.
code 0.94 % 0.0023 [deg/m] 0.01 s 1 core @ 2.5 Ghz (C/C++)
M. Ferrera, A. Eudes, J. Moras, M. Sanfourche and G. Le Besnerais: OV2SLAM : A Fully Online and Versatile Visual SLAM for Real-Time Applications. IEEE Robotics and Automation Letters 2021.
49 PLO
This method makes use of Velodyne laser scans.
0.95 % 0.0021 [deg/m] 0.1 s 1 core @ 2.5 Ghz (C/C++)
50 OV2SLAM
This method uses stereo information.
code 0.98 % 0.0023 [deg/m] 0.01 s 8 cores @ 3.0 Ghz (C/C++)
M. Ferrera, A. Eudes, J. Moras, M. Sanfourche and G. Le Besnerais: OV2SLAM : A Fully Online and Versatile Visual SLAM for Real-Time Applications. IEEE Robotics and Automation Letters 2021.
51 ROCC
This method uses stereo information.
0.98 % 0.0028 [deg/m] 0.3 s 2 cores @ 2.0 Ghz (C/C++)
M. Buczko and V. Willert: How to Distinguish Inliers from Outliers in Visual Odometry for High-speed Automotive Applications. IEEE Intelligent Vehicles Symposium (IV) 2016.
52 SuMa-MOS
This method makes use of Velodyne laser scans.
code 0.99 % 0.0033 [deg/m] 0.1 s 1 core @ 2.5 Ghz (C/C++)
X. Chen, S. Li, B. Mersch, L. Wiesmann, J. Gall, J. Behley and C. Stachniss: Moving Object Segmentation in 3D LiDAR Data: A Learning-based Approach Exploiting Sequential Data. IEEE Robotics and Automation Letters (RA-L) 2021.
53 IsaacElbrusSLAM
This method uses stereo information.
1.02 % 0.0019 [deg/m] 0.018 s 3 cores @ 3.3 Ghz (C/C++)
A. Korovko, D. Robustov, D. Slepichev, E. Vendrovsky and S. Volodarskiy: Realtime Stereo Visual Odometry.
54 MESVO 1.03 % 0.0033 [deg/m] 0.5 s 1 core @ 2.5 Ghz (C/C++)
55 SuMa++
This method makes use of Velodyne laser scans.
code 1.06 % 0.0034 [deg/m] 0.1 s 1 core @ 3.5 Ghz (C/C++)
X. Chen, A. Milioto, E. Palazzolo, P. Giguère, J. Behley and C. Stachniss: SuMa++: Efficient LiDAR-based Semantic SLAM. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2019.
56 ULF-ESGVI 1.07 % 0.0036 [deg/m] 0.3 s GPU and CPU @ 2.2 Ghz (Python + C/C++)
D. Yoon, H. Zhang, M. Gridseth, H. Thomas and T. Barfoot: Unsupervised Learning of Lidar Features for Use in a Probabilistic Trajectory Estimator. IEEE Robotics and Automation Letters (RAL) 2021.
57 cv4xv1-sc
This method uses stereo information.
1.09 % 0.0029 [deg/m] 0.145 s GPU @ 3.5 Ghz (C/C++)
M. Persson, T. Piccini, R. Mester and M. Felsberg: Robust Stereo Visual Odometry from Monocular Techniques. IEEE Intelligent Vehicles Symposium 2015.
58 VINS-Fusion
This method uses stereo information.
code 1.09 % 0.0033 [deg/m] 0.1 s 1 core @ 3.0 Ghz (C/C++)
T. Qin, J. Pan, S. Cao and S. Shen: A General Optimization-based Framework for Local Odometry Estimation with Multiple Sensors. 2019.
59 MonoROCC
This method uses stereo information.
1.11 % 0.0028 [deg/m] 1 s 2 cores @ 2.0 Ghz (C/C++)
M. Buczko and V. Willert: Monocular Outlier Detection for Visual Odometry. IEEE Intelligent Vehicles Symposium (IV) 2017.
60 icp_lo 1.14 % 0.0037 [deg/m] 0.05 s 1 core @ 2.5 Ghz (C/C++)
61 DEMO
This method makes use of Velodyne laser scans.
1.14 % 0.0049 [deg/m] 0.1 s 2 cores @ 2.5 Ghz (C/C++)
J. Zhang, M. Kaess and S. Singh: Real-time Depth Enhanced Monocular Odometry. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2014.
62 ORB-SLAM2
This method uses stereo information.
code 1.15 % 0.0027 [deg/m] 0.06 s 2 cores @ >3.5 Ghz (C/C++)
R. Mur-Artal and J. Tardós: ORB-SLAM2: an Open-Source SLAM System for Monocular, Stereo and RGB-D Cameras. IEEE Transactions on Robotics 2017.
63 IV-SLAM
This method uses stereo information.
code 1.17 % 0.0025 [deg/m] 0.1 s GPU @ 2.5 Ghz (C/C++)
S. Rabiee and J. Biswas: IV-SLAM: Introspective Vision for Simultaneous Localization and Mapping. Conference on Robot Learning (CoRL) 2020.
64 NOTF
This method uses stereo information.
1.17 % 0.0035 [deg/m] 0.45 s 1 core @ 3.0 Ghz (C/C++)
J. Deigmoeller and J. Eggert: Stereo Visual Odometry without Temporal Filtering. German Conference on Pattern Recognition (GCPR) 2016.
65 S-PTAM
This method uses stereo information.
code 1.19 % 0.0025 [deg/m] 0.03 s 4 cores @ 3.0 Ghz (C/C++)
T. Pire, T. Fischer, G. Castro, P. De Cristóforis, J. Civera and J. Jacobo Berlles: S-PTAM: Stereo Parallel Tracking and Mapping. Robotics and Autonomous Systems (RAS) 2017.
T. Pire, T. Fischer, J. Civera, P. Cristóforis and J. Jacobo-Berlles: Stereo parallel tracking and mapping for robot localization. IROS 2015.
66 S-LSD-SLAM
This method uses stereo information.
code 1.20 % 0.0033 [deg/m] 0.07 s 1 core @ 3.5 Ghz (C/C++)
J. Engel, J. Stückler and D. Cremers: Large-Scale Direct SLAM with Stereo Cameras. Int. Conf. on Intelligent Robot Systems (IROS) 2015.
67 VoBa
This method uses stereo information.
1.22 % 0.0029 [deg/m] 0.1 s 1 core @ 2.0 Ghz (C/C++)
J. Tardif, M. George, M. Laverne, A. Kelly and A. Stentz: A new approach to vision-aided inertial navigation. 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, October 18-22, 2010, Taipei, Taiwan 2010.
68 STEAM-L WNOJ
This method makes use of Velodyne laser scans.
1.22 % 0.0058 [deg/m] 0.2 s 1 core @ 2.5 Ghz (C/C++)
T. Tang, D. Yoon and T. Barfoot: A White-Noise-On-Jerk Motion Prior for Continuous-Time Trajectory Estimation on SE(3). arXiv preprint arXiv:1809.06518 2018.
69 LiViOdo
This method makes use of Velodyne laser scans.
1.22 % 0.0042 [deg/m] 0.5 s 1 core @ 2.5 Ghz (C/C++)
J. Graeter, A. Wilczynski and M. Lauer: LIMO: Lidar-Monocular Visual Odometry. ArXiv e-prints 2018.
70 oneslam
This method uses stereo information.
1.24 % 0.0033 [deg/m] 0.05 s 1 core @ 2.4 Ghz (C/C++)
71 cslam
This method uses stereo information.
1.25 % 0.0034 [deg/m] 0.09 s 1 core @ 2.5 Ghz (C/C++)
72 SLUP
This method uses stereo information.
1.25 % 0.0041 [deg/m] 0.17 s 4 cores @ 3.3 Ghz (C/C++)
X. Qu, B. Soheilian and N. Paparoditis: Landmark based localization in urban environment. ISPRS Journal of Photogrammetry and Remote Sensing 2017.
73 STEAM-L
This method makes use of Velodyne laser scans.
1.26 % 0.0061 [deg/m] 0.2 s 1 core @ 2.5 Ghz (C/C++)
T. Tang, D. Yoon, F. Pomerleau and T. Barfoot: Learning a Bias Correction for Lidar-only Motion Estimation. 15th Conference on Computer and Robot Vision (CRV) 2018.
74 FRVO
This method uses stereo information.
1.26 % 0.0038 [deg/m] 0.03 s 1 core @ 3.5 Ghz (C/C++)
W. Meiqing, L. Siew-Kei and S. Thambipillai: A Framework for Fast and Robust Visual Odometry. IEEE Transactions on Intelligent Transportation Systems 2017.
75 circleSLAM
This method uses stereo information.
1.28 % 0.0033 [deg/m] 0.05 s 1 core @ 2.5 Ghz (C/C++)
76 MFI
This method uses stereo information.
1.30 % 0.0030 [deg/m] 0.1 s 1 core @ 2.2 Ghz (C/C++)
H. Badino, A. Yamamoto and T. Kanade: Visual Odometry by Multi-frame Feature Integration. First International Workshop on Computer Vision for Autonomous Driving at ICCV 2013.
77 SF 1.34 % 0.0032 [deg/m] 0.1 s 1 core @ 2.5 Ghz (C/C++)
78 TLBBA
This method uses stereo information.
1.36 % 0.0038 [deg/m] 0.1 s 1 core @ 2.8 GHz (C/C++)
W. Lu, Z. Xiang and J. Liu: High-performance visual odometry with two-stage local binocular BA and GPU. Intelligent Vehicles Symposium (IV), 2013 IEEE 2013.
79 2FO-CC
This method uses stereo information.
code 1.37 % 0.0035 [deg/m] 0.1 s 1 core @ 3.0 Ghz (C/C++)
I. Krešo and S. Šegvić: Improving the Egomotion Estimation by Correcting the Calibration Bias. VISAPP 2015.
80 SALO
This method makes use of Velodyne laser scans.
1.37 % 0.0051 [deg/m] 0.6 s 1 core @ 2.5 Ghz (C/C++)
D. Kovalenko, M. Korobkin and A. Minin: Sensor Aware Lidar Odometry. 2019 European Conference on Mobile Robots (ECMR) 2019.
81 SuMa
This method makes use of Velodyne laser scans.
1.39 % 0.0034 [deg/m] 0.1 s 1 core @ 3.5 Ghz (C/C++)
J. Behley and C. Stachniss: Efficient Surfel-Based SLAM using 3D Laser Range Data in Urban Environments. Robotics: Science and Systems (RSS) 2018.
82 ProSLAM
This method uses stereo information.
code 1.39 % 0.0035 [deg/m] 0.02 s 1 core @ 3.0 Ghz (C/C++)
D. Schlegel, M. Colosi and G. Grisetti: ProSLAM: Graph SLAM from a Programmer's Perspective. ArXiv e-prints 2017.
83 ComboVLO 1.41 % 0.0049 [deg/m] 1 s 16 cores @ >3.5 Ghz (C/C++)
84 ESVO 1.42 % 0.0048 [deg/m] 1 s 1 core @ 2.5 Ghz (C/C++)
H. Nguyen, T. Nguyen, C. Tran, K. Phung and Q. Nguyen: A novel translation estimation for essential matrix based stereo visual odometry. 2021 15th International Conference on Ubiquitous Information Management and Communication (IMCOM) 2021.
85 ProRPG
This method uses stereo information.
1.43 % 0.0037 [deg/m] 0.02 s 1 core @ 2.5 Ghz (C/C++)
86 JFBVO
This method uses stereo information.
1.43 % 0.0038 [deg/m] 0.05 s 1 core @ 3.4 Ghz (C/C++)
R. Sardana, R. Kottath, V. Karar and S. Poddar: Joint Forward-Backward Visual Odometry for Stereo Cameras. Proceedings of the Advances in Robotics 2019 2019.
87 LNSLAM
This method uses stereo information.
1.44 % 0.0037 [deg/m] 0.02 s 1 core @ 2.5 Ghz (C/C++)
88 ONSLAM
This method uses stereo information.
1.44 % 0.0037 [deg/m] 0.02 s 1 core @ 2.5 Ghz (C/C++)
89 LCSLAM
This method uses stereo information.
1.47 % 0.0039 [deg/m] 0.02 s 1 core @ 2.5 Ghz (C/C++)
90 OFL-SLAM
This method uses stereo information.
This method makes use of Velodyne laser scans.
1.47 % 0.0033 [deg/m] 0.1 s 4 cores @ 2.5 Ghz (C/C++)
91 StereoSFM
This method uses stereo information.
code 1.51 % 0.0042 [deg/m] 0.02 s 2 cores @ 2.5 Ghz (C/C++)
H. Badino and T. Kanade: A Head-Wearable Short-Baseline Stereo System for the Simultaneous Estimation of Structure and Motion. IAPR Conference on Machine Vision Application 2011.
92 SSLAM
This method uses stereo information.
code 1.57 % 0.0044 [deg/m] 0.5 s 8 cores @ 3.5 Ghz (C/C++)
F. Bellavia, M. Fanfani, F. Pazzaglia and C. Colombo: Robust Selective Stereo SLAM without Loop Closure and Bundle Adjustment. ICIAP 2013 2013.
F. Bellavia, M. Fanfani and C. Colombo: Selective visual odometry for accurate AUV localization. Autonomous Robots 2015.
M. Fanfani, F. Bellavia and C. Colombo: Accurate Keyframe Selection and Keypoint Tracking for Robust Visual Odometry. Machine Vision and Applications 2016.
93 VOLDOR code 1.65 % 0.0050 [deg/m] 0.1 s GPU
Z. Min, Y. Yang and E. Dunn: VOLDOR: Visual Odometry From Log-Logistic Dense Optical Flow Residuals. IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2020.
94 eVO
This method uses stereo information.
1.76 % 0.0036 [deg/m] 0.05 s 2 cores @ 2.0 Ghz (C/C++)
M. Sanfourche, V. Vittori and G. Besnerais: eVO: A realtime embedded stereo odometry for MAV applications. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2013.
95 Stereo DWO
This method uses stereo information.
code 1.76 % 0.0026 [deg/m] 0.1 s 4 cores @ 2.5 Ghz (C/C++)
J. Huai, C. Toth and D. Grejner-Brzezinska: Stereo-inertial odometry using nonlinear optimization. Proceedings of the 27th International Technical Meeting of The Satellite Division of the Institute of Navigation (ION GNSS+ 2015) 2015.
96 BVO 1.76 % 0.0036 [deg/m] 0.1 s 1 core @ 2.5GHz (Python)
F. Pereira, J. Luft, G. Ilha, A. Sofiatti and A. Susin: Backward Motion for Estimation Enhancement in Sparse Visual Odometry. 2017 Workshop of Computer Vision (WVC) 2017.
97 3DOF-SLAM code 1.89 % 0.0083 [deg/m] 0.02 s 1 core @ 2.5 Ghz (C/C++)
M. Dimitrievski, D. Hamme, P. Veelaert and W. Philips: Robust Matching of Occupancy Maps for Odometry in Autonomous Vehicles. Proceedings of the 11th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 3: VISAPP, (VISIGRAPP 2016) 2016.
98 SMTD-LO
This method makes use of Velodyne laser scans.
code 2.01 % 0.0030 [deg/m] 0.3 s 1 core @ 2.5 Ghz (C/C++)
99 D6DVO
This method uses stereo information.
2.04 % 0.0051 [deg/m] 0.03 s 1 core @ 2.5 Ghz (C/C++)
A. Comport, E. Malis and P. Rives: Accurate Quadrifocal Tracking for Robust 3D Visual Odometry. ICRA 2007.
M. Meilland, A. Comport and P. Rives: Dense visual mapping of large scale environments for real-time localisation. ICRA 2011.
100 PMO / PbT-M2 2.05 % 0.0051 [deg/m] 1 s 1 core @ 2.5 Ghz (Python + C/C++)
N. Fanani, A. Stuerck, M. Ochs, H. Bradler and R. Mester: Predictive monocular odometry (PMO): What is possible without RANSAC and multiframe bundle adjustment?. Image and Vision Computing 2017.
101 GFM code 2.12 % 0.0056 [deg/m] 0.03 s 2 cores @ 1.5 Ghz (C/C++)
Y. Zhao and P. Vela: Good Feature Matching: Towards Accurate, Robust VO/VSLAM with Low Latency. submitted to IEEE Transactions on Robotics 2019.
102 SSLAM-HR
This method uses stereo information.
code 2.14 % 0.0059 [deg/m] 0.5 s 8 cores @ 3.5 Ghz (C/C++)
F. Bellavia, M. Fanfani, F. Pazzaglia and C. Colombo: Robust Selective Stereo SLAM without Loop Closure and Bundle Adjustment. ICIAP 2013 2013.
F. Bellavia, M. Fanfani and C. Colombo: Selective visual odometry for accurate AUV localization. Autonomous Robots 2015.
M. Fanfani, F. Bellavia and C. Colombo: Accurate Keyframe Selection and Keypoint Tracking for Robust Visual Odometry. Machine Vision and Applications 2016.
103 FTMVO 2.24 % 0.0049 [deg/m] 0.11 s 1 core @ 2.5 Ghz (C/C++)
H. Mirabdollah and B. Mertsching: Fast Techniques for Monocular Visual Odometry. Proceedings of the 37th German Conference on Pattern Recognition (GCPR) 2015.
104 SMLI code 2.28 % 0.0073 [deg/m] 0.1 s 4 cores @ >3.5 Ghz (C/C++)
105 PbT-M1 2.38 % 0.0053 [deg/m] 1 s 1 core @ 2.5 Ghz (Python + C/C++)
N. Fanani, M. Ochs, H. Bradler and R. Mester: Keypoint trajectory estimation using propagation based tracking. Intelligent Vehicles Symposium (IV) 2016.
N. Fanani, A. Stuerck, M. Barnada and R. Mester: Multimodal scale estimation for monocular visual odometry. Intelligent Vehicles Symposium (IV) 2017.
106 FLVIS
This method uses stereo information.
code 2.42 % 0.0057 [deg/m] 0.05 s 2 cores @ 2.5 Ghz (C/C++)
S. Chen, C. Wen, Y. Zou and W. Chen: Stereo visual inertial pose estimation based on feedforward-feedback loops. arXiv preprint arXiv:2007.02250 2020.
107 VISO2-S
This method uses stereo information.
code 2.44 % 0.0114 [deg/m] 0.05 s 1 core @ 2.5 Ghz (C/C++)
A. Geiger, J. Ziegler and C. Stiller: StereoScan: Dense 3d Reconstruction in Real-time. IV 2011.
108 MLM-SFM 2.54 % 0.0057 [deg/m] 0.03 s 5 cores @ 2.5 Ghz (C/C++)
S. Song and M. Chandraker: Robust Scale Estimation in Real-Time Monocular SFM for Autonomous Driving. CVPR 2014.
S. Song, M. Chandraker and C. Guest: Parallel, Real-time Monocular Visual Odometry. ICRA 2013.
109 GT_VO3pt
This method uses stereo information.
2.54 % 0.0078 [deg/m] 1.26 s 1 core @ 2.5 Ghz (C/C++)
C. Beall, B. Lawrence, V. Ila and F. Dellaert: 3D reconstruction of underwater structures. IROS 2010.
110 RMCPE+GP 2.55 % 0.0086 [deg/m] 0.39 s 1 core @ 2.5 Ghz (C/C++)
M. Mirabdollah and B. Mertsching: On the Second Order Statistics of Essential Matrix Elements. Proceedings of the 36th German Conference on Pattern Recognition 2014.
111 KLTVO
This method uses stereo information.
2.63 % 0.0042 [deg/m] 0.1 s 1 core @ 3.0 Ghz (C/C++)
N. Dias and G. Laureano: Accurate Stereo Visual Odometry Based on Keypoint Selection. 2019 Latin American Robotics Symposium (LARS), 2019 Brazilian Symposium on Robotics (SBR) and 2019 Workshop on Robotics in Education (WRE) 2019.
112 VO3pt
This method uses stereo information.
2.69 % 0.0068 [deg/m] 0.56 s 1 core @ 2.0 Ghz (C/C++)
P. Alcantarilla: Vision Based Localization: From Humanoid Robots to Visually Impaired People. 2011.
P. Alcantarilla, J. Yebes, J. Almazán and L. Bergasa: On Combining Visual SLAM and Dense Scene Flow to Increase the Robustness of Localization and Mapping in Dynamic Environments. ICRA 2012.
113 TGVO
This method uses stereo information.
2.94 % 0.0077 [deg/m] 0.06 s 1 core @ 2.5 Ghz (C/C++)
B. Kitt, A. Geiger and H. Lategahn: Visual Odometry based on Stereo Image Sequences with RANSAC-based Outlier Rejection Scheme. IV 2010.
114 VO3ptLBA
This method uses stereo information.
3.13 % 0.0104 [deg/m] 0.57 s 1 core @ 2.0 Ghz (C/C++)
P. Alcantarilla: Vision Based Localization: From Humanoid Robots to Visually Impaired People. 2011.
P. Alcantarilla, J. Yebes, J. Almazán and L. Bergasa: On Combining Visual SLAM and Dense Scene Flow to Increase the Robustness of Localization and Mapping in Dynamic Environments. ICRA 2012.
115 PLSVO
This method uses stereo information.
3.26 % 0.0095 [deg/m] 0.20 s 2 cores @ 2.5 Ghz (C/C++)
R. Gomez-Ojeda and J. Gonzalez-Jimenez: Robust Stereo Visual Odometry through a Probabilistic Combination of Points and Line Segments. Robotics and Automation (ICRA), 2016 IEEE International Conference on 2016.
116 BLF 3.49 % 0.0128 [deg/m] 0.7 s 1 core @ 2.5 Ghz (C/C++)
M. Velas, M. Spanel, M. Hradis and A. Herout: CNN for IMU Assisted Odometry Estimation using Velodyne LiDAR. ArXiv e-prints 2017.
117 CFORB
This method uses stereo information.
3.73 % 0.0107 [deg/m] 0.9 s 8 cores @ 3.0 Ghz (C/C++)
D. Mankowitz and E. Rivlin: CFORB: Circular FREAK-ORB Visual Odometry. arXiv preprint arXiv:1506.05257 2015.
118 DeepCLR
This method makes use of Velodyne laser scans.
code 3.83 % 0.0104 [deg/m] 0.05 s GPU @ 1.0 Ghz (Python)
M. Horn, N. Engel, V. Belagiannis, M. Buchholz and K. Dietmayer: DeepCLR: Correspondence-Less Architecture for Deep End-to-End Point Cloud Registration. 2020 IEEE 23rd International Conference on Intelligent Transportation Systems (ITSC) 2020.
119 VOFS
This method uses stereo information.
3.94 % 0.0099 [deg/m] 0.51 s 1 core @ 2.0 Ghz (C/C++)
M. Kaess, K. Ni and F. Dellaert: Flow separation for fast and robust stereo odometry. ICRA 2009.
P. Alcantarilla, L. Bergasa and F. Dellaert: Visual Odometry priors for robust EKF-SLAM. ICRA 2010.
120 CPC
This method uses stereo information.
4.01 % 0.0100 [deg/m] 0.10 s 1 core @ 2.5 Ghz (Python)
121 CPC
This method uses stereo information.
4.01 % 0.0100 [deg/m] 0.10 s 1 core @ 2.5 Ghz (Python)
122 DeepAVO 4.10 % 0.0125 [deg/m] 0.01 s GPU @ 3.0 Ghz (Python)
123 VOFSLBA
This method uses stereo information.
4.17 % 0.0112 [deg/m] 0.52 s 1 core @ 2.0 Ghz (C/C++)
M. Kaess, K. Ni and F. Dellaert: Flow separation for fast and robust stereo odometry. ICRA 2009.
P. Alcantarilla, L. Bergasa and F. Dellaert: Visual Odometry priors for robust EKF-SLAM. ICRA 2010.
124 DHVO 4.28 % 0.0043 [deg/m] 0.05 s 1 core @ 2.5 Ghz (Python + C/C++)
125 CUDA-EgoMotion 4.36 % 0.0052 [deg/m] 0.001 s GPU @ 2.5 Ghz (Matlab)
A. Aguilar-González, M. Arias-Estrada, F. Berry and J. Osuna-Coutiño: The Fastest Visual Ego-motion Algorithm in the West. Microprocessors and Microsystems 2019.
126 vo 4.57 % 0.0153 [deg/m] 0.05 s 1 core @ 2.5 Ghz (C/C++)
127 BCC 4.59 % 0.0175 [deg/m] 1 s 1 core @ 2.5 Ghz (C/C++)
M. Velas, M. Spanel, M. Hradis and A. Herout: CNN for IMU Assisted Odometry Estimation using Velodyne LiDAR. ArXiv e-prints 2017.
128 D3DLO 5.40 % 0.0154 [deg/m] 0.1 s GPU @ 2.5 Ghz (Python)
P. Adis, N. Horst and M. Wien: D3DLO: Deep 3D LiDAR Odometry. 2021.
129 EB3DTE+RJMCM 5.45 % 0.0274 [deg/m] 1 s 1 core @ 2.5 Ghz (Matlab)
Z. Boukhers, K. Shirahama and M. Grzegorzek: Example-based 3D Trajectory Extraction of Objects from 2D Videos. Circuits and Systems for Videos Technology (TCSVT), IEEE Transaction on 2017.
Z. Boukhers, K. Shirahama and M. Grzegorzek: Less restrictive camera odometry estimation from monocular camera. Multimedia Tools and Applications 2017.
130 SOPVO code 5.73 % 0.0231 [deg/m] 0.01 s 8 cores @ 3.0 Ghz (C/C++)
131 LTMVO 7.40 % 0.0142 [deg/m] 0.1 s 1 core @ 2.5 Ghz (C/C++)
Y. Zou, P. Ji, Q. Tran, J. Huang and M. Chandraker: Learning Monocular Visual Odometry via Self-Supervised Long-Term Modeling. ECCV 2020.
132 VISO2-M + GP 7.46 % 0.0245 [deg/m] 0.15 s 1 core @ 2.5 Ghz (C/C++)
A. Geiger, J. Ziegler and C. Stiller: StereoScan: Dense 3d Reconstruction in Real-time. IV 2011.
S. Song and M. Chandraker: Robust Scale Estimation in Real-Time Monocular SFM for Autonomous Driving. CVPR 2014.
133 BLO 9.21 % 0.0163 [deg/m] 0.1 s 1 core @ 2.5 Ghz (C/C++)
M. Velas, M. Spanel, M. Hradis and A. Herout: CNN for IMU Assisted Odometry Estimation using Velodyne LiDAR. ArXiv e-prints 2017.
134 VISO2-M code 11.94 % 0.0234 [deg/m] 0.1 s 1 core @ 2.5 Ghz (C/C++)
A. Geiger, J. Ziegler and C. Stiller: StereoScan: Dense 3d Reconstruction in Real-time. IV 2011.
135 MonoDepth2 code 12.59 % 0.0312 [deg/m] 1 s 1 core @ 2.5 Ghz (C/C++)
C. Godard, O. Mac Aodha, M. Firman and G. Brostow: Digging into self-supervised monocular depth estimation. ICCV 2019.
136 MEGO 12.89 % 0.0451 [deg/m] 0.75 s 1 core @ 2.5 Ghz (C/C++)
137 SMD-LVO code 13.25 % 0.0097 [deg/m] 0.03 s GPU @ 2.5 Ghz (Python)
I. Slinko, A. Vorontsova, F. Konokhov, O. Barinova and A. Konushin: Scene Motion Decomposition for Learnable Visual Odometry. 2019.
138 SC-SfMLearner (cs+k) code 13.69 % 0.0355 [deg/m] 0.01 s 1 core @ 2.5 Ghz (C/C++)
J. Bian, Z. Li, N. Wang, H. Zhan, C. Shen, M. Cheng and I. Reid: Unsupervised scale-consistent depth and ego-motion learning from monocular video. NeurIPS 2019.
139 CC code 16.06 % 0.0320 [deg/m] 0.1 s 1 core @ 2.5 Ghz (C/C++)
A. Ranjan, V. Jampani, L. Balles, K. Kim, D. Sun, J. Wulff and M. Black: Competitive collaboration: Joint unsupervised learning of depth, camera motion, optical flow and motion segmentation. CVPR 2019.
140 OABA 20.95 % 0.0135 [deg/m] 0.5 s 1 core @ 3.5 Ghz (C/C++)
D. Frost, O. Kähler and D. Murray: Object-Aware Bundle Adjustment for Correcting Monocular Scale Drift. Proceedings of the International Conference on Robotics and Automation (ICRA) 2012.
141 SC-SfMLearner (k) code 21.47 % 0.0425 [deg/m] 0.01 s 1 core @ 2.5 Ghz (C/C++)
J. Bian, Z. Li, N. Wang, H. Zhan, C. Shen, M. Cheng and I. Reid: Unsupervised scale-consistent depth and ego-motion learning from monocular video. NeurIPS 2019.



Citation

When using this dataset in your research, we would be happy if you cite us:
@INPROCEEDINGS{Geiger2012CVPR,
  author = {Andreas Geiger and Philip Lenz and Raquel Urtasun},
  title = {Are we ready for Autonomous Driving? The KITTI Vision Benchmark Suite},
  booktitle = {Conference on Computer Vision and Pattern Recognition (CVPR)},
  year = {2012}
}


