Point-line joint optimization visual-inertial odometry
WEI Shuangfeng, SHI Xianjie, LIU Zhenbin, XIAO Bin
Abstract:
To reduce the sensitivity of visual simultaneous localization and mapping (SLAM) to illumination, texture, and similar conditions, this paper builds on nonlinear-optimization-based monocular SLAM and proposes an optimization-based visual-inertial odometry that combines an initialization refined by accelerometer bias estimation with joint point and line features, so that good pose estimation is obtained even under weak illumination; the improved initialization makes the whole system more robust and more accurate. To offset the extra computation introduced by line features, a new data selection strategy is also proposed. Comparisons with other strong algorithms (such as PL-VIO and a point-feature-only scheme) and analysis of real-scene experiments show that the proposed point-line joint optimization visual-inertial odometry not only reduces localization error but also achieves high accuracy in poorly lit environments, preserving the real-time performance of the system while improving its robustness.
Keywords: simultaneous localization and mapping (SLAM); line features; nonlinear optimization; initialization; robustness
Foundation: National Natural Science Foundation of China (41601409); Beijing Natural Science Foundation (8172016); Fundamental Research Funds for Beijing Municipal Universities, Beijing University of Civil Engineering and Architecture (X18229); Postgraduate Innovation Projects of Beijing University of Civil Engineering and Architecture (PG2018066, PG2019065, PG2019061)
Author: WEI Shuangfeng, SHI Xianjie, LIU Zhenbin, XIAO Bin
DOI: 10.16251/j.cnki.1009-2307.2021.04.004
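The full method section is not reproduced in this record, but the abstract's central idea, minimizing point and line reprojection errors jointly in one nonlinear least-squares problem, can be illustrated with a minimal numpy sketch. It assumes the PL-VIO-style line error (distances of the two projected 3-D endpoints of a landmark line to the observed 2-D line [11]); all names here (`project`, `joint_cost`, `w_line`) are hypothetical illustrations, not identifiers from the paper's code, and the exact residual of this paper may differ:

```python
import numpy as np

def project(K, T_cw, X_w):
    """Pinhole projection of a 3-D world point with intrinsics K (3x3)
    and world-to-camera pose T_cw (4x4)."""
    X_c = T_cw[:3, :3] @ X_w + T_cw[:3, 3]
    uv = K @ X_c
    return uv[:2] / uv[2]

def point_residual(K, T_cw, X_w, obs_uv):
    """Standard 2-D point reprojection error."""
    return project(K, T_cw, X_w) - obs_uv

def line_residual(K, T_cw, P_w, Q_w, obs_line):
    """Line reprojection error: signed distances of the two projected
    3-D endpoints P, Q to the observed 2-D line a*u + b*v + c = 0."""
    a, b, c = obs_line
    norm = np.hypot(a, b)
    return np.array([(a * u + b * v + c) / norm
                     for u, v in (project(K, T_cw, P_w),
                                  project(K, T_cw, Q_w))])

def joint_cost(K, T_cw, point_obs, line_obs, w_line=1.0):
    """Joint point-line cost: the scalar that a nonlinear least-squares
    solver (e.g. Gauss-Newton over poses and landmarks) would minimize."""
    cost = sum(np.sum(point_residual(K, T_cw, X_w, uv) ** 2)
               for X_w, uv in point_obs)
    cost += w_line * sum(np.sum(line_residual(K, T_cw, P, Q, l) ** 2)
                         for P, Q, l in line_obs)
    return cost
```

Measuring the line error as endpoint-to-infinite-line distance is a common choice in this family of systems because LSD segment endpoints are unstable across frames: the residual stays small whenever the projected line lies on the detected one, regardless of where the segment was cut.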
References:
- [1] ZHANG J, SINGH S. Laser-visual-inertial odometry and mapping with high robustness and low drift[J]. Journal of Field Robotics, 2018, 35(8): 1242-1264.
- [2] WEI Shuangfeng, PANG Fan, LIU Zhenbin, et al. Survey of LiDAR-based SLAM algorithm[J]. Computer Application Research, 2020, 37(2): 327-332. (in Chinese)
- [3] LUPTON T, SUKKARIEH S. Visual-inertial-aided navigation for high-dynamic motion in built environments without initial conditions[J]. IEEE Transactions on Robotics, 2012, 28(1): 61-76.
- [4] SHEN S, MICHAEL N, KUMAR V. Tightly-coupled monocular visual-inertial fusion for autonomous flight of rotorcraft MAVs[C]//2015 IEEE International Conference on Robotics and Automation (ICRA). [S.l.]: IEEE, 2015.
- [5] FORSTER C, CARLONE L, DELLAERT F, et al. IMU preintegration on manifold for efficient visual-inertial maximum-a-posteriori estimation[C/OL]//Robotics: Science and Systems, 2015 [2020-05-23]. https://files.ifi.uzh.ch/rpg/website/rpg.ifi.uzh.ch/html/docs/RSS15_Forster.pdf.
- [6] QIN T, LI P, SHEN S. VINS-Mono: a robust and versatile monocular visual-inertial state estimator[J]. IEEE Transactions on Robotics, 2018, 34(4): 1004-1020.
- [7] VON GIOI R G, JAKUBOWICZ J, MOREL J M, et al. LSD: a fast line segment detector with a false detection control[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2010, 32(4): 722-732.
- [8] LI Haifeng, HU Zunhe, CHEN Xinwei. PLP-SLAM: a visual SLAM method based on point-line-plane feature fusion[J]. Robot, 2017, 39(2): 214-220, 229. (in Chinese)
- [9] PUMAROLA A, VAKHITOV A. PL-SLAM: real-time monocular visual SLAM with points and lines[C]//IEEE International Conference on Robotics and Automation. Piscataway, USA: IEEE, 2017: 4503-4508.
- [10] KONG X L, WU W Q, ZHANG L L, et al. Tightly-coupled stereo visual-inertial navigation using point and line features[J]. Sensors, 2015, 15(6): 12816-12833.
- [11] HE Yijia, ZHAO Ji, GUO Yue, et al. PL-VIO: tightly-coupled monocular visual-inertial odometry using point and line features[J]. Sensors, 2018, 18(4): 1159.
- [12] LU Y, SONG D. Robust RGB-D odometry using point and line features[C]//IEEE International Conference on Computer Vision. New York: IEEE, 2015: 3934-3942.
- [13] WANG Dan, HUANG Lu, LI Yao. A monocular visual SLAM algorithm based on point-line features[J]. Robot, 2019, 41(3): 392-403. (in Chinese)
- [14] LIU Zhenbin, WEI Shuangfeng, PANG Fan, et al. An improved monocular inertial SLAM based on nonlinear optimization[J/OL]. Science of Surveying and Mapping. [2020-05-23]. http://kns.cnki.net/kcms/detail/11.4415.P.20190916.1346.002.html. (in Chinese)
- [15] MUR-ARTAL R, MONTIEL J M M, TARDÓS J D. ORB-SLAM: a versatile and accurate monocular SLAM system[J]. IEEE Transactions on Robotics, 2015, 31(5): 1147-1163.
- [16] BARTOLI A, STURM P. The 3D line motion matrix and alignment of line reconstructions[J]. International Journal of Computer Vision, 2004, 57(3): 159-178.
- [17] POLAP D, WLODARCZYK-SIELICKA M. Classification of non-conventional ships using a neural bag-of-words mechanism[J]. Sensors, 2020, 20(6): 1608.
- [18] ZHANG Qixun, LIU Hongzhi, LIU Shixiang. Improved TF-IDF feature selection algorithm based on industry proprietary dictionary[J]. Computer Applications and Software, 2017, 34(7): 277-281. (in Chinese)