zbMATH — the first resource for mathematics

Terrain-based vehicle orientation estimation combining vision and inertial measurements. (English) Zbl 1243.68288
Summary: A novel method for estimating vehicle roll, pitch, and yaw using machine vision and inertial sensors is presented, based on matching images captured from an on-vehicle camera to a rendered representation of the surrounding terrain obtained from a three-dimensional (3D) terrain map. U.S. Geological Survey digital elevation maps were used to create a 3D topology map of the geography surrounding the vehicle, and it is assumed in this work that large segments of the surrounding terrain are visible, particularly the horizon lines. The horizon lines seen in the video captured from the vehicle are compared to the horizon lines obtained from the rendered geography, allowing absolute comparisons between the rendered and actual scenes in roll, pitch, and yaw. A kinematic Kalman filter modeling an inertial navigation system then uses the scene matching to generate filtered estimates of orientation. Numerical simulations verify the performance of the Kalman filter, and experiments using an instrumented vehicle operating at the test track of the Pennsylvania Transportation Institute were performed to check the validity of the method. When compared to estimates from a global positioning system/inertial measurement unit (IMU) system, the roll, pitch, and yaw estimates from the vision/IMU Kalman filter agree within $$2\sigma$$ bounds of 0.5, 0.26, and 0.8 deg, respectively.
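The fusion scheme described above can be illustrated with a minimal kinematic Kalman filter: gyro angular rates drive the prediction step, and vision-derived absolute roll/pitch/yaw measurements (from horizon matching) correct the drift in the update step. This is a hypothetical sketch, not the authors' implementation; it assumes small-angle kinematics (so the angles integrate independently) and illustrative noise parameters `q_gyro` and `r_vision`.

```python
import numpy as np

def kf_orientation(gyro_rates, vision_angles, dt,
                   q_gyro=1e-4, r_vision=1e-2):
    """Minimal kinematic Kalman filter fusing gyro rates (prediction)
    with absolute roll/pitch/yaw measurements (update).

    gyro_rates:    sequence of [wx, wy, wz] angular rates (rad/s)
    vision_angles: sequence of [roll, pitch, yaw] measurements (rad),
                   with None where no vision fix is available
    Returns the filtered orientation history as an (N, 3) array.
    """
    x = np.zeros(3)            # state: [roll, pitch, yaw] (rad)
    P = np.eye(3)              # state covariance
    Q = q_gyro * np.eye(3)     # process noise (gyro integration error)
    R = r_vision * np.eye(3)   # measurement noise (horizon matching)
    H = np.eye(3)              # vision observes the angles directly
    history = []
    for w, z in zip(gyro_rates, vision_angles):
        # Predict: integrate angular rates over one time step
        x = x + np.asarray(w) * dt
        P = P + Q
        # Update: correct drift with an absolute vision measurement
        if z is not None:
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + K @ (np.asarray(z) - H @ x)
            P = (np.eye(3) - K @ H) @ P
        history.append(x.copy())
    return np.array(history)
```

With zero gyro rates and a constant vision reading, the estimate converges to the measured orientation; in the paper's setting the same update structure bounds the drift of the integrated inertial solution.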

MSC:
 68T40 Artificial intelligence for robotics
 68T45 Machine vision and scene understanding