Graduate Student: | 張芷瑛 (CHANG, JR-YING) |
---|---|
Thesis Title: | 人形機器人之輕量化單眼視覺羅盤 (Lightweight Monocular Visual Compass for Humanoid Robot) |
Advisor: | 包傑奇 (Baltes, Jacky) |
Oral Defense Committee: | 郭重顯, 許陳鑑, 包傑奇 (Baltes, Jacky) |
Oral Defense Date: | 2021/01/25 |
Degree: | Master |
Department: | Department of Electrical Engineering |
Year of Publication: | 2021 |
Academic Year: | 109 |
Language: | English |
Pages: | 36 |
English Keywords: | humanoid robot, visual compass, monocular camera, lightweight, FIRA HuroCup |
DOI URL: | http://doi.org/10.6345/NTNU202101202 |
Document Type: | Academic thesis |
Humanoid robots have become an active and challenging research field and have played a central role in robotics research and many applications in recent years. Generally, to perform specific tasks in the real world, a robot needs to Estimate its Orientation From the Environment (EOFE). A visual compass (VC) is a typical method for solving the EOFE problem. However, implementing a VC algorithm on a humanoid robot platform faces two limitations. First, the camera used as a sensor is sensitive to strong motion blur. Second, the robot's physical structure restricts the available computing power. The problem is therefore how to run a camera-based visual compass on a humanoid robot with satisfactory performance at low computational cost. We propose a simplified appearance-based visual compass that analyzes correlations in the image's pixel distribution to estimate the robot's current orientation and correct its movement deviation. We also propose a simplified landmark-based visual compass that uses a target object as a reference point in the environment to estimate the movement angle. Both proposed algorithms are computationally light: the HCVC processes 137.737 fps and the DCTVC 166.818 fps. Both also achieve satisfactory accuracy: the HCVC attains a minimum mean-square error (MSE) of 0.20917 degrees and the DCTVC 0.21713 degrees.
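The two ideas in the abstract can be illustrated with a minimal sketch. This is not the thesis's actual HCVC or DCTVC implementation; it is a generic appearance-based compass that correlates per-strip intensity histograms between a reference view and the current view, plus a pinhole-model bearing estimate for a detected landmark. All function names are hypothetical, and the rotation estimate is demonstrated on a synthetic panoramic image for simplicity (a real monocular camera would limit the recoverable shift to its field of view).

```python
import numpy as np

def strip_histograms(image, n_strips=36, n_bins=64):
    """Split a grayscale image into vertical strips and return one
    normalized intensity histogram per strip."""
    h, w = image.shape
    strip_w = w // n_strips
    hists = np.empty((n_strips, n_bins))
    for i in range(n_strips):
        strip = image[:, i * strip_w:(i + 1) * strip_w]
        counts, _ = np.histogram(strip, bins=n_bins, range=(0, 256))
        hists[i] = counts / max(counts.sum(), 1)  # avoid divide-by-zero
    return hists

def estimate_rotation(reference, current, n_strips=36):
    """Appearance-based compass: the heading change is the circular
    strip shift that best correlates the current view's histograms
    with the reference view's histograms."""
    ref_h = strip_histograms(reference, n_strips)
    cur_h = strip_histograms(current, n_strips)
    scores = [np.sum(np.roll(cur_h, s, axis=0) * ref_h)
              for s in range(n_strips)]
    best = int(np.argmax(scores))
    return best * 360.0 / n_strips  # each strip spans 360/n_strips degrees

def landmark_bearing(x_pixel, image_width, hfov_deg):
    """Landmark-based compass: bearing of a detected target relative to
    the optical axis, from its horizontal pixel position and the
    camera's horizontal field of view (linear pinhole approximation)."""
    return (x_pixel - image_width / 2.0) / image_width * hfov_deg
```

For example, rotating a synthetic panorama by three strips (30 columns out of 360) makes `estimate_rotation` report a 30-degree turn, and a landmark detected at pixel 480 in a 640-pixel-wide image with a 60-degree field of view yields a bearing of 15 degrees off-axis.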