
Graduate Student: 林士勛
Thesis Title: 以模糊理論為基礎應用超音波感測器之未知環境地圖建置
(Map Building of Unknown Environment Based on Fuzzy Sensor Fusion of Ultrasonic Ranging Data)
Advisors: 許陳鑑 (Hsu, Chen-Chien) and 洪欽銘 (Hong, Chin-Ming)
Degree: Master
Department: 工業教育學系 (Department of Industrial Education)
Year of Publication: 2012
Graduation Academic Year: 100 (ROC calendar, 2011-2012)
Language: Chinese
Number of Pages: 75
Keywords (Chinese): 地圖建構、模糊邏輯、超音波、感測器模型、感測器融合、不確定性、機器人、導航
Keywords (English): Map building, fuzzy logic, ultrasonic, sensor model, sensor fusion, uncertainty, robot, navigation
Document Type: Academic thesis
    This thesis uses the ultrasonic sensors mounted on a Pioneer3-DX two-wheeled mobile robot to sense its surroundings, and builds a map of an unknown indoor environment, presented as an occupancy grid, from the collected sensor data. Because the environmental measurements gathered by ultrasonic sensors carry uncertainty, this thesis proposes a fuzzy data-fusion method: an ultrasonic sensor model derived from experimental results is used to overcome the angular uncertainty of ultrasonic sensing and the range misjudgments caused by multiple reflections, and the sensor data are then fused through fuzzy logic operations. As the robot moves, the information in the map's grid cells is continually computed and updated, eventually yielding a complete occupancy grid map of the obstacle environment, which a mobile robot can use as a basis for localization, navigation, or path planning, increasing its capacity for autonomous operation. Finally, environment maps of parts of the campus, namely the University's College of Technology and the corridor outside the Evolutionary Control Laboratory, are built with the robot; the experimental results confirm the feasibility of the unknown-environment maps constructed by the proposed fuzzy data-fusion method.

    This thesis investigates the use of ranging data collected from the ultrasonic sensors mounted on a two-wheel mobile robot, Pioneer3-DX, to build occupancy grid maps of an unknown indoor environment based on fuzzy sensor fusion. Because of the uncertainties inevitably encountered when using ultrasonic sensors, a more reliable sensor model is designed to solve the problems of angular uncertainty and multiple reflections. To address the problems due to measurement uncertainties of the ultrasonic sensors, a fuzzy logic approach is proposed to construct the grid map, where the information in the grids is continually computed and updated through fuzzy logic operations. Once the environmental map is obtained, where every grid in the map represents the possibility of occupancy, it can be used for localization, navigation, or path planning to strengthen the autonomy of mobile robots. To validate the feasibility of the proposed approach, we also conduct experiments to build maps in the Technology Building of the University.
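
    To make the method outlined in the abstracts concrete, the sketch below shows one common way such a fuzzy sonar fusion can be organized: each range reading spreads "empty" and "occupied" membership over the grid cells inside the beam cone, with belief fading toward the cone edge to model the sonar's angular uncertainty, and evidence from successive readings is accumulated with a fuzzy union. This is a minimal illustrative sketch only; the beam width, membership shapes, and all names (fuse_reading, s_norm, ARC_THICKNESS) are assumptions for exposition, not the thesis's actual implementation.

```python
# Minimal sketch: fuzzy fusion of sonar range readings into an occupancy grid.
# Beam parameters and membership shapes are illustrative assumptions.
import numpy as np

BEAM_HALF_ANGLE = np.radians(12.5)  # assumed ~25 deg Polaroid-style sonar cone
ARC_THICKNESS = 0.10                # assumed depth of the "occupied" arc (m)

def s_norm(a, b):
    """Algebraic-sum fuzzy union, a common s-norm for accumulating evidence."""
    return a + b - a * b

def fuse_reading(occ, emp, pose, r, cell_size=0.05):
    """Fuse one sonar reading of range r (m), taken from pose (x, y, heading),
    into the 'occupied' (occ) and 'empty' (emp) membership grids."""
    x0, y0, heading = pose
    rows, cols = occ.shape
    ys, xs = np.mgrid[0:rows, 0:cols]
    cx = (xs + 0.5) * cell_size            # cell centers in world coordinates
    cy = (ys + 0.5) * cell_size
    d = np.hypot(cx - x0, cy - y0)         # distance of each cell from sensor
    ang = np.arctan2(cy - y0, cx - x0) - heading
    ang = np.arctan2(np.sin(ang), np.cos(ang))   # wrap to [-pi, pi]

    # Angular membership: 1 on the beam axis, falling to 0 at the cone edge.
    # This is how the sketch models the sonar's angular uncertainty.
    mu_ang = np.clip(1.0 - np.abs(ang) / BEAM_HALF_ANGLE, 0.0, 1.0)

    # Cells well inside the measured range are evidence of "empty";
    # cells near the measured range are evidence of "occupied".
    mu_emp = mu_ang * np.clip((r - ARC_THICKNESS - d) / max(r, 1e-6), 0.0, 1.0)
    mu_occ = mu_ang * np.clip(1.0 - np.abs(d - r) / ARC_THICKNESS, 0.0, 1.0)

    occ[:] = s_norm(occ, mu_occ)           # accumulate evidence across readings
    emp[:] = s_norm(emp, mu_emp)

# Example: a 4 m x 4 m map, one 1.5 m reading taken diagonally from the corner.
occ = np.zeros((80, 80))
emp = np.zeros((80, 80))
fuse_reading(occ, emp, pose=(0.0, 0.0, np.radians(45)), r=1.5)
# A final map can combine both grids, e.g. "occupied and not empty":
grid_map = np.minimum(occ, 1.0 - emp)
```

    The algebraic-sum union makes the update order-independent, so repeated observations of an obstacle strengthen rather than overwrite the accumulated evidence; a reading that the sensor model flags as a multiple-reflection artifact would simply be skipped before this fusion step.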

    Acknowledgments ... i
    Chinese Abstract ... ii
    English Abstract ... iii
    Table of Contents ... v
    List of Figures ... vii
    List of Tables ... x
    Chapter 1  Introduction ... 1
      1.1  Research Background and Motivation ... 1
      1.2  Research Objectives ... 3
      1.3  Research Limitations and Methods ... 4
      1.4  Thesis Organization ... 4
      1.5  Research Procedure ... 5
    Chapter 2  Literature Review ... 7
      2.1  Overview of Map Building ... 7
      2.2  Environment Map Building with Ultrasonic Sensors ... 11
    Chapter 3  Fuzzy Theory ... 14
      3.1  Overview ... 14
      3.2  Basic Concepts of Fuzzy Theory ... 16
      3.3  Operations on Fuzzy Sets ... 17
      3.4  Fuzzy Control Systems ... 20
    Chapter 4  Ultrasonic Sensors ... 28
      4.1  Ultrasonic Distance Measurement ... 28
      4.2  Ultrasonic Sensor Model ... 33
    Chapter 5  Environment Map Building Using Fuzzy Theory ... 37
      5.1  System Architecture ... 37
      5.2  Fuzzification of Ultrasonic Inputs ... 39
      5.3  Fuzzy Inference and Rule Base ... 41
      5.4  Grid Map Updating ... 43
    Chapter 6  Experimental Results and Analysis ... 45
      6.1  System Development Environment ... 45
      6.2  Observation of Grid Updates ... 47
      6.3  Environment Map Building ... 51
      6.4  Discussion and Analysis ... 62
    Chapter 7  Conclusions and Future Work ... 66
      7.1  Conclusions ... 66
      7.2  Future Work ... 67
    References ... 68

    [1] Mitsubishi Electric Taiwan, http://www.mitsubishielectric.com.tw
    [2] Coverage of the 2005 Aichi World Expo, Japan, http://photo.dayoo.com/gb/content/2005-03/28/content_1990199.htm
    [3] 蘇州博實機器人技術有限公司 (Suzhou Boshi Robot Technology Co., Ltd.), http://www.bsrobot.com.cn
    [4] EE Times Taiwan (電子工程專輯), http://www.eettaiwan.com
    [5] Toyota Motor Corporation, http://www.toyota-global.com/innovation/partner_robot
    [6] Robot World Information Network (機器人世界情報網), http://www.robotworld.org.tw/index.htm
    [7] iRobot, http://store.irobot.com/home/index.jsp
    [8] 網昱多媒體, http://swf.com.tw
    [9] IEEE Spectrum, http://spectrum.ieee.org
    [10] M. R. Kabuka and A. E. Arenas, “Position Verification of a Mobile Robot Using Standard Pattern,” IEEE Journal of Robotics and Automation, vol. 3, no. 6, pp. 505-516, 1987.
    [11] A. Kosaka and A. C. Kak, “Fast Vision-Guided Mobile Robot Navigation Using Model-Based Reasoning and Prediction of Uncertainties,” Computer Vision, Graphics, and Image Processing—Image Understanding, vol. 56, no. 3, pp. 271-329, 1992.
    [12] S. Atiya and G. D. Hager, “Real-Time Vision-Based Robot Localization,” IEEE Transactions on Robotics and Automation, vol. 9, pp. 785-800, 1993.
    [13] C. C. Tsai, S. M. Hu, H. C. Huang, and S. M. Hsieh, “Fuzzy Hybrid Navigation of an Active Mobile Robotic Assistant: A Multisensory Fusion Approach,” Proceedings of CACS International Automatic Control Conference, Taichung, 2007, pp. 1280-1285.
    [14] H. P. Moravec and A. Elfes, “High Resolution Maps from Wide Angle Sonar,” Proceedings of IEEE International Conference on Robotics and Automation, Missouri, 1985, pp. 116-121.
    [15] S. Thrun, “Learning Metric-Topological Maps for Indoor Mobile Robot Navigation,” Artificial Intelligence, vol. 99, no. 1, pp. 21-71, 1998.
    [16] N. Ayache and O. D. Faugeras, “Maintaining Representations of the Environment of a Mobile Robot,” IEEE Transactions on Robotics and Automation, vol. 5, no. 6, pp. 804-819, 1989.
    [17] S. Se, D. Lowe, and J. Little, “Mobile robot localization and mapping with uncertainty using scale-invariant visual landmarks,” International Journal of Robotics Research, vol. 21, no. 8, pp. 735–758, 2002.
    [18] S. Se, D. G. Lowe, and J. J. Little, “Vision-Based Global Localization and Mapping for Mobile Robots,” IEEE Transactions on Robotics, vol. 21, no. 3, 2005.
    [19] S. Y. Chung and H. P. Huang, “Relative-Absolute Map Filter for Simultaneous Localization and Mapping,” Proceedings of IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, 2006, pp. 436-441.
    [20] M. Meng and A.C. Kak, “NEURO-NAV: A Neural Network Based Architecture for Vision-Guided Mobile Robot Navigation Using Non-Metrical Models of the Environment,” Proceedings of IEEE International Conference on Robotics and Automation, Atlanta, 1993, pp. 750-757.
    [21] 張家瑋, Development of a Vision-Navigated Autonomous Vehicle for Exploring Unknown Indoor Environments (in Chinese), Master's thesis, Department of Electrical Engineering, St. John's University, 2009.
    [22] 陳秉宏, Map Building of Unknown Environments by Fusion of Ultrasonic Sensing Information (in Chinese), Master's thesis, Department of Electrical Engineering, Tamkang University, 2011.
    [23] T. Bailey, E. Nebot, J. Rosenblatt, and H. Durrant-Whyte, “Robust distinctive place recognition for topological maps,” Proceedings of the International Conference on Field and Service Robotics, Pittsburgh, 1999, pp. 347-352.
    [24] B. J. Kuipers and Y.-T. Byun, “A robot exploration and mapping strategy based on a semantic hierarchy of spatial representations,” Robotics and Autonomous Systems, vol. 8, no. 1-2, pp. 47–63, 1991.
    [25] C. Shi, Y. Wang, and J. Yang, “Online topological map building and qualitative localization in large-scale environment,” Robotics and Autonomous Systems, vol. 58, no. 5, pp. 488-496, 2010.
    [26] A. Elfes, “Occupancy grids: A stochastic spatial representation for active robot perception,” Proceedings of the Sixth Annual Conference on Uncertainty in Artificial Intelligence, New York, 1990, pp. 136-146.
    [27] S. Thrun, “Learning occupancy grid maps with forward sensor models,” Autonomous Robots, vol. 15, no. 2, pp. 111-127, 2003.
    [28] K. S. Chong and L. Kleeman, “Mobile robot map building from an advanced sonar array and accurate odometry,” International Journal of Robotics Research, vol. 18, no. 1, pp. 20-36, 1999.
    [29] J. J. Leonard, H. F. Durrant-Whyte, and I. J. Cox, “Dynamic map building for an autonomous mobile robot,” International Journal of Robotics Research, vol. 11, no. 4, pp. 286-297, 1992.
    [30] A. Elfes, Occupancy Grids: A Probabilistic Framework for Robot Perception and Navigation, Ph.D. dissertation, Carnegie Mellon University, 1989.
    [31] S. J. Lee, Y. Lee, J.-H. Lim, C.-U. Kang, D.-W. Cho, W.-K. Chung, and W. S. Yun, “Evaluation of Features through Grid Association for Building a Sonar Map,” Proceedings of IEEE International Conference on Robotics and Automation, Orlando, 2006, pp. 2615-2620.
    [32] H. M. Wang, Z. G. Hou, J. Ma, Y. C. Zhang, Y. Q. Zhang, and M. Tan, “Sonar Feature Map Building for a Mobile Robot,” Proceedings of IEEE International Conference on Robotics and Automation, Rome, 2007, pp. 4152–4157.
    [33] D. Kortenkamp and T. Weymouth, “Topological Mapping for Mobile Robots Using a Combination of Sonar and Vision Sensing,” Proceedings of the Twelfth National Conference on Artificial Intelligence, Seattle, 1994, pp. 979-984.
    [34] J. Modayil, P. Beeson, and B. Kuipers, “Using the topological skeleton for scalable global metrical map-building,” Proceedings of IEEE/RSJ International Conference on Intelligent Robots and Systems, Japan, 2004, pp. 1530-1536.
    [35] G. Welch and G. Bishop, An Introduction to the Kalman Filter, Technical Report TR 95-041, University of North Carolina at Chapel Hill, 1995.
    [36] Y. Zou, Y. K. Ho, C. S. Chua, and X. W. Zhou, “Multi-ultrasonic sensor fusion for mobile robots,” Proceedings of the IEEE Intelligent Vehicles Symposium, Dearborn, 2000, pp. 387-391.
    [37] P. Sykacek and I. Rezek, Markov Chain Monte Carlo Methods for Bayesian Sensor Fusion, University of Oxford, 2000.
    [38] H. M. Barbera, A. G. Skarmeta, M. Z. Izquierdo, and J. B. Blaya, “Neural Networks for Sonar and Infrared Sensors Fusion,” Proceedings of the 3rd International Conference on Information Fusion, Paris, 2000, pp. 18-25.
    [39] M. Lopez, F. J. Rodriguez, and J. C. Corredra, “Fuzzy Reasoning for Multisensor Management,” IEEE International Conference on Systems, Man and Cybernetics, Canada, 1995, pp. 1398-1403.
    [40] Sv. Noykov and Ch. Roumenin, “Occupancy grids building by sonar and mobile robot,” Robotics and Autonomous Systems, vol. 55, no. 2, pp. 162-175, 2007.
    [41] M. A. Lanthier, D. Nussbaum, and A. Sheng, “Improving vision-based maps by using sonar and infrared data,” Proceedings of the IASTED International Conference on Robotics and Applications, Honolulu, 2004, pp. 118-123.
    [42] M. Kam, X. Zhu, and P. Kalata, “Sensor fusion for mobile robot navigation,” Proceedings of the IEEE, vol. 85, no. 1, pp. 108-119, 1997.
    [43] T. Wilhelm, H. J. Bohme, and H. M. Gross, “Sensor Fusion for Vision and Sonar Based People Tracking on a Mobile Service Robot,” Proceedings of International Workshop on Dynamic Perception, Bochum, 2002, pp. 315-320.
    [44] F. Wallner and R. Dillmann, “Real-time map refinement by use of sonar and active stereo-vision,” Robotics and Autonomous Systems, vol. 16, no. 1, pp. 47-56, 1995.
    [45] M. R. Asharif, B. Moshiri, and R. HoseinNezhad, “Sensor fusion by pseudo information measure: A mobile robot application,” ISA Transactions, vol. 41, no. 3, pp. 283-301, 2002.
    [46] J. W. M. Van Dam, B. J. A. Krose, and F. C. A. Groen, “Neural network Applications in Sensor Fusion for An Autonomous Mobile Robot,” Proceedings of Reasoning with Uncertainty in Robotics, Amsterdam, 1995, pp. 263-278.
    [47] C. Martin, E. Schaffernicht, A. Scheidig, and H. M. Gross, “Multi-modal sensor fusion using a probabilistic aggregation scheme for people detection and tracking,” Robotics and Autonomous Systems, vol. 54, no. 9, pp. 721-728, 2006.
    [48] J. J. Leonard and H. F. Durrant-Whyte, “Simultaneous Map Building and Localization for an Autonomous Mobile Robot,” Proceedings of the IEEE/RSJ International Workshop on Intelligent Robots and Systems (IROS), Osaka, 1991, pp. 1442-1447.
    [49] M. W. M. G. Dissanayake, P. Newman, S. Clark, H. F. Durrant-Whyte, and M. Csorba, “A Solution to the Simultaneous Localization and Map Building (SLAM) Problem,” IEEE Transactions on Robotics and Automation, vol. 17, no. 3, pp. 229-241, 2001.
    [50] T. Bailey and H. Durrant-Whyte, “Simultaneous Localization and Mapping (SLAM): Part I, The Essential Algorithms,” IEEE Robotics and Automation Magazine, vol. 13, no. 2, pp. 99-110, 2006.
    [51] L. F. Gao, Y. X. Gai, and S. Fu, “Simultaneous Localization and Mapping for Autonomous Mobile Robots Using Binocular Stereo Vision System,” Proceedings of IEEE International Conference on Mechatronics and Automation, Harbin, 2007, pp. 326-330.
    [52] S. Datta, D. Banerji, and R. Mukherjee, “Mobile Robot Localization with Map Building and Obstacle Avoidance for Indoor Navigation,” Proceedings of IEEE International Conference on Industrial Technology, Bombay, 2006, pp. 2535-2540.
    [53] G. Oriolo and G. Ulivi, “Real Time Map Building and Navigation for Autonomous Robots in Unknown Environments,” IEEE Transactions on Systems, Man and Cybernetics, vol. 28, no. 3, pp. 316-332, 1998.
    [54] 林于琬, A Study of Map Building for Mobile Robots Using Ultrasonic Sensors (in Chinese), Master's thesis, Institute of Engineering Science, National Cheng Kung University, 2005.
    [55] 陳柏昌, Environment Map Building for Robots Using Ultrasonic Sensors (in Chinese), Ph.D. dissertation, Institute of Engineering Science, National Cheng Kung University, 2007.
    [56] A. C. Plascencia and J. D. Bendtsen, “Sensor Fusion Map Building-Based on Fuzzy Logic Using Sonar and SIFT Measurements,” Proceedings of the IEEE Conference on Soft Computing in Industrial Applications, Japan, 2008, pp. 13-22.
    [57] G. Benet, M. Martínez, F. Blanes, P. Pérez, and J. E. Simó, “Differentiating walls from corners using the amplitude of ultrasonic echoes,” Robotics and Autonomous Systems, vol. 50, no. 1, pp. 13-25, 2005.
    [58] J. Gasos and A. Martin, “A fuzzy approach to build sonar maps for mobile robots,” Computers in Industry, vol. 32, no. 2, pp. 151-167, 1996.
    [59] L. A. Zadeh, “Fuzzy sets,” Information and Control, vol. 8, no. 3, pp. 338-353, 1965.
    [60] 林信成 and 彭啟峰, Oh! Fuzzy: An Analysis of Fuzzy Theory (in Chinese), 第三波資訊股份有限公司, 1994.
    [61] 汪惠健, Fuzzy Theory and Applications (in Chinese), Pearson Education Taiwan, 2006.
    [62] P. N. T. Wells, Biomedical Ultrasonics, Academic Press, New York, 1977.
    [63] B. Barshan and R. Kuc, “Differentiating Sonar Reflections from Corners and Planes by Employing an Intelligent Sensor,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 12, no. 6, pp. 560-569, 1990.
    [64] Polaroid, “Technical Specifications for 6500 Series Sonar Ranging Module,” 1999.
