
Graduate Student: Hsieh, Hsiang-Yu (謝翔宇)
Title (Chinese): 整合機器視覺與機器手臂快換裝置之虛實整合技術應用於彈性組裝任務學習
Title (English): A Cyber-Physical System Approach with the Integration of Machine Vision and Robotic Tool Changing for Flexible Assembly Task Learning
Advisors: Chen, Syuan-Yi (陳瑄易); Chiang, Hsin-Han (蔣欣翰)
Committee Members: Lin, Cheng-Hung (林政宏); Chiang, Hsin-Han (蔣欣翰); Chen, Syuan-Yi (陳瑄易)
Oral Defense Date: 2022/08/03
Degree: Master
Department: Department of Electrical Engineering
Year of Publication: 2022
Academic Year of Graduation: 110 (ROC calendar)
Language: Chinese
Number of Pages: 55
Keywords (Chinese): 虛實整合, 機器人視覺, 任務學習, 快換系統, 彈性製造
Keywords (English): Cyber-physical systems, machine vision, skill learning, tool changers, flexible manufacturing
Research Method: Experimental design
DOI URL: http://doi.org/10.6345/NTNU202201776
Thesis Type: Academic thesis
    Abstract (translated from the Chinese):
    To cope with today's highly customized and flexible manufacturing processes, production lines must frequently switch between tasks, and the conventional practice of spending large amounts of labor and time tuning robotic arms on the physical machines makes high-efficiency production targets difficult to reach. This thesis therefore proposes a cyber-physical integration technique for robotic arm task learning, in which a high-level task decision-making model is learned through software-based teaching in a virtual environment. The learning algorithm uses a task tree decision-making model for the automatic planning of complex tasks: the model first learns to complete complex assembly tasks from the robotic arm's motions in the virtual environment and then outputs the corresponding action commands to control the robotic arm. The model can make decisions in real time on the arm side, and when a new assembly task appears, the new operation steps can be quickly added to the existing model. The cyber-physical integration technique developed in this thesis provides the following features: step rationality analysis, rapid addition of new tasks, and object characteristic analysis. A support vector machine (SVM) algorithm is introduced to determine an object's placement state and the type of object to be grasped, so that object characteristics are incorporated into the decision-making model; the model then selects a suitable gripper for the object and rapidly exchanges fixtures through the tool changer. For experimental verification, machining and automatic lamp-assembly tasks performed by the robotic arm with the tool changer demonstrate that the proposed cyber-physical integration technique offers a viable solution for flexible manufacturing.
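    The abstract above describes a task-tree decision model that composes primitive arm motions into higher-level tasks and allows new tasks to be appended to an existing model. The thesis text itself is not reproduced in this record, so the following Python sketch is only an illustration of that idea under assumed interfaces; the class names, the robot.send() call, and the example sub-tasks are hypothetical and are not the author's implementation.

        # Illustrative sketch of a task-tree decision model
        # (hypothetical names, not the author's implementation).

        class ActionNode:
            """Leaf node: a primitive robot action such as 'move', 'grip', or 'release'."""
            def __init__(self, name, command):
                self.name = name
                self.command = command          # low-level command sent to the arm

            def execute(self, robot):
                robot.send(self.command)        # robot.send() is an assumed interface


        class TaskNode:
            """Internal node: an ordered sequence of sub-tasks or primitive actions."""
            def __init__(self, name, children=None):
                self.name = name
                self.children = list(children or [])

            def add(self, child):
                # New steps or whole new tasks can be appended to an existing tree,
                # mirroring the "quickly add new tasks" property described above.
                self.children.append(child)
                return self

            def execute(self, robot):
                for child in self.children:
                    child.execute(robot)


        # Hypothetical usage: assemble a lamp by composing previously learned sub-tasks,
        # then extend the existing tree with a new step.
        # pick_base = TaskNode("pick_base", [ActionNode("move", ...), ActionNode("grip", ...)])
        # assemble_lamp = TaskNode("assemble_lamp", [pick_base, insert_bulb, fasten_shade])
        # assemble_lamp.add(TaskNode("new_inspection_step", [...]))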

    Abstract (English):
    Owing to increasingly heavy demands for customized and flexible manufacturing processes, the production line in a factory has to handle a variety of different products from task to task. The traditional approach of deploying many robotic arms, which requires substantial human resources and time-consuming machine tuning, is no longer applicable. To this end, this thesis follows the concept of cyber-physical systems (CPS) to develop a task learning approach that allows the robotic arm to achieve high-level decision-making through learning from demonstration in a virtual environment. The task tree algorithm is employed for the automatic planning of complex tasks, in which a decision-making model is presented to generate complex task sets from a large number of actions. The decision-making model can then export tasks and control the robotic arm to yield the correct operation in real time when different tasks are assigned to the arm, by arranging a new task or new steps within the existing model. The proposed decision-making model can analyze the rationality of model steps, quickly add new tasks, and perform object analysis. A support vector machine (SVM) algorithm is used to identify the state of an object and a suitable gripper for it. By considering the characteristics of the relevant object, the decision-making model can select a suitable gripper and switch between grippers quickly. The decision tree algorithm can be applied to complex tasks and can thus replace expert systems used for adjustment. The conducted experiments cover a machining task and an assembly task to investigate the capability of the proposed system as a feasible solution for flexible manufacturing.
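    The abstract also states that an SVM classifies an object's state and the gripper suited to it, after which the tool changer swaps grippers. As a minimal sketch only, assuming scikit-learn and hypothetical hand-crafted vision features and labels (none of which appear in this record), the classification and gripper-selection step might look like this:

        # Minimal sketch of SVM-based object-state / gripper selection
        # (scikit-learn assumed; features, labels, and tool_changer are hypothetical).
        import numpy as np
        from sklearn.svm import SVC
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        # Hypothetical training data: each row is a feature vector from the vision
        # module (e.g. object size, aspect ratio, orientation in degrees), and each
        # label names the gripper suited to that object / placement state.
        X_train = np.array([
            [40.0, 1.1, 0.0],    # small upright part    -> two-finger gripper
            [42.0, 1.0, 90.0],   # small part lying flat -> two-finger gripper
            [120.0, 3.5, 0.0],   # long thin part        -> vacuum gripper
            [118.0, 3.3, 45.0],
        ])
        y_train = ["two_finger", "two_finger", "vacuum", "vacuum"]

        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
        clf.fit(X_train, y_train)

        def select_gripper(features):
            """Return the gripper label predicted for one detected object."""
            return clf.predict(np.array([features]))[0]

        # A (hypothetical) tool-changer call would then mount the predicted gripper:
        # tool_changer.mount(select_gripper([119.0, 3.4, 10.0]))
        print(select_gripper([41.0, 1.05, 15.0]))   # expected: "two_finger"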

    Table of Contents:
    Acknowledgements i
    Abstract (Chinese) ii
    Abstract (English) iii
    Table of Contents v
    List of Tables viii
    List of Figures ix
    Chapter 1  Introduction 1
      1.1 Research Background 1
      1.2 Research Motivation and Objectives 2
      1.3 Literature Review 4
        1.3.1 Tool Changer Systems 4
        1.3.2 Development of Robotic Arm Applications 5
        1.3.3 Cyber-Physical Integration 9
      1.4 Thesis Organization 10
    Chapter 2  Robotic Arm System Architecture and Equipment 11
      2.1 Overview of the Experimental Platform 11
      2.2 Hardware 12
        2.2.1 Robotic Arm 12
        2.2.2 End Effector 13
        2.2.3 Tool Changer 14
        2.2.4 Data Acquisition Module 16
      2.3 Robotic Arm System 17
        2.3.1 Vision Module 17
        2.3.2 System Composition and Signal Transmission 18
        2.3.3 Human-Machine Interface Design 18
      2.4 Cyber-Physical Integration Architecture Design 20
        2.4.1 Overview of the V-REP Software 20
        2.4.2 Simulation Environment Setup 20
        2.4.3 V-REP and Python Integration Design 22
        2.4.4 Cyber-Physical System Architecture 22
    Chapter 3  Research Content and Methods 24
      3.1 Object Detection 24
      3.2 Robotic Arm Kinematics 28
        3.2.1 Robot Kinematic Modeling and DH Parameter Representation 28
        3.2.2 Forward Kinematics Analysis 29
        3.2.3 Inverse Kinematics Analysis 31
      3.3 Motion Design and Control 34
      3.4 High-Level Task Learning with Task Trees for the Robotic Arm 38
    Chapter 4  Experimental Results and Analysis 41
      4.1 Analysis of SVM Model Recognition Performance 41
      4.2 Analysis of Task Tree Action Composition 43
      4.3 Experimental Results 44
        4.3.1 Robotic Arm Milling 44
        4.3.2 Lamp Assembly and Disassembly 48
    Chapter 5  Conclusions and Future Work 52
      5.1 Conclusions 52
      5.2 Future Work 52
    References 53

    [1] C. Cortes and V. Vapnik, "Support-vector networks," Machine learning, vol. 20, no. 3, pp. 273-297, 1995.
    [2] L. Breiman, J. H. Friedman, R. A. Olshen, and C. J. Stone, Classification and regression trees. Routledge, 2017.
    [3] KOSMEK Corporation, Robotic Hand Changer, http://www.kosmek.com/, May 1999.
    [4] B.-S. Ryuh, S. M. Park, and G. R. Pennock, "An automatic tool changer and integrated software for a robotic die polishing station," Mechanism and Machine Theory, vol. 41, no. 4, pp. 415-432, 2006.
    [5] Z. Lianzhong and W. Li, "Machining center automatic ATC analysis and research," in 2010 3rd International Conference on Information Management, Innovation Management and Industrial Engineering, 2010, vol. 2: IEEE, pp. 355-358.
    [6] J. P. Rogelio and R. G. Baldovino, "Development of an automatic tool changer (ATC) system for the 3-axis computer numerically-controlled (CNC) router machine: Support program for the productivity and competitiveness of the metals and engineering industries," in 2014 International Conference on Humanoid, Nanotechnology, Information Technology, Communication and Control, Environment and Management (HNICEM), 2014: IEEE, pp. 1-5.
    [7] C. Obreja, G. Stan, D. Andrioaia, and M. Funaru, "Design of an automatic tool changer system for milling machining centers," in Applied Mechanics and Materials, 2013, vol. 371: Trans Tech Publ, pp. 69-73.
    [8] Schematic diagram of a CNC tool changer, retrieved from https://shop.stepcraft-systems.com/Tool-Changers
    [9] ATI QC-310 tool changer, retrieved from https://www.newequipment.com/plant-operations/article/22059508/electric-cars-power-growth-for-robotic-tool-changers
    [10] Pepper-picking robot, retrieved from https://www.nbcnews.com/mach/science/new-pepper-picking-robot-isn-t-fast-it-can-work-ncna950846
    [11] da Vinci surgical robotic arm, retrieved from https://www.cgh.org.tw/ec99/rwd1320/category.asp?category_id=139
    [12] Bomb-disposal robot, retrieved from https://www.popsci.com/police-used-bomb-disposal-robot-to-kill-dallas-shooting-suspect/
    [13] P. Nerakae, P. Uangpairoj, and K. Chamniprasart, "Using machine vision for flexible automatic assembly system," Procedia Computer Science, vol. 96, pp. 428-435, 2016.
    [14] E. N. Malamas, E. G. M. Petrakis, M. Zervakis, L. Petit, and J. D. Legat, "A survey on industrial vision systems, applications and tools," Image and Vision Computing, vol. 21, pp. 171-188, 2003.
    [15] F. Cheng, "Robot manipulation of 3D cylindrical objects with a robot-mounted 2D vision camera," in 2017 Computing Conference, 2017: IEEE, pp. 192-199.
    [16] P.-J. Hwang, C.-C. Hsu, P.-Y. Chou, W.-Y. Wang, and C.-H. Lin, "Vision-Based Learning from Demonstration System for Robot Arms," Sensors, vol. 22, no. 7, p. 2678, 2022.
    [17] T. Yu et al., "One-shot imitation from observing humans via domain-adaptive meta-learning," arXiv preprint arXiv:1802.01557, 2018.
    [18] K. French, S. Wu, T. Pan, Z. Zhou, and O. C. Jenkins, "Learning behavior trees from demonstration," in 2019 International Conference on Robotics and Automation (ICRA), 2019: IEEE, pp. 7791-7797.
    [19] Z. Lončarević, A. Gams, and A. Ude, "Robot skill learning in latent space of a deep autoencoder neural network," Robotics and Autonomous Systems, vol. 135, p. 103690, 2021.
    [20] F. Tao, Q. Qi, L. Wang, and A. Nee, "Digital twins and cyber–physical systems toward smart manufacturing and industry 4.0: Correlation and comparison," Engineering, vol. 5, no. 4, pp. 653-661, 2019.
    [21] N. Nikolakis, V. Maratos, and S. Makris, "A cyber physical system (CPS) approach for safe human-robot collaboration in a shared workplace," Robotics and Computer-Integrated Manufacturing, vol. 56, pp. 233-243, 2019.
    [22] Denso industrial six-axis robotic arm, retrieved from https://www.denso-wave.com/en/robot/product/five-six/vs.html
    [23] CHELIC Co., Ltd. (台灣氣立股份有限公司), https://www.chelic.com/website/tw/index.html
    [24] 廖俐智, "結合機器視覺之工業用機械手臂夾爪快換系統研製" (Development of a Quick-Change Gripper System Integrated with Machine Vision for Industrial Robotic Arms), Master's thesis, Department of Electrical Engineering, Fu Jen Catholic University, December 2019 (in Chinese).
    [25] USB-4750 data acquisition module, retrieved from https://iotmart.advantech.com.tw/Data-Acquisition-Control/Data-Acquisition-Control-Card-USB-Interface-DAQ/model-USB-4750-BE.htm
    [26] V-REP virtual environment, https://www.coppeliarobotics.com/
    [27] J. J. Craig, Introduction to robotics: mechanics and control. Pearson Educacion, 2005.
    [28] S. Asif and P. Webb, "Kinematics analysis of 6-DoF articulated robot with spherical wrist," Mathematical Problems in Engineering, vol. 2021, 2021.
