
Author: Chu, Ling-Yu (朱苓語)
Title: Gaze Estimation Based on Homography Transformation and Support Vector Regression (基於單應性轉換與支持向量回歸之注視預測研究)
Advisor: Lee, Chung-Mou (李忠謀)
Committee Members: Fang, Chiung-Yao (方瓊瑤); Chiang, Cheng-Chieh (江政杰); Yeh, Fu-Hao (葉富豪); Lee, Chung-Mou (李忠謀)
Oral Defense Date: 2021/09/10
Degree: Master
Department: Department of Computer Science and Information Engineering
Publication Year: 2021
Graduation Academic Year: 109
Language: Chinese
Pages: 41
Keywords (Chinese): 瞳孔中心偵測、注視區域預測、單應性轉換矩陣、支持向量回歸
Keywords (English): Gaze estimation, Pupil center detection, Homography transformation matrix, Support Vector Regression
Research Methods: Experimental design; document analysis
DOI URL: http://doi.org/10.6345/NTNU202101341
Thesis Type: Academic thesis
Usage Count: 141 views; 0 downloads
Table of Contents:
    Abstract (Chinese) ... i
    Abstract ... ii
    List of Figures ... v
    List of Tables ... vii
    Chapter 1  Introduction ... 1
        1.1  Research Motivation ... 1
        1.2  Research Objectives ... 2
        1.3  Scope and Limitations ... 2
    Chapter 2  Literature Review ... 3
        2.1  Gaze Estimation Methods ... 3
            2.1.1  Geometric-based Approaches ... 3
            2.1.2  Feature-based Approaches ... 4
            2.1.3  Appearance-based Approaches ... 5
        2.2  Pupil Detection Methods ... 6
            2.2.1  Template Matching Method ... 6
            2.2.2  Feature-based Method ... 7
    Chapter 3  Methodology ... 8
        3.1  System Architecture ... 8
        3.2  Data Preprocessing ... 9
        3.3  Eye-Movement Vector Correction Based on the Homography Transformation Matrix ... 11
            3.3.1  Computing the Pupil Center ... 12
            3.3.2  Establishing Facial Reference Points ... 15
            3.3.3  Computing the Homography Transformation Matrix ... 16
        3.4  Gaze-Point Compensation Model Based on Support Vector Regression ... 18
            3.4.1  Computing Offset Vector Data ... 19
            3.4.2  Support Vector Regression Model ... 20
    Chapter 4  Experiments and Discussion ... 22
        4.1  Data Collection ... 22
        4.2  Pupil Center Detection Experiments ... 23
            4.2.1  Experiment 1: Pupil center detection accuracy on the CASIA database ... 23
            4.2.2  Experiment 2: Pupil center detection accuracy on self-collected subject images ... 25
        4.3  Effect of Head Movement on Gaze Region Prediction ... 26
            4.3.1  Experiment 1: Effect of the number of sampled images on gaze region prediction ... 27
            4.3.2  Experiment 2: Gaze prediction accuracy versus head-to-camera distance ... 28
            4.3.3  Experiment 3: Head movement range and gaze region accuracy ... 31
        4.4  Algorithm Evaluation ... 34
    Chapter 5  Conclusions and Future Work ... 36
        5.1  Conclusions ... 36
        5.2  Applications ... 36
        5.3  Future Work ... 37
    References ... 38
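Per Sections 3.3–3.4 of the table of contents, the thesis maps eye-movement vectors to gaze regions with a homography transformation matrix and then compensates residual gaze-point offsets with a Support Vector Regression model. The sketch below illustrates only the first, homography stage in its generic textbook form (direct linear transform from four calibration correspondences); the function names, the calibration points, and the screen size are hypothetical and not taken from the thesis, and the SVR compensation stage is not shown.

```python
import numpy as np

def estimate_homography(src_pts, dst_pts):
    """Estimate the 3x3 homography H mapping src_pts -> dst_pts
    via the direct linear transform (DLT). Needs >= 4 correspondences,
    no three of which are collinear."""
    rows = []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        # Each correspondence contributes two linear constraints on h.
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    A = np.asarray(rows, dtype=float)
    # The homography vector is the null vector of A (last row of V^T).
    _, _, Vt = np.linalg.svd(A)
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]          # normalize so H[2,2] == 1

def apply_homography(H, pt):
    """Project a 2D point through H using homogeneous coordinates."""
    u, v, w = H @ np.array([pt[0], pt[1], 1.0])
    return u / w, v / w

# Hypothetical calibration: four eye-vector samples (looking at the
# screen corners) mapped to the corners of a 1920x1080 screen.
eye_corners = [(0, 0), (1, 0), (1, 1), (0, 1)]
screen_corners = [(0, 0), (1920, 0), (1920, 1080), (0, 1080)]
H = estimate_homography(eye_corners, screen_corners)
gaze = apply_homography(H, (0.5, 0.5))   # eye vector at calibration center
```

In the thesis's pipeline, the input points would be pupil-center vectors relative to facial reference points (Sections 3.3.1–3.3.2), and the SVR model of Section 3.4 would then correct the systematic offset between this projected point and the true gaze point under head movement.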

    [1] Anonymous. Support vector regression. Online resource, accessed August 26, 2021.
    [2] Ahmed, S., Fadhil, M., & Abdulateef, S. (2020). Enhancing Reading Advancement Using Eye Gaze Tracking. Iraqi Journal for Electrical and Electronic Engineering, sceeer(3d), 59–64, Taipei, Taiwan, January.
    [3] Bay, H., Tuytelaars, T., & Van Gool, L. (2006). SURF: Speeded Up Robust Features. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 3951 LNCS, 404–417, Beijing, China, October.
    [4] Blignaut, P. (2014). Mapping the Pupil-Glint Vector to Gaze Coordinates in a Simple Video-Based Eye Tracker. Journal of Eye Movement Research, 7(1), South Africa, March.
    [5] Chen, S., & Liu, C. (2015). Eye detection using discriminatory Haar features and a new efficient SVM. Image and Vision Computing, 33, 68–77, USA, January.
    [6] Cheng, Y., Lu, F., & Zhang, X. (2018). Appearance-Based Gaze Estimation via Evaluation-Guided Asymmetric Regression. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 11218 LNCS, 105–121, Munich, Germany, October.
    [7] Conte, D., Di Lascio, R., Foggia, P., Percannella, G., & Vento, M. (n.d.). Pupil Localization by a Template Matching Method. Proceedings of the International Conference on Computer Vision Theory and Applications (VISAPP), Italy, January.
    [8] Gwon, S. Y., Cho, C. W., Lee, H. C., Lee, W. O., & Park, K. R. (2017). Robust Eye and Pupil Detection Method for Gaze Tracking. Int. J. Adv. Robot. Syst., vol. 10, no. 2, p. 98, United States, May.
    [9] Hansen, D. W., & Ji, Q. (2010). In the Eye of the Beholder: A Survey of Models for Eyes and Gaze. IEEE Transactions on Pattern Analysis and Machine Intelligence, 32(3), 478–500, United States, March.
    [10] Hennessey, C., & Lawrence, P. (n.d.). A Single Camera Eye-Gaze Tracking System with Free Head Motion. Proceedings of the 2006 Symposium on Eye Tracking Research & Applications - ETRA ’06, March.
    [11] Hu, D., Qin, H., Liu, H., & Zhang, S. (2019). Gaze Tracking Algorithm Based on Projective Mapping Correction and Gaze Point Compensation in Natural Light∗. Proceedings of IEEE International Conference on Control and Automation, ICCA, 2019-July, 1150–1155, Edinburgh, Scotland, July.
    [12] Institute of Automation, Chinese Academy of Science: CASIA v4.0 Iris Image Database, 2018. Accessed 6 April, 2021.
    [13] Kar, A., & Corcoran, P. (2017). A review and analysis of eye-gaze estimation systems, algorithms and performance evaluation methods in consumer platforms. IEEE Access, 5, 16495–16519, August.
    [14] Kawaguchi, T., & Rizon, M. (2003). Iris detection using intensity and edge information. Pattern Recognition, 36(2), 549–562, Madison, Wisconsin, February.
    [15] Lowe, D. G. (2004). Distinctive Image Features from Scale-Invariant Keypoints. International Journal of Computer Vision 2004 60:2, 60(2), 91–110, Netherlands, November.
    [16] Lu, F., Okabe, T., Sugano, Y., & Sato, Y. (2014). Learning gaze biases with head motion for head pose-free gaze estimation. Image and Vision Computing, 32(3), 169–179, United Kingdom, January.
    [17] Mohammadi, M. R., & Raie, A. (2012). Robust pose-invariant eye gaze estimation using geometrical features of iris and pupil images. Proceedings of ICEE 2012 - 20th Iranian Conference on Electrical Engineering, 593–598, Tehran, Iran, September.
    [18] Mohsin, H., & Abdullah, S. H. (2018). Pupil detection algorithm based on feature extraction for eye gaze. Proceedings of 2017 6th International Conference on Information and Communication Technology and Accessibility, ICTA 2017, 2017-December, 1–4, Muscat, Sultanate of Oman, December.
    [19] Park, J., Jung, T., & Yim, K. (2015). Implementation of an eye gaze tracking system for the disabled people. Proceedings of International Conference on Advanced Information Networking and Applications, AINA, 2015-April, 904–908, Gwangju, Korea, March.
    [20] Timm, F., & Barth, E. (2011). Accurate eye centre localisation by means of gradients. VISAPP 2011 - Proceedings of the International Conference on Computer Vision Theory and Application, 125–130, France, January.
    [21] Sheela, S. V., & Vijaya, P. A. (2011). Mapping Functions in Gaze Tracking. International Journal of Computer Applications, 26(3), 36–42, July.
    [22] Valenti, R., & Gevers, T. (2008). Accurate eye center location and tracking using isophote curvature. Proceedings of 26th IEEE Conference on Computer Vision and Pattern Recognition, CVPR, Alaska, USA, June.
    [23] Valenti, R., Staiano, J., Sebe, N., & Gevers, T. (2009). Webcam-based visual gaze estimation. Proceedings of Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 5716 LNCS, 662–671, Italy, September.
    [24] Villanueva, A., & Cabeza, R. (2008). Geometry Issues of Gaze Estimation. Proceedings of Advances in Human Computer Interaction, Beijing, China, October.
    [25] Viola, P., & Jones, M. (2001). Robust real-time face detection. Proceedings of the IEEE International Conference on Computer Vision, 2, 747, Vancouver, Canada, July.
    [26] Wang, X., Liu, K., & Qian, X. (2016). A survey on gaze estimation. Proceedings of The 2015 10th International Conference on Intelligent Systems and Knowledge Engineering, ISKE 2015, 260–267, Taipei, Taiwan, January.
    [27] Wu, Y.-L., Yeh, C.-T., Hung, W.-C., & Tang, C.-Y. (2012). Gaze direction estimation using support vector machine with active appearance model. Multimedia Tools and Applications 2012 70:3, 70(3), 2037–2062, September.
    [28] Zhu, Z., & Ji, Q. (2004). Eye and gaze tracking for interactive graphic display. Machine Vision and Applications 2004 15:3, 15(3), 139–148, Germany, July.
    [29] 辛孟錩. (2013). A single-camera gaze tracking system and applications based on the image structural similarity index. Master's thesis, Department of Electrical Engineering, National Dong Hwa University.
    [30] 李欣芸. (2020). Gaze region analysis based on recurrent neural networks. Master's thesis, Department of Computer Science and Information Engineering, National Taiwan Normal University.
    [31] 林瑞硯. (2011). Real-time eye detection and gaze point analysis using a webcam. Master's thesis, Department of Computer Science and Information Engineering, National Taiwan Normal University.
    [32] 許雅淳. (2013). Gaze determination using a single webcam. Master's thesis, Department of Computer Science and Information Engineering, National Taiwan Normal University.
    [33] 陳美琪. (2014). Real-time gaze region analysis based on a two-level support vector machine. Master's thesis, Department of Computer Science and Information Engineering, National Taiwan Normal University.
    [34] 簡菁怡. (2009). Research and application of an eye-controlled system based on color image recognition. Master's thesis, Department of Electrical Engineering, Southern Taiwan University of Science and Technology.
    [35] 廖英傑. (2006). A gaze tracking algorithm allowing free head movement. Master's thesis, Department of Computer Science and Information Engineering, National Central University.

    Electronic full text embargoed until 2026/09/22.