
Graduate Student: 李柏毅 (Lee, Bor-Yi)
Thesis Title: 基於身體座標點與矩陣轉換之注視點估計研究 (Eye Gaze Tracking Based on Body Coordinate Points and Matrix Transformation)
Advisor: 李忠謀 (Lee, Chung-Mou)
Committee Members: 李忠謀 (Lee, Chung-Mou), 江政杰 (Chiang, Cheng-Chieh), 劉寧漢 (Liu, Ning-Han), 蔣宗哲 (Chiang, Tsung-Che)
Oral Defense Date: 2024/07/30
Degree: Master
Department: Department of Computer Science and Information Engineering
Year of Publication: 2024
Academic Year of Graduation: 112
Language: Chinese
Number of Pages: 43
Chinese Keywords: Homography transformation matrix, rigid transformation, machine learning, gaze estimation
English Keywords: Homography transformation, Rigid transformation, Machine learning, Gaze estimation
Research Method: Experimental design
DOI URL: http://doi.org/10.6345/NTNU202401221
Thesis Type: Academic thesis
Access Statistics: Views: 209; Downloads: 0

Acknowledgements i
Chinese Abstract ii
Abstract iii
Table of Contents iv
List of Figures vi
List of Tables viii
Chapter 1 Introduction 1
1.1 Research Motivation 1
1.2 Research Objectives 2
1.3 Scope and Limitations 2
1.4 Thesis Organization 2
Chapter 2 Literature Review 3
2.1 Pupil Detection Methods 3
2.1.1 Template Matching 3
2.1.2 Feature-Based Approaches 3
2.2 Gaze Estimation Methods 4
2.2.1 Corneal-Reflection-Based Methods 4
2.2.2 Feature-Extraction-Based Methods 6
2.2.3 Appearance-Based Methods 6
Chapter 3 Research Methods 8
3.1 Research Framework 8
3.2 System Calibration 10
3.2.1 Face and Eye Feature Detection 10
3.2.2 Eye-Movement Feature Calibration Based on the Homography Transformation Matrix 12
3.3 Gaze Point Prediction 15
Chapter 4 Experimental Results and Discussion 18
4.1 Experimental Environment 18
4.2 Experiments on the Effect of Body Movement on Gaze Point Prediction 19
4.2.1 Experiment 1: Image Sample Size and Gaze Region Prediction 19
4.2.2 Experiment 2: Number of Feature Points and Machine Learning Model Analysis 21
4.2.3 Experiment 3: Effect of Body Movement on Gaze Point Prediction 27
4.3 Analysis of Gaze Region Prediction 33
4.4 Algorithm Evaluation 36
Chapter 5 Conclusions and Future Work 39
5.1 Conclusions 39
5.2 Applications 39
5.3 Future Work 40
References 41
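Section 3.2.2 of the table of contents describes calibrating eye-movement features with a homography transformation matrix, i.e., mapping observed eye-feature coordinates to screen coordinates from a small set of calibration fixations. The thesis's actual implementation is not reproduced here; the following is only a minimal sketch of the general technique, using a direct linear transform (DLT) in NumPy with made-up calibration points:

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate a 3x3 homography H mapping src points to dst points
    via the direct linear transform (needs >= 4 correspondences)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null-space vector of A (last right singular vector).
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalize so H[2,2] == 1

def apply_homography(H, pts):
    """Map 2D points through H with the homogeneous divide."""
    pts = np.asarray(pts, dtype=float)
    homog = np.hstack([pts, np.ones((len(pts), 1))])
    mapped = (H @ homog.T).T
    return mapped[:, :2] / mapped[:, 2:3]

# Hypothetical calibration: four pupil-center positions (image pixels)
# observed while the user fixates the four corners of a 1920x1080 screen.
eye_pts    = [(310, 240), (350, 238), (312, 268), (352, 266)]
screen_pts = [(0, 0), (1920, 0), (0, 1080), (1920, 1080)]

H = estimate_homography(eye_pts, screen_pts)
gaze = apply_homography(H, [(331, 253)])  # screen estimate for a new pupil position
```

With exactly four correspondences the homography fits the calibration points exactly; with more fixations the same SVD yields a least-squares estimate, which is how such calibration is usually made robust to noise.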

[1] A. Kar and P. Corcoran, “A review and analysis of eye-gaze estimation systems, algorithms and performance evaluation methods in consumer platforms,” IEEE Access, vol. 5, pp. 16495-16519, August 2017.
[2] A. Villanueva, J. J. Cerrolaza, and R. Cabeza, “Geometry issues of gaze estimation,” in Advances in Human Computer Interaction, InTech, 2008.
[3] B. P. Johnson, J. A. Lum, N. J. Rinehart, and J. Fielding, “Ocular motor disturbances in autism spectrum disorders: Systematic review and comprehensive meta-analysis,” Neuroscience & Biobehavioral Reviews, vol. 69, pp. 260-279, 2016.
[4] D. Hu, H. Qin, H. Liu, and S. Zhang, “Gaze Tracking Algorithm Based on Projective Mapping Correction and Gaze Point Compensation in Natural Light,” in Proceedings of IEEE International Conference on Control and Automation, ICCA, pp. 1150-1155, Edinburgh, Scotland, July 2019.
[5] E. D. Guestrin and M. Eizenman, “General theory of remote gaze estimation using the pupil center and corneal reflections,” IEEE Transactions on Biomedical Engineering, vol. 53, no. 6, pp. 1124-1133, 2006.
[6] F. Lu, T. Okabe, Y. Sugano, and Y. Sato, “Learning gaze biases with head motion for head pose-free gaze estimation,” Image and Vision Computing, vol. 32, no. 3, pp. 169-179, United Kingdom, January 2014.
[7] H. Balim, S. Park, X. Wang, X. Zhang, and O. Hilliges, “EFE: End-to-end Frame-to-Gaze Estimation,” in Proceedings of the 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), L. O’Conner, Ed., pp. 2688-2697, 2023.
[8] I. Mitsugami, N. Ukita, and M. Kidode, “Robot navigation by eye pointing,” in Lecture Notes in Computer Science, vol. 3711, p. 256, 2005.
[9] J. Lee and G. Kim, “Robust estimation of camera homography using fuzzy RANSAC,” in ICCSA 2007, LNCS 4705, Part I, O. Gervasi and M. Gavrilova, Eds., pp. 992-1002, 2007.
[10] L. Sesma, A. Villanueva, and R. Cabeza, “Evaluation of pupil center-eye corner vector for gaze estimation using a web cam,” in Proceedings of the ETRA ‘12: Eye Tracking Research and Applications, Stuttgart, Germany, 28 June 2012.
[11] M. Boyle, “The Effects of Capture Conditions on the CAMSHIFT Face Tracker,” Technical Report, Department of Computer Science, University of Calgary, 2001.
[12] M. Fischler and R. Bolles, “Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography,” Commun. ACM, vol. 24, no. 6, pp. 381-395, 1981.
[13] P. Besl and N. D. McKay, “A method for registration of 3-D shapes,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 14, no. 2, pp. 239-256, Feb. 1992.
[14] P. Viola and M. J. Jones, “Robust Real-Time Face Detection,” International Journal of Computer Vision, vol. 57, pp. 137-154, 2004.
[15] R. Lienhart and J. Maydt, “An Extended Set of Haar-like Features for Rapid Object Detection,” in Proc. International Conference on Image Processing, pp. 900-903, 2002.
[16] S. J. Pundlik, D. L. Woodard, and S. T. Birchfield, “Non-Ideal Iris Segmentation Using Graph Cuts,” presented at the IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, 2008.
[17] S. Kawato and N. Tetsutani, “Detection and tracking of eyes for gaze-camera control,” in 15th International Conference on Vision Interface, pp. 1031-1038, 2004.
[18] S. Park, E. Aksan, X. Zhang, and O. Hilliges, “Towards end-to-end video-based eye-tracking,” in Proc. of Springer ECCV, 2020.
[19] S. V. Sheela and P. A. Vijaya, “Mapping Functions in Gaze Tracking,” International Journal of Computer Applications, vol. 26, no. 3, pp. 36-42, July 2011.
[20] S. Y. Gwon, C. W. Cho, H. C. Lee, W. O. Lee, and K. R. Park, “Robust eye and pupil detection method for gaze tracking,” International Journal of Advanced Robotic Systems, vol. 10, no. 2, p. 98, 2013.
[21] W. Liu, D. Anguelov, D. Erhan, C. Szegedy, S. Reed, C.-Y. Fu, and A. C. Berg, “SSD: Single shot multibox detector,” in Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 9905 LNCS, pp. 21-37, 2016.
[22] X. Wang, K. Liu, and X. Qian, “A survey on gaze estimation,” in Proc. Int. Conf. Intell. Syst. Knowl. Eng., pp. 260-267, 2015.
[23] Y. Cheng, F. Lu, and X. Zhang, “Appearance-Based Gaze Estimation via Evaluation-Guided Asymmetric Regression,” in Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 11218 LNCS, pp. 105-121, Munich, Germany, October 2018.
[24] 陳學志, 賴惠德, and 邱發忠, “Applications of eye-tracking technology in learning and education,” Journal of Research in Education Sciences, vol. 55, no. 4, pp. 39-68, 2010.

Electronic full text embargoed until 2029/08/02.