| Field | Value |
|---|---|
| Author | 邱奕欽 Chiu, Yi-Chin |
| Thesis title | 個人化眼球模型建構與高速虹膜匹配設計 (Customized Eyeball Model Construction and High Speed Iris Matching) |
| Advisor | 高文忠 Kao, Wen-Chung |
| Degree | Master |
| Department | Department of Electrical Engineering (電機工程學系) |
| Publication year | 2020 |
| Graduation academic year | 108 |
| Language | Chinese |
| Pages | 70 |
| Keywords (Chinese) | 眼動儀、可見光、平行化架構、高速 |
| Keywords (English) | gaze tracker, visible light, parallel computing architecture, high speed |
| DOI | http://doi.org/10.6345/NTNU202000340 |
| Thesis type | Academic thesis |
The core technology of a visible-spectrum gaze tracker is locating the iris (limbus) contour in the eye image. The computational complexity of the system stems from analyzing the varied iris shapes that can appear in eye images and the different illumination conditions under which they are captured. This thesis proposes three methods: (1) an eyeball model construction method and a refined iris matching equation to improve system accuracy and precision; (2) a hierarchical search for real-time matching between the eyeball model and the eye image; and (3) a high-precision parallelized gaze point estimation architecture implemented on a multi-core microprocessor. The experimental results show that the improved eyeball model construction significantly raises the precision of eyeball model matching, that the hierarchical search quickly matches the iris in the eye image and estimates the eyeball rotation angles, and that the proposed parallel architecture can be applied to a gaze tracking system to achieve a computation rate above 4000 frames per second.
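The hierarchical search mentioned above can be pictured as a coarse-to-fine scan over the eyeball rotation angles. The C++ sketch below is only an illustration of that idea under assumed names: `EyeImage` and `matchScore` are hypothetical placeholders for the input frame and the thesis's iris matching equation, not the actual implementation.

```cpp
// Coarse-to-fine (hierarchical) search sketch -- an illustration only.
// `EyeImage` and `matchScore` are hypothetical placeholders.
#include <cstdio>
#include <utility>
#include <vector>

struct EyeImage {
    std::vector<unsigned char> pixels;  // grayscale eye image (assumed)
    int width = 0, height = 0;
};

// Hypothetical score: how well the limbus circle predicted by the eyeball
// model at rotation (theta, phi) agrees with the image. Dummy body here.
double matchScore(const EyeImage& /*img*/, double theta, double phi) {
    return -(theta * theta + phi * phi);
}

// Scan a coarse grid of rotation angles first, then repeatedly shrink the
// search window and step size around the best candidate.
std::pair<double, double> hierarchicalSearch(const EyeImage& img,
                                             double span = 30.0,      // degrees
                                             double coarseStep = 4.0,
                                             double fineStep = 0.25) {
    double bestT = 0.0, bestP = 0.0, bestScore = -1e300;
    double centerT = 0.0, centerP = 0.0, step = coarseStep;
    while (step >= fineStep) {
        for (double t = centerT - span; t <= centerT + span; t += step)
            for (double p = centerP - span; p <= centerP + span; p += step) {
                double s = matchScore(img, t, p);
                if (s > bestScore) { bestScore = s; bestT = t; bestP = p; }
            }
        centerT = bestT;   // zoom in around the current best match
        centerP = bestP;
        span = step;       // shrink the window to one coarse cell
        step *= 0.5;       // halve the step each level
    }
    return {bestT, bestP};
}

int main() {
    EyeImage frame;        // would be filled from the camera in a real system
    auto [theta, phi] = hierarchicalSearch(frame);
    std::printf("estimated rotation: %.2f, %.2f deg\n", theta, phi);
    return 0;
}
```

Compared with an exhaustive scan at the finest angular resolution, this kind of coarse-to-fine refinement needs far fewer score evaluations per frame, which is what makes real-time matching plausible.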
The core technology of a visible-spectrum gaze tracker (VSGT) is the determination of the limbus circle in the eye image. The high complexity of analyzing eye images comes from the various projected shapes of the eyeball and the varying illumination conditions. In this thesis, we propose an eyeball model construction method and a refined iris matching equation to improve system accuracy and precision. In addition, a parallel computing architecture incorporating a hierarchical search scheme for real-time limbus circle matching is presented. The experimental results show that the proposed model construction method significantly improves the overall eyeball model matching performance, that the proposed hierarchical search scheme efficiently determines the optimal matching model for an input eye image, and that the proposed parallel computing architecture can be applied to the design of a high-speed VSGT with a frame rate higher than 4000 frames/s.
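Since the abstract attributes the frame rate above 4000 frames/s to a parallel computing architecture on a multi-core processor, the sketch below shows one common way such candidate scoring can be spread across CPU threads. It is a minimal illustration under assumptions, not the thesis's actual pipeline; `evaluate` is a hypothetical stand-in for the iris matching equation.

```cpp
// Parallel candidate-scoring sketch -- an illustration of the multi-core
// idea, not the thesis implementation. `evaluate` is hypothetical.
#include <algorithm>
#include <cstdio>
#include <future>
#include <thread>
#include <vector>

struct Candidate { double theta, phi, score; };

// Hypothetical stand-in for the iris matching equation.
double evaluate(double theta, double phi) {
    return -(theta * theta + phi * phi);   // dummy score for illustration
}

int main() {
    // Build a grid of candidate eyeball rotations for one frame.
    std::vector<Candidate> cands;
    for (double t = -30.0; t <= 30.0; t += 1.0)
        for (double p = -30.0; p <= 30.0; p += 1.0)
            cands.push_back({t, p, 0.0});

    // Split the candidates into chunks and score each chunk on its own thread.
    unsigned threads = std::max(1u, std::thread::hardware_concurrency());
    std::size_t chunk = (cands.size() + threads - 1) / threads;
    std::vector<std::future<void>> jobs;
    for (unsigned i = 0; i < threads; ++i) {
        std::size_t lo = i * chunk;
        std::size_t hi = std::min(cands.size(), lo + chunk);
        if (lo >= hi) break;
        jobs.push_back(std::async(std::launch::async, [&cands, lo, hi] {
            for (std::size_t k = lo; k < hi; ++k)
                cands[k].score = evaluate(cands[k].theta, cands[k].phi);
        }));
    }
    for (auto& j : jobs) j.get();   // wait for all worker threads

    // The best-scoring candidate gives the eyeball rotation for this frame.
    auto best = std::max_element(cands.begin(), cands.end(),
        [](const Candidate& a, const Candidate& b) { return a.score < b.score; });
    std::printf("best rotation: %.1f, %.1f deg\n", best->theta, best->phi);
    return 0;
}
```

Because each candidate is scored independently, the work partitions cleanly across cores; the achievable frame rate in a real system would additionally depend on the per-candidate image-processing cost inside the matching equation.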