
Author: Wu, Sheng-Ju (吳昇儒)
Thesis Title: Gaze Tracking System with Natural Light (自然光源照明眼動儀系統設計)
Advisor: Kao, Wen-Chung (高文忠)
Degree: Master
Department: Department of Electrical Engineering
Year of Publication: 2015
Graduation Academic Year: 103
Language: Chinese
Number of Pages: 77
Chinese Keywords: gaze tracking system, natural light, support vector machine, feature-based gaze estimation
English Keywords: Gaze tracker, Natural light, SVM, Feature-based gaze estimation
Thesis Type: Academic thesis
An eye tracker follows eye movement by measuring the position of the subject's point of gaze, and the technology is now widely applied in education, business, security, and medicine. Most current eye trackers rely on infrared illumination and pupil tracking. Because sunlight contains the infrared band, outdoor daytime operation overexposes the image and makes eye tracking difficult. This thesis proposes a gaze tracking system that operates under natural light, requiring only a consumer webcam. Since ambient illumination varies and the image contrast is lower than that of infrared images, a natural-light gaze tracker faces several challenges that must be overcome.
    The proposed system consists of three main parts: gaze direction recognition training, nine-point calibration, and real-time gaze tracking. A support vector machine (SVM) recognizes the gaze direction, and a feature-based gaze estimation method suited to iris tracking locates the gaze point coordinates. The system achieves real-time processing at 13 fps, with an overall mean angular error of 0.248 degrees in the horizontal direction and 0.306 degrees in the vertical direction.
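
The abstract above describes mapping the tracked iris center to a gaze point after a nine-point calibration, and the outline below (Section 5.5.2) refers to fitting quadratic curve parameters. The following is a minimal sketch of such a second-order least-squares mapping; the exact polynomial form, data layout, and function names are assumptions for illustration, not the thesis's implementation.

```python
import numpy as np

def fit_quadratic_mapping(iris_xy, screen_xy):
    """Fit a second-order polynomial mapping from iris-center coordinates to
    screen gaze coordinates using the nine calibration samples.
    iris_xy, screen_xy: (9, 2) arrays of (x, y) points (assumed layout)."""
    x, y = iris_xy[:, 0], iris_xy[:, 1]
    # Design matrix of quadratic terms: [1, x, y, x*y, x^2, y^2]
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    # Solve for the coefficients of the horizontal and vertical screen
    # coordinates independently, in the least-squares sense.
    coef_x, *_ = np.linalg.lstsq(A, screen_xy[:, 0], rcond=None)
    coef_y, *_ = np.linalg.lstsq(A, screen_xy[:, 1], rcond=None)
    return coef_x, coef_y

def map_gaze(iris_point, coef_x, coef_y):
    """Apply the fitted quadratic mapping to a single iris-center position."""
    x, y = iris_point
    terms = np.array([1.0, x, y, x * y, x**2, y**2])
    return terms @ coef_x, terms @ coef_y
```

With nine calibration targets there are nine samples for the six coefficients per screen axis, so the fit is overdetermined and can be solved by ordinary least squares.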

    Eye tracking is the process of measuring the point of gaze. An eye tracker is a device that measures eye positions and eye movements, and such systems are used in education, business, security, and medical applications. Most modern eye trackers locate the pupil center under infrared illumination to track the gaze direction. This approach is not suitable for outdoor use because sunlight overexposes the infrared image. We propose an eye tracking system that operates under natural light and measures eye positions and eye movements with an ordinary webcam. However, it faces new challenges such as changing environmental illumination and lower image contrast.
    The system contains three parts: gaze direction recognition training, nine-point calibration, and real-time gaze tracking. A support vector machine (SVM) recognizes the gaze direction, and a feature-based gaze estimation method locates the gaze point. The system runs in real time at 13 frames/s and achieves an accuracy of 0.248 degrees horizontally and 0.306 degrees vertically for the detected gaze point.
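
As an illustration of the gaze direction recognition stage (an SVM trained on DCT features, per Section 5.3 of the outline below), here is a minimal training sketch; the image size, number of retained DCT coefficients, kernel choice, and class labels are assumptions rather than the configuration used in the thesis.

```python
import numpy as np
from scipy.fftpack import dct
from sklearn.svm import SVC

def dct_features(eye_img, keep=8):
    """Extract low-frequency 2-D DCT coefficients of a grayscale eye image
    as the feature vector (top-left keep x keep block, assumed layout)."""
    coeffs = dct(dct(eye_img.astype(np.float64), axis=0, norm='ortho'),
                 axis=1, norm='ortho')
    return coeffs[:keep, :keep].ravel()

def train_gaze_direction_svm(eye_images, labels):
    """Train an SVM gaze-direction classifier.
    eye_images: list of grayscale eye-region arrays (placeholder training data)
    labels: gaze-direction classes, e.g. the nine calibration directions."""
    X = np.array([dct_features(img) for img in eye_images])
    clf = SVC(kernel='rbf')  # RBF kernel is an assumption, not the thesis setting
    clf.fit(X, labels)
    return clf
```

At run time the same dct_features() extraction would be applied to each incoming eye image before calling clf.predict().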

    Abstract (Chinese)
    Abstract (English)
    Acknowledgments
    Table of Contents
    List of Figures
    List of Tables
    Chapter 1  Introduction
      1.1 Development and applications of eye tracking technology
      1.2 Overview of eye tracking methods
      1.3 Objectives of the proposed eye tracking technique
      1.4 System architecture of this study
    Chapter 2  Human Eye Anatomy and Related Work on Infrared Eye Trackers
      2.1 Structure and movement of the human eye
        2.1.1 Structure of the human eye
        2.1.2 Eye movement patterns
      2.2 Infrared eye tracking systems
    Chapter 3  Related Work on Natural-Light Gaze Tracking Systems
      3.1 Feature-based gaze estimation
      3.2 Circle fitting of feature points
        3.2.1 Least-squares circle fitting
        3.2.2 Circle Hough Transform (CHT)
    Chapter 4  Architecture of the Natural-Light Gaze Tracking System
      4.1 Gaze direction recognition training
      4.2 Nine-point calibration
      4.3 Real-time gaze tracking
    Chapter 5  Gaze Tracking Algorithm Design
      5.1 Eye image preprocessing
        5.1.1 Eye image enhancement
        5.1.2 Noise filtering and glint removal
      5.2 Closed-eye image detection
        5.2.1 Iris center estimation
        5.2.2 Iris brightness computation
      5.3 Gaze direction recognition training
        5.3.1 DCT feature extraction
        5.3.2 Support vector machine
      5.4 Iris center localization
        5.4.1 Iris boundary (limbus) feature extraction and iris center localization
        5.4.2 Gaze direction recognition
      5.5 Mapping curve computation
        5.5.1 Nine-point gaze classification
        5.5.2 Quadratic curve parameter computation
    Chapter 6  Experimental Results
      6.1 User operation of the system
      6.2 Analysis of gaze point results
        6.2.1 Comparison of results across the gaze tracking pipeline
        6.2.2 Analysis of nine-point calibration results
        6.2.3 Gaze tracking results for different subjects
      6.3 Computation time comparison of each module
    Chapter 7  Conclusion and Future Work
    References
    Experimental Appendix
    Autobiography
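
The outline above lists least-squares circle fitting (Section 3.2.1) among the techniques for fitting a circle to iris boundary feature points. A minimal sketch of an algebraic (Kåsa-style) least-squares circle fit follows; the source of the edge points and any robustness steps such as outlier rejection are omitted and are assumptions.

```python
import numpy as np

def fit_circle_least_squares(points):
    """Algebraic least-squares circle fit (Kasa method).
    points: (N, 2) array of boundary-point coordinates.
    Returns (center_x, center_y, radius)."""
    x, y = points[:, 0], points[:, 1]
    # Solve  a*x + b*y + c = -(x^2 + y^2)  in the least-squares sense,
    # where center = (-a/2, -b/2) and radius^2 = center_x^2 + center_y^2 - c.
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = -(x**2 + y**2)
    (a_, b_, c_), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    cx, cy = -a_ / 2.0, -b_ / 2.0
    r = np.sqrt(cx**2 + cy**2 - c_)
    return cx, cy, r
```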

