Graduate Student: | 沈君維 Shen, Chun-Wei
Thesis Title: | 利用眼動資料分析閱讀與動態物件凝視行為 (Analysis of Reading Behavior and Dynamic Object Fixation with Gaze Tracking Data)
Advisor: | 高文忠 Kao, Wen-Chung
Degree: | Master
Department: | Department of Electrical Engineering
Year of Publication: | 2018
Academic Year of Graduation: | 106 (ROC calendar)
Language: | Chinese
Pages: | 90
Chinese Keywords: | 閱讀行為, 眼動資料分析, 特徵抽取, 注意力分析
English Keywords: | reading behavior, gaze data analysis, feature extraction, attention analysis
DOI URL: | http://doi.org/10.6345/THE.NTNU.DEE.008.2018.E08
Thesis Type: | Academic thesis
With the advancement of technology, eye trackers have developed rapidly. Eye trackers play a highly important role in human-machine interaction, their required sampling-rate specifications continue to rise, and how to analyze the collected gaze data has gradually become a topic of wide concern.
This thesis proposes a tool for the automatic analysis of gaze data. Features are extracted under different conditions and the data is classified with a support vector machine; the experimental results demonstrate the feasibility of the tool. In addition, the thesis analyzes the degree of attention between a user's gaze data and the viewed object by having users watch moving objects in continuous video. Besides providing the user with an attention analysis, the results are displayed both dynamically and statically, presenting the data visually to facilitate observation of the gaze-data distribution.
With the advancement of technology, eye trackers have developed rapidly and now play an important role in human-machine interaction. As the required sampling rates of eye trackers continue to rise, how to analyze the collected gaze data has attracted growing attention worldwide.
This thesis proposes a set of tools for the automatic analysis of gaze data. Features are extracted under different conditions and classified with a support vector machine (SVM), and the evaluation demonstrates the feasibility of these tools. In addition, this thesis analyzes the degree of focus between the gaze data and a dynamic object. The gaze data is displayed over both dynamic and static video, and the analysis results are presented as visualizations that make the distribution of the eye-movement data easy to observe.
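The two analysis steps the abstract describes, classifying extracted gaze features with an SVM and measuring attention on a moving object, can be sketched roughly as below. This is a minimal illustration, not the thesis's actual pipeline: the feature definitions (step amplitude, dispersion, fixation ratio), the velocity and radius thresholds, the synthetic trials, and the hand-rolled linear SVM are all assumptions made for the sketch.

```python
# Illustrative sketch only: feature choices, thresholds, and data are assumptions,
# not the thesis's actual definitions.
import numpy as np

def extract_features(gaze, dt=1.0 / 250.0):
    """Reduce one trial of gaze samples (N x 2, pixels) to a feature vector:
    mean sample-to-sample step, spatial dispersion, and the fraction of
    samples below a velocity threshold (a crude fixation ratio)."""
    steps = np.linalg.norm(np.diff(gaze, axis=0), axis=1)   # px per sample
    fixation_ratio = float(np.mean(steps / dt < 1000.0))    # assumed 1000 px/s cutoff
    dispersion = float(gaze.std(axis=0).sum())
    return np.array([steps.mean(), dispersion, fixation_ratio])

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=500):
    """Fit w, b minimizing hinge loss + L2 penalty by batch subgradient
    descent; labels y must be in {-1, +1}."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        viol = y * (X @ w + b) < 1.0                        # margin violators
        w -= lr * (lam * w - (y[viol, None] * X[viol]).sum(axis=0) / n)
        b -= lr * (-y[viol].sum() / n)
    return w, b

def focus_ratio(gaze, obj, radius=60.0):
    """Fraction of frames where gaze lies within `radius` px of the moving
    object's center -- a simple attention measure (radius is an assumption)."""
    return float(np.mean(np.linalg.norm(gaze - obj, axis=1) < radius))

rng = np.random.default_rng(0)
X, y = [], []
for _ in range(50):
    # Synthetic trials: "reading" gaze drifts smoothly; "scanning" jumps widely.
    reading = np.cumsum(rng.normal(2.0, 1.0, size=(100, 2)), axis=0)
    scanning = np.cumsum(rng.normal(0.0, 40.0, size=(100, 2)), axis=0)
    X += [extract_features(reading), extract_features(scanning)]
    y += [-1, +1]
X, y = np.array(X), np.array(y)
X = (X - X.mean(axis=0)) / X.std(axis=0)                    # standardize features

w, b = train_linear_svm(X, y)
accuracy = float(np.mean(np.sign(X @ w + b) == y))
print(f"training accuracy: {accuracy:.2f}")

# Attention on a moving object: gaze that tracks the object scores near 1.
obj = np.cumsum(rng.normal(1.0, 0.5, size=(200, 2)), axis=0)
attentive = obj + rng.normal(0.0, 10.0, size=obj.shape)
print(f"focus ratio: {focus_ratio(attentive, obj):.2f}")
```

In practice the thesis visualizes the resulting gaze distributions over dynamic and static views; the scalar `focus_ratio` here stands in for that per-object attention measure.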