
Author: Chen, Guan-An (陳冠安)
Title: Method for Mining Significant Differences in Regions of Interest Applied to Eye Movement Data Analysis
Advisor: Ho, Hong-Fa (何宏發)
Degree: Master
Department: Department of Electrical Engineering
Year of publication: 2016
Academic year of graduation: 104 (ROC calendar; 2015-2016)
Language: Chinese
Pages: 110
Chinese keywords (translated): eye tracker; validity of eye movement data; correction of eye movement data; mining significant differences in regions of interest
English keywords: Eye tracker; Inspection of the Validation of Eye Movement Data; Correction of Shifted Eye Movement Data; Mining Significant Differences in Regions of Interest
DOI URL: https://doi.org/10.6345/NTNU202203574
Type: Academic thesis
    Behavioral-science research spans topics such as reading processes, emotion, intention, and safe driving, and eye movement analysis is highly applicable to all of them; in particular, formulating correct hypotheses and efficiently locating regions that show significant differences is genuinely difficult.
    The proposed method for mining significant differences in regions of interest, applied to automated eye movement data analysis, consists of three modules. The Inspection of the Validation of Eye Movement Data module automatically identifies invalid data samples caused by head shifts or by the eye tracker's calibration process. The Correction of Shifted Eye Movement Data module rests on the observation that in most eye movement experiments the stimulus is presented large and centered on the screen, together with the assumption that participants do not gaze at blank regions; on this basis it automatically repairs large shifts in the data. The Analysis of Significant Differences in ROIs module uses a divide-and-conquer approach to split and define regions of interest, automatically searching for regions of the stimulus that exhibit significant differences.
    The experimental results show the following. The Inspection of the Validation of Eye Movement Data module offers three mechanisms for checking data validity; for a recording containing 50 samples, a validity check takes between 1 ms and 78.8 ms depending on the mechanism. Before correction, the proportion of invalid eye movement data was 100%; after processing by the Correction of Shifted Eye Movement Data module it fell to 0%, and correcting the shifted data for one stimulus takes about 1 second on average. The Analysis of Significant Differences in ROIs module supports two statistical analyses (the independent-samples t-test and the paired-samples t-test) and searches the eye movement data of two groups of participants for ROIs showing significant differences in eye movement variables (e.g., first fixation duration, total fixation duration). For example, in an experiment with image stimuli (32 participants, 4 stimuli each, grids of 4 sizes: 4, 16, 64, and 256 cells), the significant-difference mining took 4.92 minutes; in an experiment with video stimuli (5 participants, a 20-second stimulus each, grids of 3 sizes: 4, 16, and 64 cells), it took 29.38 minutes.
    Applied to automated eye movement data analysis, this method eliminates the tedious, time-consuming workflow that researchers previously had to carry out by hand, and gives researchers in related fields a reference for defining regions of interest in future studies. Moreover, because the ROI splitting and definition performed by the Analysis of Significant Differences in ROIs module involve no subjective judgment, the method better satisfies the objectivity that scientific research requires.

    In eye movement experiments, finding significant relations between eye movement variables and the independent variables of the experimental design is a time-consuming and exhausting task. First, researchers must determine which data sets are valid and which are not. Second, Regions of Interest (ROIs), also called Areas of Interest (AOIs), are defined according to the research hypotheses, a step that is not automated. The eye movement data must then be imported into the analysis system to obtain values for variables such as total contact time and number of fixations. Finally, the statistical analysis is carried out with statistical software such as SPSS or Microsoft Excel. The process is tedious and time-consuming; in particular, when no significant differences are found, researchers must redefine the ROIs, start over, and repeat the whole procedure.
    To address these difficulties, this study proposes a method for mining significant differences in regions of interest, applied to the analysis of eye movement data. The method comprises three modules: Inspection of the Validation of Eye Movement Data, Correction of Shifted Eye Movement Data, and Analysis of Significant Differences in ROIs.
    The Inspection of the Validation of Eye Movement Data module helps researchers distinguish valid eye movement data from invalid data. Results show that checking the validity of one recording containing 50 raw samples took between 1 millisecond and 78.8 milliseconds.
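The abstract does not spell out the three validity-checking mechanisms. As a rough illustration of one plausible check only (not the thesis's actual algorithm), a recording could be flagged invalid when too large a share of its raw samples falls outside the screen, as can happen after a head shift or a failed calibration; the threshold, screen size, and function names below are all assumptions:

```python
# Hypothetical sketch of a validity check: reject a recording when too many
# raw gaze samples land off-screen. The 20% tolerance and the 1920x1080
# screen are assumed values, not figures from the thesis.

SCREEN_W, SCREEN_H = 1920, 1080
MAX_INVALID_RATIO = 0.2  # assumed tolerance for off-screen samples

def is_valid_recording(samples, w=SCREEN_W, h=SCREEN_H,
                       max_invalid_ratio=MAX_INVALID_RATIO):
    """samples: list of (x, y) gaze coordinates in pixels."""
    if not samples:
        return False
    off_screen = sum(1 for x, y in samples
                     if not (0 <= x < w and 0 <= y < h))
    return off_screen / len(samples) <= max_invalid_ratio
```

A full implementation in the spirit of the thesis would combine several such mechanisms, which is consistent with the reported 1 ms to 78.8 ms range per 50-sample recording.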
    The Correction of Shifted Eye Movement Data module relies on the stimulus being as large as possible and centered on the screen, under the assumption that participants do not gaze at the surrounding blank region; on this basis it corrects shifted eye movement data. Correcting the data for one stimulus took approximately 1 second.
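The premise above (a large, centered stimulus whose blank margins attract no gazes) implies that a systematic offset can be estimated from the gaze data itself. A minimal sketch of that idea, translating all samples so their centroid returns to the assumed stimulus center, might look like the following; this illustrates the premise only, not the module's exact algorithm:

```python
# Hypothetical sketch of shift correction: estimate the systematic offset as
# the displacement of the gaze centroid from the (assumed) stimulus center,
# then subtract it from every sample.

def correct_shift(samples, center=(960, 540)):
    """Translate gaze samples so their centroid sits on `center`."""
    n = len(samples)
    cx = sum(x for x, _ in samples) / n
    cy = sum(y for _, y in samples) / n
    dx, dy = center[0] - cx, center[1] - cy
    return [(x + dx, y + dy) for x, y in samples]
```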
    A divide-and-conquer approach was applied in the design of the Analysis of Significant Differences in ROIs module. Its purpose is to find all potential significant differences among ROIs, and it supports two kinds of statistical analysis (the independent two-sample t-test and the paired t-test). In one case, 64 participants, divided into two groups, each gazed at 4 stimuli; the significant-difference mining took 4.92 minutes. In another case, 10 participants, divided into two groups, each gazed at a video for 20 seconds; the mining took around 29.38 minutes.
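A minimal sketch of the grid-based mining step, assuming each grid cell of the stimulus is treated as an ROI and tested with an independent two-sample t-test on a per-participant eye movement variable (e.g., total fixation time). The t statistic is computed by hand here so the sketch is self-contained; a real implementation would likely use a library routine such as scipy.stats.ttest_ind, and the 4/16/64/256-cell grids reported above correspond to n_side = 2, 4, 8, 16. Function names and the critical threshold are assumptions:

```python
import math

def grid_cell(x, y, w, h, n_side):
    """Map a gaze point to a cell index in an n_side x n_side grid."""
    col = min(int(x / w * n_side), n_side - 1)
    row = min(int(y / h * n_side), n_side - 1)
    return row * n_side + col

def t_statistic(a, b):
    """Independent two-sample t statistic with pooled variance."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    return (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))

def mine_significant_cells(group_a, group_b, t_crit=2.0):
    """group_*: per-participant dicts mapping cell index -> eye movement
    variable value; returns (cell, t) pairs with |t| above the (assumed)
    critical threshold."""
    cells = set()
    for d in group_a + group_b:
        cells |= d.keys()
    hits = []
    for c in sorted(cells):
        a = [d.get(c, 0.0) for d in group_a]
        b = [d.get(c, 0.0) for d in group_b]
        t = t_statistic(a, b)
        if abs(t) > t_crit:
            hits.append((c, t))
    return hits
```

Divide and conquer would then repeat this test over progressively finer grids, so a difference missed at 4 cells can still surface at 16, 64, or 256 cells.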
    The method for mining significant differences in regions of interest applied to eye movement data analysis aims to provide more convenient tools that replace these tedious procedures. Last but not least, it also offers researchers recommendations for better ROI definitions.

    Abstract (Chinese)
    Abstract (English)
    Acknowledgements
    Table of Contents
    List of Figures
    List of Tables
    Chapter 1  Introduction
      1.1  Background and Motivation
      1.2  Objectives and Methods
      1.3  Thesis Organization
      1.4  Definitions of Terms Used in This Study
    Chapter 2  Related Work and Literature Review
      2.1  Introduction to Eye Trackers
      2.2  Eye Movement Data Analysis
        2.2.1  Visual Data Analysis
        2.2.2  Common Statistical Eye Movement Variables
      2.3  Statistical Analysis: the t-test
        2.3.1  Origin of the t-test [26]
        2.3.2  Types of t-test [29, 30]
      2.4  Data Validity Inspection and Data Correction
      2.5  Analysis of Significant Differences in ROIs
    Chapter 3  Design of the Significant-Difference ROI Mining Algorithms
      3.1  Relationship between the EyeNTNU Eye Tracker System and This Study
      3.2  Data Validity Inspection Module
        3.2.1  Problem Description
        3.2.2  Algorithm Concept
        3.2.3  Algorithm Planning and Design
        3.2.4  Algorithm Analysis
      3.3  Data Correction Module
        3.3.1  Problem Description
        3.3.2  Algorithm Concept
        3.3.3  Algorithm Planning and Design
        3.3.4  Algorithm Analysis
      3.4  Significant-Difference ROI Analysis Module
        3.4.1  Problem Description
        3.4.2  Algorithm Concept
        3.4.3  Algorithm Planning and Design
        3.4.4  Algorithm Analysis
    Chapter 4  Implementation of the Eye Movement Data Analysis System
      4.1  Development Tools and Environment
      4.2  Software and Hardware Used
        4.2.1  Eye Tracker Hardware
        4.2.2  Software
      4.3  Data Validity Inspection Module
      4.4  Data Correction Module
      4.5  Significant-Difference ROI Analysis Module
    Chapter 5  Experiments and Evaluation
      5.1  Accuracy and Efficiency of the Data Validity Inspection Module
      5.2  Accuracy and Efficiency of the Data Correction Module
      5.3  Accuracy and Efficiency of the Significant-Difference ROI Analysis Module
        5.3.1  Test with Image Stimuli: A study on teaching debugging strategies for digital circuit
        5.3.2  Test with Video Stimuli: An Autism Study
    Chapter 6  Conclusions and Future Work
    References
    Autobiography
    Academic Achievements

    [1] G. S. M. Chwo, H. F. Ho, B. C. Y. Liu, and S. W. L. Chiu, "Using eye-tracking as a means to evaluate visual and content design choices in web 2.0-An initial finding from Livemocha," in UHAMKA PRESS, 2013.
    [2] S. M. Thang, N. M. Jaafar, H. F. Ho, G. A. Chen, and O. K. Soh, "Using eye tracking to investigate strategies used by English as Second Language learners in reading a scientific text with diagram," in International Workshop on Learning Analytics, Technology Adoption, and Language Learning in the Big-Data Era, 2015.
    [3] S. Kim, A. K. Dey, J. Lee, and J. Forlizzi, "Usability of car dashboard displays for elder drivers," presented at the Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Vancouver, BC, Canada, 2011.
    [4] M. V. O'Shea, "The Psychology and Pedagogy of Reading," The Journal of Philosophy, Psychology and Scientific Methods, vol. 5, p. 500, 1908.
    [5] G. T. Buswell, How People Look at Pictures. Chicago: University of Chicago Press, 1935.
    [6] E. Javal, "Essai sur la physiologie de la lecture," Annales d'Oculistique, 1879.
    [7] E. Javal, K. J. Ciuffreda, and N. Bassil, "Essay on the physiology of reading," Ophthalmic and Physiological Optics, vol. 10, pp. 381-384, 1990.
    [8] M. A. Just and P. A. Carpenter, "A theory of reading: from eye fixations to comprehension," Psychological review, vol. 87, p. 329, 1980.
    [9] M. A. Just and P. A. Carpenter, "Eye fixations and cognitive processes," Cognitive psychology, vol. 8, pp. 441-480, 1976.
    [10] H. F. Ho, "The effects of controlling visual attention to handbags for women in online shops: Evidence from eye movements," Computers in Human Behavior, vol. 30, pp. 146-152, 2014.
    [11] M. G. Calvo and P. J. Lang, "Gaze Patterns When Looking at Emotional Pictures: Motivationally Biased Attention," Motivation and Emotion, vol. 28, pp. 221-243, 2004.
    [12] J. Hewig, R. H. Trippe, H. Hecht, T. Straube, and W. H. R. Miltner, "Gender Differences for Specific Body Regions When Looking at Men and Women," Journal of Nonverbal Behavior, vol. 32, pp. 67-78, 2008.
    [13] H. F. Ho, G. A. Chen, and C. Vicente, "Impact of Misplaced Words in Reading Comprehension of Chinese Sentences: Evidences from Eye Movement and Electroencephalography," in The 23rd International Conference on Computers in Education (ICCE 2015), 2015.
    [14] J. V. Singh and G. Prasad, "Enhancing an Eye-Tracker based Human-Computer Interface with Multi-modal Accessibility Applied for Text Entry," International Journal of Computer Applications, vol. 130, 2015.
    [15] Tobii Group. (2016). Eye trackers sold by Tobii. Available: http://www.tobii.com/
    [16] SensoMotoric Instruments (SMI). (2016). Products sold by SMI. Available: http://www.smivision.com/en/gaze-and-eye-tracking-systems/products/overview.html
    [17] M. Vernet and Z. Kapoula, "Binocular motor coordination during saccades and fixations while reading: A magnitude and time analysis," Journal of Vision, vol. 9, pp. 2-2, 2009.
    [18] 朱瀅, 實驗心理學 [Experimental Psychology], 2002.
    [19] M. Nyström and K. Holmqvist, "An adaptive algorithm for fixation, saccade, and glissade detection in eyetracking data," Behavior research methods, vol. 42, pp. 188-204, 2010.
    [20] A. T. Duchowski, Eye Tracking Methodology: Theory and Practice, 2003.
    [21] D. D. Salvucci and J. H. Goldberg, "Identifying fixations and saccades in eye-tracking protocols," in Proceedings of the 2000 symposium on Eye tracking research & applications, Palm Beach Gardens, Florida, USA, 2000, pp. 71-78.
    [22] A. T. Bahill, A. Brockenbrough, and B. T. Troost, "Variability and development of a normative data base for saccadic eye movements," Investigative Ophthalmology & Visual Science, vol. 21, pp. 116-125, 1981.
    [23] SR Research, EyeLink User Manual. Ottawa, 2007.
    [24] A. Aula, P. Majaranta, and K.-J. Räihä, "Eye-tracking reveals the personal styles for search result evaluation," in Human-Computer Interaction-INTERACT 2005, ed: Springer, 2005, pp. 1058-1061.
    [25] S. Djamasbi, T. Tullis, J. Hsu, E. Mazuera, K. Osberg, and J. Bosch, "Gender preferences in web design: usability testing through eye tracking," in Proceedings of the Thirteenth Americas Conference on Information Systems (AMCIS), 2007, p. 1.
    [26] 劉仁沛, "啤酒與學生氏t檢定 [Beer and the Student's t-test]," 科學發展, vol. 487, pp. 68-70, 2013.
    [27] Student, "On the error of counting with a haemacytometer," Biometrika, vol. 5, pp. 351-360, 1907.
    [28] Student, "Probable error of a correlation coefficient," Biometrika, vol. 6, pp. 302-310, 1908.
    [29] 柴惠敏. (2005). 獨立 t 檢定的概念 [The concept of the independent t-test]. Available: http://www.pt.ntu.edu.tw/hmchai/SAS/SAScontinuous/SASttest.htm
    [30] 謝寶煖, "比較平均數 Test [Comparing means]," in 量化研究與統計分析 [Quantitative Research and Statistical Analysis], 2006.
    [31] 于莉莉, 夏结来, and 蒋红卫, "几种简单设计的等效性检验的样本量与检验效能 [Sample size and power of equivalence tests for several simple designs]," 第四军医大学学报, vol. 25, pp. 1045-1049, 2004.
    [32] 韓承靜 and 蔡介立, "眼球軌跡記錄—科學學習研究的明日之星 [Eye movement recording: the rising star of science learning research]," 科學教育月刊, vol. 310, pp. 2-11, 2008.
    [33] J. M. Henderson and A. Hollingworth, "High-level scene perception," Annual review of psychology, vol. 50, pp. 243-271, 1999.
    [34] K. Rayner, C. Rotello, A. Stewart, J. Keir, and S. Duffy, "Integrating text and pictorial information: Eye movements when looking at print advertisements," Journal of experimental psychology. Applied, vol. 7, pp. 219-226, 2001.
    [35] 唐大崙 and 莊賢智, "由眼球追蹤法探索電子報版面中圖片位置對注意力分佈之影響 [Exploring the influence of picture position in electronic newspaper layouts on attention distribution via eye tracking]," 廣告學研究, pp. 89-104, 2005.
    [36] A. Klin, W. Jones, R. Schultz, F. Volkmar, and D. Cohen, "Visual fixation patterns during viewing of naturalistic social situations as predictors of social competence in individuals with autism," Archives of General Psychiatry, vol. 59, pp. 809-816, 2002.
    [37] H. F. Ho and D. H. Huang, "A study on teaching debugging strategies for digital circuit," in Asia-Pacific Society for Computers in Education, 2014.
