
Graduate Student: 李珮綺 (LI, Pei-Chi)
Thesis Title: 運用錄影面試動態表情結合深度學習預測臺灣國際產業移工之留任意願:以卷積神經網絡為工具
Using Dynamic Facial Expression Enables with Deep Learning from Pre-recorded Interviews to Predict Taiwan Industrial Migrant Workers’ Intention to Stay at Work: Based on CNN-Regression
Advisor: 孫弘岳 (Suen, Hung-Yue)
Oral Examination Committee: 陳建丞 (Chen, Chien-Cheng), 蕭顯勝 (Hsiao, Hsien-Sheng), 孫弘岳 (Suen, Hung-Yue)
Oral Defense Date: 2023/06/14
Degree: Master
Department: Department of Technology Application and Human Resource Development (科技應用與人力資源發展學系)
Year of Publication: 2024
Academic Year of Graduation: 112 (ROC calendar)
Language: Chinese
Number of Pages: 85
Chinese Keywords: 人工智慧、視訊面試、微表情、弱表情、情感運算、留任意願
English Keywords: Artificial Intelligence, Video Interviews, Micro-Expressions, Subtle Expressions, Sentiment Analysis, Intention to Stay
Research Methods: Quasi-experimental design; empirical research
DOI URL: http://doi.org/10.6345/NTNU202400156
Thesis Type: Academic thesis
    Taiwan became an aged society in March 2018, with people aged 65 and over exceeding 14% of the total population. As the labor shortage has worsened, international migrant workers have become an indispensable part of Taiwan's workforce. After arriving in Taiwan, however, migrant workers often go missing or become disengaged, and these problems grow more serious over time. Taiwan's legal constraints on missing migrant workers are incomplete and the related procedures are cumbersome. Human resource agencies therefore want to screen, at the interview stage, for international migrant workers who are willing to stay, so that employers can maintain a stable retention rate.
    In psychology, as computer vision and deep learning have matured, interdisciplinary research spanning technology and psychology has grown. Many scholars now collaborate to recognize dynamic facial expressions from recorded video and, from them, infer a person's emotions and even future behavior. This study applied a deep learning technique, the convolutional neural network (CNN), in an empirical investigation. The participants were 81 Filipino and Vietnamese industrial migrant workers dispatched by a case staffing company. Computer vision was used to capture the dynamic facial-expression trajectories these workers displayed while answering questions in specific scenarios, and a CNN was used to model the relationship between dynamic expressions and intention to stay and to predict that intention. The result gives Taiwan's migrant-worker employers and dispatch agencies a fast, predictive decision-support tool that helps them make well-informed choices during recruitment and selection.

    Taiwan is facing a growing labor shortage and has increasingly turned to international migrant workers to address this issue. However, there have been challenges associated with these workers, including cases of disappearance and low motivation after their arrival. The complex legal constraints and procedures have hindered the effective resolution of these problems. Consequently, it has become crucial to identify migrant workers who have a strong intention to stay during the interview process.
    This research aims to tackle this issue through the application of artificial intelligence, video interviews, dynamic expressions, sentiment analysis, and intention-to-stay assessment. We employed a convolutional neural network (CNN) to analyze the dynamic facial expressions of 81 international migrant workers from various case companies, utilizing an asynchronous video interview system. By capturing these workers’ dynamic facial expression trajectories as they answered questions in specific scenarios, we developed a CNN-based model to predict their willingness to stay. This model serves as a practical solution to Taiwan’s labor shortage problem.
    The results indicate that computer vision technology can effectively predict the intention to stay of international migrant workers during the interview stage. This predictive decision support tool offers valuable assistance to employers and dispatch companies in Taiwan, enabling them to recruit and select workers more efficiently and effectively.
    In conclusion, this study highlights the potential of utilizing artificial intelligence and computer vision technology to address Taiwan's labor shortage problem by predicting the intention to stay of international migrant workers during the interview stage. This tool facilitates the efficient and effective recruitment and selection of international migrant workers in Taiwan, contributing to a more sustainable labor force.
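    The modelling approach described above can be made concrete with a small sketch. The following is only an illustrative example, not the thesis's actual pipeline: it assumes each interview answer has already been reduced to a fixed-length sequence of per-frame facial-expression features (the timestep count, feature count, and the TensorFlow/Keras framework are all assumptions), and it fits a 1D convolutional network whose single regression output stands in for an intention-to-stay score.

# Illustrative CNN-regression sketch under the assumptions stated above; not the thesis's code.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

TIMESTEPS = 300  # assumed number of sampled frames per answer clip
FEATURES = 17    # assumed number of expression features per frame

def build_cnn_regressor() -> tf.keras.Model:
    """1D CNN over the expression time axis, ending in one regression unit."""
    model = models.Sequential([
        layers.Input(shape=(TIMESTEPS, FEATURES)),
        layers.Conv1D(32, kernel_size=5, activation="relu"),
        layers.MaxPooling1D(pool_size=2),
        layers.Conv1D(64, kernel_size=5, activation="relu"),
        layers.GlobalAveragePooling1D(),
        layers.Dropout(0.5),   # regularization in the spirit of Srivastava et al. (2014)
        layers.Dense(32, activation="relu"),
        layers.Dense(1),       # predicted intention-to-stay score
    ])
    model.compile(optimizer="adam", loss="mse", metrics=["mae"])
    return model

if __name__ == "__main__":
    # Synthetic placeholder data standing in for 81 interviewees' expression
    # trajectories and their questionnaire-based intention-to-stay scores.
    X = np.random.rand(81, TIMESTEPS, FEATURES).astype("float32")
    y = np.random.rand(81, 1).astype("float32")
    model = build_cnn_regressor()
    model.fit(X, y, epochs=5, batch_size=8, validation_split=0.2, verbose=2)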

    Chapter 1 Introduction 1
        Section 1 Research Background and Motivation 1
        Section 2 Research Purpose and Questions 5
        Section 3 Definition of Terms 6
    Chapter 2 Literature Review 9
        Section 1 Intention to Stay 9
        Section 2 Dynamic Facial Expressions 11
        Section 3 Intention to Stay and Dynamic Facial Expressions 15
    Chapter 3 Research Design and Implementation 17
        Section 1 Research Framework and Hypotheses 17
        Section 2 Research Participants and Methods 17
        Section 3 Research Procedure 20
        Section 4 Research Instruments 25
        Section 5 Data Processing and Methods 31
    Chapter 4 Results 33
        Section 1 Descriptive Statistics and Exploratory Factor Analysis 33
        Section 2 Questionnaire and Model Prediction Results and Analysis 40
        Section 3 Building the International Migrant Worker Intention-to-Stay Model 42
    Chapter 5 Conclusions and Recommendations 47
        Section 1 Research Findings 47
        Section 2 Practical Recommendations 49
        Section 3 Theoretical Contributions 50
        Section 4 Research Limitations and Future Recommendations 51
        Section 5 Conclusion 53
    References 55
    Appendix 71

    中華民國內政部(2022)。年度縣市及全國統計資料。內政部戶政司全球資料網。https://www.ris.gov.tw/app/portal/346
    中華民國內政部(2022)。失聯移工統計表。中華民國內政部移民署全球資訊網。內政部移民署。https://www.immigration.gov.tw/5385/7344/7350/8943/?alias=settledown
    范裕康(2005)。誰可以成為外勞? 移工的招募與篩選(未出版碩士論文)。國立台灣大學。
    孫弘岳(2019年3月)。AI面試官看穿你的心。能力雜誌,757,20–24。
    孫志軍、薛磊、許陽明、王正(2012)。深度學習研究综述。計算機應用研究,29(8),2806-2810。https://doi.org/10.3969/j.issn.1001-3695.2012.08.002
    趙西萍、劉玲、張長征(2003)。員工離職傾向影響因素的多變量分析。中國軟科學,(3),71-74。https://doi.org/10.3969/j.issn.1002-9753.2003.03.015
    蔡欣嵐(2001)。工作特性、人格特質與工作滿意度之關係-以半導體業為例(未出版碩士論文)。國立中央大學。
    Ariyabuddhiphongs, V., & Marican, S. (2015). Big five personality traits and turnover intention among Thai Hotel employees. International Journal of Hospitality & Tourism Administration, 16(4), 355–374. https://doi.org/10.1080/15256480.2015.1090257
    Asthana, A., Goecke, R., Quadrianto, N., & Gedeon, T. (2009, June). Learning-based automatic face annotation for arbitrary poses and expressions from frontal images only. 2009 IEEE Conference on Computer Vision and Pattern Recognition. https://ieeexplore.ieee.org/document/5206766
    Asthana, A., Zafeiriou, S., Cheng, S., & Pantic, M. (2013). Robust discriminative response map fitting with constrained local models. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 3444–3451. https://openaccess.thecvf.com/content_cvpr_2013/html/Asthana_Robust_Discriminative_Response_2013_CVPR_paper.html
    Barrett, L. F., Adolphs, R., Marsella, S., Martinez, A. M., & Pollak, S. D. (2019). Emotional expressions reconsidered: Challenges to inferring emotion from human facial movements. Psychological Science in the Public Interest, 20(1), 1-68. https://doi.org/10.1177/1529100619832930
    Bozogáňová, M., & Ivan, O. (2022). Personality traits in relation to the turnover intentions of the qualified employees in the manufacturing industry. Človek a Spoločnosť, 21(2). 56-67. https://doi.org/10.31577/cas.2018.02.527
    Carcagnì, P., Del Coco, M., Leo, M., & Distante, C. (2015). Facial expression recognition and histograms of oriented gradients: A comprehensive study. SpringerPlus, 4(1), 1-25. https://doi.org/10.1186/s40064-015-1427-3
    Celiktutan, O., & Gunes, H. (2015). Automatic prediction of impressions in time and across varying context: Personality, attractiveness and likeability. IEEE Transactions on Affective Computing, 8(1), 29-42. https://doi.org/10.1109/TAFFC.2015.2513401
    Chu, H.-C., Tsai, W. W.-J., Liao, M.-J., & Chen, Y.-M. (2018). Facial emotion recognition with transition detection for students with high-functioning autism in adaptive e-learning. Soft Computing, 22, 2973-2999. https://doi.org/10.1007/s00500-017-2549-z
    Clarke, H. F., Laschinger, H. S., Giovannetti, P., Shamian, J., Thomson, D., & Tourangeau, A. (2001). Nursing shortages: Workplace environments are essential to the solution. Hospital Quarterly, 4(4), 50-57. https://doi.org/10.12927/hcq..17434
    Crivelli, C., & Fridlund, A. J. (2018). Facial displays are tools for social influence. Trends in Cognitive Sciences, 22(5), 388–399. https://doi.org/10.1016/j.tics.2018.02.006
    Crivelli, C., Carrera, P., & Fernández-Dols, J. M. (2015). Are smiles a sign of happiness? Spontaneous expressions of judo winners. Evolution and Human Behavior, 36(1), 52–58. https://doi.org/10.1016/j.evolhumbehav.2014.08.009
    Curry, J. P., Wakefield, D. S., Price, J. L., & Mueller, C. W. (1986). On the causal ordering of job satisfaction and organizational commitment. Academy of Management Journal, 29(4), 847-858. https://doi.org/10.5465/255951
    Dalal, N., & Triggs, B. (2005). Histograms of oriented gradients for human detection. IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR’05), San Diego, CA, USA https://doi.org/10.1109/CVPR.2005.177.
    Duan, Y., Edwards, J. S., & Dwivedi, Y. K. (2019). Artificial intelligence for decision making in the era of Big Data – evolution, challenges and research agenda. International Journal of Information Management, 48, 63–71. https://doi.org/10.1016/j.ijinfomgt.2019.01.021
    Donnellan, M. B., Oswald, F. L., Baird, B. M., & Lucas, R. E. (2006). The mini-IPIP scales: tiny-yet-effective measures of the Big Five factors of personality. Psychological assessment, 18(2), 192-193. https://doi.org/10.1037/1040-3590.18.2.192
    Dziuban, C. D., & Shirkey, E. C. (1974). When is a correlation matrix appropriate for factor analysis? Some decision rules. Psychological Bulletin, 81(6), 358–361. https://doi.org/10.1037/h0036316
    Ekman, P. (1992). Facial expressions of emotion: an old controversy and new findings. Philosophical Transactions of the Royal Society of London. Series B: Biological Sciences, 335(1273), 63-69. https://doi.org/10.1098/rstb.1992.0008
    Ekman, P. (1993). Facial expression and emotion. American Psychologist, 48(4), 384-385. https://doi.org/10.1037/0003-066X.48.4.384
    Ekman, P. (2003). Introduction. Annals of the New York Academy of Sciences, 1000(1), 1-6. https://doi.org/10.1196/annals.1280.002
    Ekman, P. (2009). Lie catching and micro expressions. In Clancy Martin (Ed.), The Philosophy of Deception (pp. 118-136). Oxford Academic. https://doi.org/10.1093/acprof:oso/9780195327939.003.0008
    Ekman, P., & Friesen, W. V. (1978). Facial action coding system (FACS) [Database record]. APA PsycTests. https://doi.org/10.1037/t27734-000
    Ekman, P., & Friesen, W. V. (2010). The repertoire of nonverbal behavior: Categories, origins, usage, and coding. In A. Kendon (Ed.), Nonverbal communication, interaction, and gesture: Selections from semiotica (pp. 57–106). De Gruyter Mouton. https://doi.org/10.1515/9783110880021.57
    Ekman, P., & O'Sullivan, M. (2006). From flawed self‐assessment to blatant whoppers: The utility of voluntary and involuntary behavior in detecting deception. Behavioral sciences & the law, 24(5), 673-686. https://doi.org/10.1002/bsl.729
    Fellner, A., Matthews, G., Funke, G. J., Emo, A. K., Pérez-González, J. C., Zeidner, M., & Roberts, R. D. (2007). The effects of emotional intelligence on visual search of emotional stimuli and emotion identification. In Proceedings of the Human Factors and Ergonomics Society 51st Annual Meeting (pp. 845–849). Santa Monica, CA, USA: HFES. https://doi.org/10.1177/154193120705101402
    Funder, D. C. (2008). Persons, situations, and person-situation interactions. In O. P. John, R. W. Robins, & L. A. Pervin (Eds.), Handbook of personality: Theory and research (pp. 568–580). The Guilford Press.
    Goffman, E. (2021). The presentation of self in everyday life. Knopf Doubleday Publishing Group.
    Green, P. C., Alter, P., & Carr, A. F. (1993). Development of standard anchors for scoring generic past‐behaviour questions in structured interviews. International Journal of Selection and Assessment, 1(4), 203–212. https://doi.org/10.1111/j.1468-2389.1993.tb00114.x
    Haggard, E. A., & Isaacs, K. S. (1966). Micromomentary facial expressions as indicators of ego mechanisms in psychotherapy. In L. A. Gottschalk & A. H. Auerbach (Eds.), Methods of Research in Psychotherapy (pp. 154-165). Springer US. https://doi.org/10.1007/978-1-4684-6045-2_14
    Hammal, Z., Couvreur, L., Caplier, A., & Rombaut, M. (2007). Facial expression classification: An approach based on the fusion of facial deformations using the transferable belief model. International Journal of Approximate Reasoning, 46(3), 542-567. https://doi.org/10.1016/j.ijar.2007.02.003
    Hareli, S., & Hess, U. (2012). The social signal value of emotions. Cognition and Emotion, 26(3), 385-389.
    Hartwell, C. J., Johnson, C. D., & Posthuma, R. A. (2019). Are we asking the right questions? Predictive validity comparison of four structured interview question types. Journal of Business Research, 100, 122–129. https://doi.org/10.1016/j.jbusres.2019.03.026
    Hickman, L., Bosch, N., Ng, V., Saef, R., Tay, L., & Woo, S. E. (2022). Automated video interview personality assessments: Reliability, validity, and generalizability investigations. The Journal of Applied Psychology, 107(8), 1323–1351. https://doi.org/10.1037/apl0000695
    Hom, P. W., Mitchell, T. R., Lee, T. W., & Griffeth, R. W. (2012). Reviewing employee turnover: Focusing on proximal withdrawal states and an expanded criterion. Psychological Bulletin, 138(5), 831-858. https://doi.org/10.1016/j.hrmr.2009.04.002
    Huffcutt, A. I., Van Iddekinge, C. H., & Roth, P. L. (2011). Understanding applicant behavior in employment interviews: A theoretical model of interviewee performance. Human Resource Management Review, 21(4), 353–367. https://doi.org/10.1016/j.hrmr.2011.05.003
    Hurtz, G. M., & Donovan, J. J. (2000). Personality and job performance: The Big Five revisited. Journal of Applied Psychology, 85(6), 869-870. https://doi.org/10.1037/0021-9010.85.6.869
    Islam, M. F., & Alam, M. J. (2014). Factors influencing Intention to quit or stay in jobs: An empirical study on selected sectors in Bangladesh. Stamford Journal of Business Studies, 6(1), 142-164.
    Jack, R. E., Garrod, O. G. B., & Schyns, P. G. (2014). Dynamic facial expressions of emotion transmit an evolving hierarchy of signals over time. Current Biology, 24(2), 187-192. https://doi.org/10.1016/j.cub.2013.11.064
    Jain, R., & Nayyar, A. (2018). Predicting Employee Attrition using XGBoost Machine Learning Approach. 2018 International Conference on System Modeling & Advancement in Research Trends, Moradabad, India. https://doi.org/10.1109/SYSMART.2018.8746940
    Jaser, Z., Petrakaki, D., Starr, R., & Oyarbide-Magaña, E. (2022). Where automated job interviews fall short. Harvard Business Review. https://hbr.org/2022/01/where-automated-job-interviews-fall-short
    Jones, J. W., Brasher, E. E., & Huff, J. W. (2002). Innovations in integrity‐based personnel selection: Building a technology‐friendly assessment. International Journal of Selection and Assessment, 10(1‐2), 87-97. https://doi.org/10.1111/1468-2389.00195
    Kaiser, H. F. (1974). An index of factorial simplicity. Psychometrika, 39(1), 31-36. https://doi.org/10.1007/BF02291575
    Kaiser, H. F. (1970). A second generation little jiffy. Psychometrika, 35, 401-415. https://doi.org/10.1007/BF02291817
    Koo, K.M., & Cha, E.Y. (2017). Image recognition performance enhancements using image normalization. Human-centric Computing and Information Sciences, 7(1), 1-11. https://doi.org/10.1186/s13673-017-0114-5
    Kraut, A. I. (1975). Predicting turnover of employees from measured job attitudes. Organizational Behavior and Human Performance, 13(2), 233-243. https://doi.org/10.1016/0030-5073(75)90047-1
    LeCun, Y., Kavukcuoglu, K., & Farabet, C. (2010, May). Convolutional networks and applications in vision. Proceedings of 2010 IEEE International Symposium on Circuits and Systems, Paris, France. https://doi.org/10.1109/ISCAS.2010.5537907
    Leng, G. E., & Chin, M. L. C. (2016). Person-job fit, personality, organizational commitment and intention to stay among employees in marketing departments. Jurnal Psikologi Malaysia, 30(1).
    Levashina, J., & Campion, M. A. (2006). A model of faking likelihood in the employment interview. International Journal of Selection and Assessment, 14(4), 299-316. https://doi.org/10.1111/j.1468-2389.2006.00353.x
    Li, S., & Deng, W. H. (2020). Deep facial expression recognition: A survey. Journal of Image and Graphics, 25(11), 2306-2320. https://doi.org/10.11834/jig.200233
    Liu, Y. J., Zhang, J. K., Yan, W. J., Wang, S. J., Zhao, G. & Fu, X. (2016). A Main Directional Mean Optical Flow Feature for Spontaneous Micro-Expression Recognition. IEEE Transactions on Affective Computing, 7(4), 299-310. https://doi.org/10.1109/TAFFC.2015.2485205
    Lilienfeld, S. O. (2012). Public skepticism of psychology: Why many people perceive the study of human behavior as unscientific. American Psychologist, 67(2), 111–129. https://doi.org/10.1037/a0023963
    Maertz, C. P., Jr., & Campion, M. A. (2004). Profiles in quitting: Integrating process and content turnover theory. Academy of Management Journal, 47(4), 566-582. https://doi.org/10.2307/20159602
    Maertz, C. P., Jr., & Griffeth, R. W. (2004). Eight motivational forces and voluntary turnover: A theoretical synthesis with implications for research. Journal of Management, 30(5), 667-683. https://doi.org/10.1016/j.jm.2004.04.001
    Matsugu, M., Mori, K., Mitari, Y., & Kaneda, Y. (2003). Subject independent facial expression recognition with robust face detection using a convolutional neural network. Neural Networks, 16(5-6), 555-559. https://doi.org/10.1016/S0893-6080(03)00115-1
    McLarnon, M. J., DeLongchamp, A. C., & Schneider, T. J. (2019). Faking it! Individual differences in types and degrees of faking behavior. Personality and Individual Differences, 138, 88-95. https://doi.org/10.1016/j.paid.2018.09.024
    Mehendale, N. (2020). Facial emotion recognition using convolutional neural networks (FERC). SN Applied Sciences, 2(3), 1-8. https://doi.org/10.1007/s42452-020-2234-1
    Mejia, C., & Torres, E. N. (2018). Implementation and normalization process of asynchronous video interviewing practices in the hospitality industry. International Journal of Contemporary Hospitality Management, 30(2), 685-701. https://doi.org/10.1108/IJCHM-07-2016-0402
    Merget, D., Rock, M., & Rigoll, G. (2018). Robust facial landmark detection via a fully-convolutional local-global context network. Proceedings of the IEEE conference on computer vision and pattern recognition. https://doi.org/10.1109/CVPR.2018.00088
    Miroslava, B., & Ondrej, I. (2018). Personality traits in relation to the turnover intentions of the qualified employees in the manufacturing industry. Individual & Society/Clovek a Spolocnost, 21(2), 56-63. https://doi.org/10.31577/cas.2018.02.527
    Monaro, M., Maldera, S., Scarpazza, C., Sartori, G., & Navarin, N. (2022). Detecting deception through facial expressions in a dataset of videotaped interviews: A comparison between human judges and machine learning models. Computers in Human Behavior, 127, 107063. https://doi.org/10.1016/j.chb.2021.107063
    Mowday, R. T., Koberg, C. S., & McArthur, A. W. (1984). The psychology of the withdrawal process: A cross-validational test of Mobley's intermediate linkages model of turnover in two samples. Academy of Management Journal, 27, 79-94. https://doi.org/10.5465/255958
    Nancarrow, S., et al. (2014). Intention to stay and intention to leave: Are they two sides of the same coin? A cross-sectional structural equation modelling study among health and social care workers. Journal of Occupational Health. https://doi.org/10.1539/joh.14-0027-OA
    Nunnally, J. C. (1975). Psychometric theory: 25 years ago and now. Educational Researcher, 4(10), 7-21. https://doi.org/10.2307/1175619
    Nunnally, J. C. (1978). Psychometric Theory (2nd ed.). New York: McGraw-Hill
    Ogunfowora, B. (2013). The relationship between personality traits and employee engagement. Journal of Business and Psychology, 28(3), 337-348. https://doi.org/10.1007/s10869-012-9288-0
    Oh, Y.-H., See, J., Le Ngo, A. C., Phan, R. C.-W., & Baskaran, V. M. (2018). A survey of automatic facial micro-expression analysis: databases, methods, and challenges. Frontiers in psychology, 9, 1128. https://doi.org/10.3389/fpsyg.2018.01128
    Organ, D. W. (1994). Personality and organizational citizenship behavior. Journal of management, 20(2), 465-478. https://doi.org/10.1016/0149-2063(94)90023-X
    Pang, B., Nijkamp, E., & Wu, Y. N. (2020). Deep learning with TensorFlow: A review. Journal of Educational and Behavioral Statistics, 45(2), 227-248. https://doi.org/10.3102/1076998619872761
    Picard, R. W. (1997). Affective computing. Cambridge, MA: The MIT Press.
    Pitaloka, D. A., Wulandari, A., Basaruddin, T., & Liliana, D. Y. (2017). Enhancing CNN with preprocessing stage in automatic emotion recognition. Procedia Computer Science, 116, 523-529. https://doi.org/10.1016/j.procs.2017.10.038
    Provost, F., & Fawcett, T. (2001). Robust classification for imprecise environments. Machine learning, 42, 203-231. https://doi.org/10.1023/A:1007601015854
    Pursche, T., Clauß, R., Tibken, B., & Möller, R. (2019). Using neural networks to enhance the quality of ROIs for video based remote heart rate measurement from human faces. 2019 IEEE International Conference on Consumer Electronics (ICCE), Las Vegas, NV, USA. https://doi.org/10.1109/ICCE.2019.8661915.
    Mehrabian, R. (1982). Rapid solidification. International Metals Reviews, 27(1), 185-208. https://doi.org/10.1179/imr.1982.27.1.185
    Rasipuram, S., & Jayagopi, D. B. (2016, October). Asynchronous video interviews vs. face-to-face interviews for communication skill measurement: a systematic study. In Proceedings of the 18th ACM international conference on multimodal interaction, 370-377. https://doi.org/10.1145/2993148.2993183
    Raza, A. (2022). Predicting employee attrition using machine learning approaches. Applied Sciences, 12(13), 2-17. https://doi.org/10.3390/app12136424
    Rosenberg, E. L., Ekman, P., Jiang, W., Babyak, M., Coleman, R. E., Hanson, M., O'Connor, C., Waugh, R., & Blumenthal, J. A. (2001). Linkages between facial expressions of anger and transient myocardial ischemia in men with coronary artery disease. Emotion, 1(2), 107–115. https://doi.org/10.1037/1528-3542.1.2.107
    Rush K.L., Adamack M., Gordon J., Lilly M. & Janke R. (2013). Best practices of formal new graduate nurse transition programs: an integrative review. International Journal of Nursing Studies 50 (3), 345–356. https://doi.org/10.1016/j.ijnurstu.2012.06.009
    Scherer, L. L., Shaffer, M. A., & Ziemek, S. (2013). The relationship between the Big Five personality traits and job performance: A meta-analysis. Journal of Applied Psychology, 98(5), 869–881. https://doi.org/10.1037/a0033695
    Shreve, M., Godavarthy, S., Goldgof, D., & Sarkar, S. (2011, March). Macro- and micro-expression spotting in long videos using spatiotemporal strain. 2011 IEEE International Conference on Automatic Face & Gesture Recognition, 51-56. https://doi.org/10.1109/FG.2011.5771451
    Schmidt, F. L., & Hunter, J. E. (1998). The validity and utility of selection methods in personnel psychology: Practical and theoretical implications of 85 years of research findings. Psychological Bulletin, 124(2), 262–274. https://doi.org/10.1037/0033-2909.124.2.262
    Srivastava, N., Hinton, G., Krizhevsky, A., Sutskever, I., & Salakhutdinov, R. (2014). Dropout: A simple way to prevent neural networks from overfitting. Journal of Machine Learning Research, 15(1), 1929-1958.
    Su, Y. S., Suen, H. Y., & Hung, K. E. (2021). Predicting Behavioral Competencies Automatically from Facial Expressions in Real-Time Video Recorded Interviews. Journal of Real-Time Image Processing, 18(4), 1011-1021. https://doi.org/10.1007/s11554-021-01071-5
    Suen, H. Y., Chen, M. Y. C., & Lu, S. H. (2019). Does the use of synchrony and artificial intelligence in video interviews affect interview ratings and applicant attitudes?. Computers in Human Behavior, 98, 93-101. https://doi.org/10.1016/j.chb.2019.04.012
    Suen, H. Y., Hung, K. E., & Lin, C. L. (2020). Intelligent video interview agent used to predict communication skill and perceived personality traits. Human-centric Computing and Information Sciences, 10(1), 1-12. https://doi.org/10.1186/s13673-020-0208-3
    Sung, K. K., & Poggio, T. (1998). Example-based learning for view-based human face detection. IEEE Transactions on pattern analysis and machine intelligence, 20(1), 39-51. https://doi.org/10.1109/34.655648
    Pfister, T., Li, X., Zhao, G., & Pietikäinen, M. (2011). Recognising spontaneous facial micro-expressions. 2011 International Conference on Computer Vision, 1449-1456. https://doi.org/10.1109/ICCV.2011.6126401
    Tabachnick, B. G., & Fidell, L. S. (2013). Using multivariate statistics (6th ed.). Pearson.
    Takalkar, M., Xu, M., Wu, Q., & Chaczko, Z. (2018). A survey: facial micro-expression recognition. Multimedia Tools and Applications, 77(15), 19301–19325. https://doi.org/10.1007/s11042-017-5317-2
    Tett, R. P., & Meyer, J. P. (1993). Job satisfaction, organizational commitment, turnover intention, and turnover: path analyses based on meta‐analytic findings. Personnel psychology, 46(2), 259-293. https://doi.org/10.1111/j.1744-6570.1993.tb00874.x
    Torres, E. N., & Mejia, C. (2017). Asynchronous video interviews in the hospitality industry: Considerations for virtual employee selection. International Journal of Hospitality Management, 61, 4-13. https://doi.org/10.1016/j.ijhm.2016.10.012
    Trautmann, S. A., Fehr, T., & Herrmann, M. (2009). Emotions in motion: Dynamic compared to static facial expressions of disgust and happiness reveal more widespread emotion-specific activations. Brain Research, 1284, 100-115. https://doi.org/10.1016/j.brainres.2009.05.075
    Viola, P., & Jones, M. J. (2004). Robust real-time face detection. International Journal of Computer Vision, 57(2), 137-154. https://doi.org/10.1023/B:VISI.0000013087.49260.fb
    Waller, B. M., Whitehouse, J., & Micheletta, J. (2017). Rethinking primate facial expression: A predictive framework. Neuroscience & Biobehavioral Reviews, 82, 13-21. https://doi.org/10.1016/j.neubiorev.2016.09.005
    Walther, J. B. (2011). Theories of computer-mediated communication and interpersonal relations. In M.L. Knapp, J.A. Daly (Ed.), The Handbook of Interpersonal Communication (4th ed., pp.443–479). SAGE.
    Wang, P., & Ji, Q. (2004, August). Multi-view face detection under complex scene based on combined SVMs. In Proceedings of the International Conference on Pattern Recognition (ICPR) (Vol. 4, pp. 179-182).
    Wang, S. J., Chen, H. L., Yan, W. J., Chen, Y. H., & Fu, X. (2014). Face recognition and micro-expression recognition based on discriminant tensor subspace analysis plus extreme learning machine. Neural processing letters, 39(1), 25-43. https://doi.org/10.1007/s11063-013-9288-7
    Warren, G., Schertler, E., & Bull, P. (2009). Detecting deception from emotional and unemotional cues. Journal of Nonverbal Behavior, 33, 59–69. https://doi.org/10.1007/s10919-008-0057-7
    Wiles, O., Koepke, A., & Zisserman, A. (2018). Self-supervised learning of a facial attribute embedding from video. https://doi.org/10.48550/arXiv.1808.06882
    Yaktin, U. S., Azoury, N. B., & Doumit, A. A. (2003). Personal characteristics and job satisfaction among nurses in Lebanon. Journal of Nursing Administration, 33(7), 384-390. https://doi.org/10.1097/00005110-200307000-00006
    Yamashita, R., Nishio, M., Do, R. K. G., & Togashi, K. (2018). Convolutional neural networks: an overview and application in radiology. Insights into Imaging, 9(4), 611-629. https://doi.org/10.1007/s13244-018-0639-9
    Yang, S., & Bhanu, B. (2012). Understanding discrete facial expressions in video using an emotion avatar image. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), 42(4), 980-992. https://doi.org/10.1109/TSMCB.2012.2192269
    Yeasin, M., Bullot, B., & Sharma, R. (2006). Recognition of facial expressions and measurement of levels of interest from video. IEEE Transactions on Multimedia, 8(3), 500-508. https://doi.org/10.1109/TMM.2006.870737
    Yudin, D.A., Dolzhenko, A.V., & Kapustina, E.O. (2020). The Usage of Grayscale or Color Images for Facial Expression Recognition with Deep Neural Networks. In: Kryzhanovsky, B., Dunin-Barkowski, W., Redko, V., Tiumentsev, Y. (Eds), Advances in Neural Computation, Machine Learning, and Cognitive Research III. NEUROINFORMATICS 2019. Studies in Computational Intelligence, Springer, Cham. https://doi.org/10.1007/978-3-030-30425-6_32
    Zhang, K., Wang, W., Lv, Z., Jin, L., Liu, D., Wang, M., & Lv, Y. (2022). A CNN-based regression framework for estimating coal ash content on microscopic images. Measurement, 189, 110589. https://doi.org/10.1016/j.measurement.2021.110589

    Full-text access: Electronic full text embargoed until 2029/01/09.