| Graduate student: | 陳凱恩 Chen, Kai-En |
|---|---|
| Thesis title: | 於擴增實境中虛擬物件的光照與陰影之擬真 Lighting and Shadowing for Virtual Objects in Augmented Reality |
| Advisor: | 張鈞法 Chang, Chun-Fa |
| Degree: | 碩士 Master |
| Department: | 資訊工程學系 Department of Computer Science and Information Engineering |
| Year of publication: | 2019 |
| Academic year of graduation: | 107 |
| Language: | Chinese |
| Number of pages: | 31 |
| Chinese keywords: | 擴增實境、環景圖、ARCore |
| English keywords: | Augmented Reality, ARCore, Equirectangular Panorama |
| DOI URL: | http://doi.org/10.6345/NTNU201900600 |
| Thesis type: | Academic thesis |
As mobile hardware has become lighter, gained more sensing capabilities, and improved in processing power, augmented reality has become a common feature of mobile applications in recent years. People use it to experience situations that do not exist in real life, to interact with imaginary creatures, or as a tool to measure real objects and simulate interior layouts. The closer these applications come to everyday life, the more noticeable the mismatch between virtual objects and the real environment becomes: virtual objects still live in a virtual world, and how much data about the real environment can be acquired directly determines how well the virtual and the real blend. Modern cameras combined with machine learning try to extract more environment information in real time, but compared with the illumination accumulated from countless light rays in the real world, analyzing only part of the camera view still leaves visible flaws in the shadows and surfaces of virtual objects. Offline environment analysis, by contrast, loses accuracy as the scene changes because its data is static, but a more complete environment image fills in the missing information and lets the rendering process account for the influence of the entire real environment.
This thesis investigates how to make virtual objects appear plausibly in the real camera view. We use ARCore to analyze the environment and reconstruct a virtual world in real time, pass that spatial information to OpenGL ES to render the models, and use an equirectangular panorama to supplement the missing environment data. Even when the camera captures only part of the scene, the influence of the entire environment on the virtual objects can still be computed during real-time rendering, further improving realism.
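The abstract only names the components, but as a rough illustration of how ARCore's per-frame light estimate can feed an OpenGL ES shader, the following Java sketch reads the ambient color correction each frame and uploads it as a uniform. The class, method, and uniform names are placeholders and are not taken from the thesis.

```java
import android.opengl.GLES20;
import com.google.ar.core.Frame;
import com.google.ar.core.LightEstimate;

public final class LightEstimateBridge {
    // Hypothetical uniform name; the shader interface used in the thesis is not shown.
    private static final String COLOR_CORRECTION_UNIFORM = "u_ColorCorrection";

    /**
     * Reads ARCore's ambient light estimate from the current frame and uploads it to the
     * given OpenGL ES shader program as a vec4: (r, g, b, average pixel intensity).
     */
    public static void applyLightEstimate(Frame frame, int shaderProgram) {
        LightEstimate estimate = frame.getLightEstimate();
        if (estimate.getState() != LightEstimate.State.VALID) {
            return; // keep the previous frame's lighting when the estimate is unusable
        }

        float[] colorCorrection = new float[4];
        estimate.getColorCorrection(colorCorrection, 0); // ARCore fills (r, g, b, pixelIntensity)

        GLES20.glUseProgram(shaderProgram);
        int location = GLES20.glGetUniformLocation(shaderProgram, COLOR_CORRECTION_UNIFORM);
        GLES20.glUniform4fv(location, 1, colorCorrection, 0); // shader scales its lighting by this value
    }
}
```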
With the development of mobile devices, augmented reality has become more common in daily life. As augmented reality emerges as a key technology, the problem of combining virtual objects with the real world becomes obvious. An augmented reality system must detect the real objects in the environment so that people can interact with this enhanced version of reality.
In recent years, developers have tried to analyze the environment in real time. However, with only the camera of a mobile device, it is hard to account for the lighting of the whole environment from a partial view. Therefore, we use panorama images to obtain a complete light estimate and improve reflection effects.
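As a sketch of what a complete light estimate from a panorama can mean in the simplest case, the following Java snippet (an illustration under our own assumptions, not the thesis's exact pipeline) averages an equirectangular image into a single ambient RGB term, weighting each row by sin(θ) so that pixels near the poles, which cover less solid angle, do not dominate.

```java
/**
 * Averages an equirectangular panorama into a single ambient RGB value.
 * Rows are weighted by sin(theta) because rows near the poles cover a
 * smaller solid angle on the sphere than rows near the equator.
 */
public final class PanoramaAmbient {
    /** 'pixels' is assumed to be row-major RGB data in [0, 1] with length width * height * 3. */
    public static float[] averageAmbient(float[] pixels, int width, int height) {
        double[] sum = new double[3];
        double weightSum = 0.0;
        for (int row = 0; row < height; row++) {
            // Polar angle at the center of this row: 0 at the top (zenith), PI at the bottom.
            double theta = Math.PI * (row + 0.5) / height;
            double weight = Math.sin(theta);
            for (int col = 0; col < width; col++) {
                int base = (row * width + col) * 3;
                sum[0] += weight * pixels[base];
                sum[1] += weight * pixels[base + 1];
                sum[2] += weight * pixels[base + 2];
            }
            weightSum += weight * width;
        }
        return new float[] {
            (float) (sum[0] / weightSum),
            (float) (sum[1] / weightSum),
            (float) (sum[2] / weightSum)
        };
    }
}
```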
In this thesis, we analyze equirectangular panorama images and store them as offline environment-lighting data. We then detect the environment in real time and reconstruct a virtual world with ARCore. Finally, we use OpenGL ES to render the models from these data, improving the perceived realism of the virtual objects.
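For reflection lookups into an equirectangular panorama, a world-space direction must be converted to texture coordinates. The axis convention is not spelled out in the abstract, so the following sketch assumes a y-up coordinate system with the zenith at the top of the image.

```java
/** Maps a direction vector to equirectangular UV coordinates in [0, 1] (y-up convention assumed). */
public final class EquirectangularLookup {
    public static float[] directionToUv(float x, float y, float z) {
        float len = (float) Math.sqrt(x * x + y * y + z * z);
        x /= len; y /= len; z /= len;
        double longitude = Math.atan2(z, x);           // angle around the y axis, in [-PI, PI]
        double latitude = Math.asin(y);                // angle above the horizon, in [-PI/2, PI/2]
        float u = (float) (longitude / (2.0 * Math.PI) + 0.5);
        float v = (float) (0.5 - latitude / Math.PI);  // v = 0 at the zenith, v = 1 at the nadir
        return new float[] { u, v };
    }
}
```

In a fragment shader, the same mapping would be applied to the reflected view direction before sampling the panorama texture.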