
Author: 葉致偉 (Yeh, Chih-Wei)
Thesis Title: 智慧型仿鏡互動顯示裝置
Magic Mirror: A Research on Smart Display Devices
Advisor: 廖文宏 (Liao, Wen-Hung)
Degree: Master
Department: College of Science, Department of Computer Science
Year of Publication: 2006
Academic Year of Graduation: 94 (ROC calendar, i.e., 2005-2006)
Language: English
Number of Pages: 50
Chinese Keywords: 人機介面、智慧顯示裝置、智慧家電、肢體人機介面
English Keywords: Human Computer Interface, Smart Display Device, Smart Furniture, Gesture User Interface
  • (Chinese abstract, translated) Gesture-based human-computer interfaces have long been regarded as emblematic of the future home interface. However, owing to the lack of suitable application environments and recognition technologies, related applications have yet to mature. This research proposes an interactive mirror-like display device as a platform for gesture commands, together with the associated recognition techniques, in order to design an ergonomic interactive display device for the smart-home environment.


    Gesture-based user interfaces have long been associated with the image of future technology. However, due to the lack of proper environments and recognition technologies, practical applications of intelligent user interfaces are still rare in modern life. In this research, we propose an interactive mirror that can be controlled by gesture commands. We also provide several recognition techniques for this interactive display device. Practical applications are developed on this smart mirror, and a user test is conducted to evaluate this novel user interface.
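The full thesis text is not included in this record, but the skin-color face filter listed in Section 2.1.2 of the table of contents is typically realized as a per-pixel chrominance test. The sketch below is a generic CbCr-box classifier; the conversion is standard ITU-R BT.601, and the default thresholds are the commonly cited Chai-Ngan box from the skin-detection literature surveyed in [17], not necessarily the values used in the thesis.

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    """Convert an RGB array of shape (H, W, 3) to YCbCr (ITU-R BT.601, full range)."""
    r = rgb[..., 0].astype(np.float64)
    g = rgb[..., 1].astype(np.float64)
    b = rgb[..., 2].astype(np.float64)
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return np.stack([y, cb, cr], axis=-1)

def skin_mask(rgb_image, cb_range=(77, 127), cr_range=(133, 173)):
    """Boolean mask of likely skin pixels using a fixed CbCr box.

    The default thresholds are an assumption taken from the general
    skin-color literature, not from the thesis itself.
    """
    ycbcr = rgb_to_ycbcr(rgb_image)
    cb, cr = ycbcr[..., 1], ycbcr[..., 2]
    return ((cb >= cb_range[0]) & (cb <= cb_range[1]) &
            (cr >= cr_range[0]) & (cr <= cr_range[1]))
```

In a pipeline like the one outlined in Chapter 2, such a mask can cheaply reject false positives from an AdaBoost face detector: a candidate window whose skin-pixel ratio falls below some threshold is discarded before tracking begins.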

    CHAPTER 1 INTRODUCTION 1
    1.1 The Magic Mirror 2
    1.2 Related Work 5
    1.2.1 Mirror TV 6
    1.2.2 Home Appliances Controlled by Intelligent User Interfaces 7
    1.3 Overview 9
    CHAPTER 2 FACE DETECTION AND TRACKING 10
    2.1 Face Detection 10
    2.1.1 AdaBoosting Face Detector 11
    2.1.2 Skin-Color Face Filter 13
    2.2 Face Tracking 16
    2.2.1 Mean Shift Tracker 16
    2.2.2 Local Feature Tracker 18
    2.3 Performance Evaluation 20
    CHAPTER 3 GESTURE RECOGNITION 25
    3.1 Gesture Commands 25
    3.2 Gesture Pointing 27
    3.3 Gesture Selection 31
    3.3.1 Nodding and Head-shaking Detection 31
    CHAPTER 4 INFORMATION FUSION AND SPECIAL EFFECTS 33
    4.1 Information Fusion 33
    4.1.1 Message of The Day 33
    4.1.2 News Clip Playback 34
    4.2 Special Effects 35
    4.2.1 Wear-a-Tie 35
    4.2.2 Facial Complexion Enhancement 36
    CHAPTER 5 DISPLAY MANAGER 39
    5.1 Display Guidelines 39
    5.2 Attentive Levels 41
    CHAPTER 6 USER TEST 43
    6.1 The Response Time Test 43
    CHAPTER 7 CONCLUSION AND FUTURE WORK 46
    REFERENCE 48
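The mean shift tracker named in Section 2.2.1 (after Cheng [19] and Bradski [20]) rests on one core update: repeatedly move an estimate to the mean of the samples inside a window around it, which converges to a local density mode. A minimal one-dimensional illustration of that update, not the thesis implementation (which tracks a 2-D color histogram back-projection):

```python
import numpy as np

def mean_shift_1d(samples, start, bandwidth=1.0, tol=1e-4, max_iter=100):
    """Seek the nearest density mode with the flat-kernel mean-shift update."""
    x = float(start)
    for _ in range(max_iter):
        # Collect samples inside the current window.
        window = samples[np.abs(samples - x) <= bandwidth]
        if window.size == 0:
            break
        new_x = window.mean()   # the mean-shift step
        if abs(new_x - x) < tol:
            return new_x        # converged to a mode
        x = new_x
    return x
```

Started near either of two sample clusters, the estimate converges to that cluster's mode; in the 2-D tracking case the same iteration follows the face's skin-probability peak between frames.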

    [1] E. Lee, T. Nakra Marrin and J. Borchers, "You're the conductor: A realistic interactive conducting system for children," in Proc. 2004 Int. Conf. on New Interfaces for Musical Expression, Hamamatsu, Japan, pp. 68-73, Jun. 2004.
    [2] V. Henderson, S. Lee, H. Brashear, H. Hamilton, T. Starner and S. Hamilton, "Development of an American Sign Language game for deaf children," in Proc. 4th Int. Conf. on Interaction Design and Children, Boulder, CO, USA, pp. 70-79, Jun. 2005.
    [3] L. Zhang, Y. Chen, G. Fang, X. Chen and W. Gao, "A vision-based sign language recognition system using tied-mixture density HMM," in Proc. 6th Int. Conf. on Multimodal Interfaces, State College, PA, USA, pp. 198-204, Oct. 2004.
    [4] W. Freeman and C. Weissman, "Television control by hand gestures," in Proc. IEEE Int. Workshop on Automatic Face and Gesture Recognition, pp. 179-183, Jun. 1995.
    [5] A. Wilson and N. Oliver, "GWindows: robust stereo vision for gesture-based control of windows," in Proc. 5th Int. Conf. on Multimodal Interfaces, Vancouver, British Columbia, Canada, pp. 211-218, Nov. 5-7, 2003.
    [6] R. Vertegaal, "Attentive user interfaces: Introduction," Commun. ACM, vol. 46, pp. 30-33, 2003.
    [7] http://www.research.philips.com/technologies/display/mrrordisp/mirrortv/
    [8] http://www.seuratvmirror.com
    [9] T. Darrell, G. Gordon, J. Woodfill and M. Harville, "A Virtual Mirror Interface using Real-time Robust Face Tracking," in Proc. 3rd IEEE Int. Conf. on Automatic Face and Gesture Recognition, Nara, Japan, pp. 616-621, Apr. 14-16, 1998.
    [10] W. Liao and T. Y. Li, "DD Theme Party: An Interactive Multimedia Showcase," in Proc. 2005 Int. Conf. on Digital Archive Technologies, Taipei, Taiwan, pp. 279-280, Jun. 16-17, 2005.
    [11] J. Goto, K. Komine, Y. Kim and N. Uratani, "A Television Control System based on Spoken Natural Language Dialogue," in Proc. 9th IFIP TC13 Int. Conf. on Human-Computer Interaction, Zürich, Switzerland, pp. 765-768, Sep. 1-5, 2003.
    [12] J.S. Shell, R. Vertegaal and A.W. Skaburskis, "EyePliances: attention-seeking devices that respond to visual attention," in extended abstracts of 2003 ACM SIGCHI Conf. on Human Factors in Computing Systems, Fort Lauderdale, FL, USA, pp. 770-771, Apr. 2003.
    [13] R.J. Orr and G.D. Abowd, "The smart floor: a mechanism for natural user identification and tracking," in extended abstracts of 2000 ACM SIGCHI Conf. on Human Factors in Computing Systems, The Hague, Netherlands, pp. 275-276, Apr. 1-6, 2000.
    [14] M. Yang, D.J. Kriegman and N. Ahuja, "Detecting faces in images: a survey," IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 24, pp. 34-58, Jan 2002.
    [15] P. Viola and M.J. Jones, "Robust Real-Time Face Detection," Int. J. Computer Vision, vol. 57, pp. 137-154, 2004.
    [16] S. Baluja, M. Sahami and H.A. Rowley, "Efficient Face Orientation Discrimination," in Proc. 2004 Int. Conf. on Image Processing, Singapore, vol. 1, pp. 589-592, Oct. 24-27, 2004.
    [17] B.D. Zarit, B.J. Super and F.K.H. Quek, "Comparison of Five Color Models in Skin Pixel Classification," in Proc. Int. Workshop on Recognition, Analysis, and Tracking of Faces and Gestures in Real-Time Systems, Corfu, Greece, p. 58, Sep. 26-27, 1999.
    [18] J. Yang, R. Stiefelhagen, U. Meier and A. Waibel, "Visual tracking for multimodal human computer interaction," in Proc. 1998 ACM SIGCHI Conf. on Human Factors in Computing Systems, Los Angeles, CA, USA, pp. 140-147, Apr. 18-23, 1998.
    [19] Y. Cheng, "Mean Shift, Mode Seeking, and Clustering," IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 17, pp. 790-799, 1995.
    [20] G.R. Bradski, "Real Time Face and Object Tracking as a Component of a Perceptual User Interface," in Proc. 4th IEEE Workshop on Applications of Computer Vision, p. 214, Oct. 1998.
    [21] J.L. Barron, D.J. Fleet and S.S. Beauchemin, "Performance of optical flow techniques," Int. J. Computer Vision, vol. 12, pp. 43-77, 1994.
    [22] F. Bourel, C.C. Chibelushi and A.A. Low, "Robust Facial Feature Tracking," in Proc. 11th British Machine Vision Conference, Bristol, UK, vol. 1, pp. 232-241, Sep. 2000.
    [23] B.D. Lucas and T. Kanade, "An iterative image registration technique with an application to stereo vision," in Proc. 1981 Int. Joint Conf. on Artificial Intelligence, pp. 674-679, 1981.
    [24] J. Shi and C. Tomasi, "Good features to track," in Proc. 1994 IEEE Conf. on Computer Vision and Pattern Recognition, Seattle, Washington, USA, pp. 593-600, Jun. 20-24, 1994.
    [25] J. Letessier and F. Bérard, "Visual Tracking of Bare Fingers for Interactive Surfaces," in Proc. 17th Annual ACM Symposium on User Interface Software and Technology, Santa Fe, New Mexico, USA, pp. 119-122, Oct. 24-27, 2004.
    [26] S. Kawato and J. Ohya, "Real-time Detection of Nodding and Head-shaking by Directly Detecting and Tracking the "Between-Eyes"," in Proc. 4th Int. Conf. on Automatic Face and Gesture Recognition, Grenoble, France, pp. 40-45, Mar. 28-30, 2000.
    [27] A. Kapoor and R.W. Picard, "Real-Time Head Nod and Shake Detector," in Proc. 2001 Workshop on Perceptive User Interfaces, Orlando, Florida, USA, pp. 1-5, Nov. 15-16, 2001.
