Gesture recognition in a meeting environment

Hassink, N. and Schopman, M.G. (2006) Gesture recognition in a meeting environment.

Abstract: This thesis describes a research project on gesture recognition in a meeting environment. In this project we want to determine where the challenges in gesture recognition lie and what recognition performance can be achieved when existing machine learning techniques are applied in a real-life setting such as meetings. The research is split into four parts. The first part is feature selection, which encompasses analyzing meetings for useful gestures, annotating those gestures, parameterizing them with candidate features, and selecting the most useful features. The second part is segmentation, the process of automatically locating gestures in a meeting. Two segmentation approaches are examined, whole-gesture segmentation and gesture-part segmentation, and two methods, BIC and AM, are compared for each approach. The third part is feature clustering, the mapping of continuous data to discrete data; two methods, K-Means and Expectation Maximization, are compared for this purpose. The final part is classification, labeling data segments with the correct gesture label; hidden Markov models are used for classification. The main goal is to compare the classification performance on annotated gestures with the classification performance on automatically segmented gestures. From these four phases follow the project's conclusions and recommendations.
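The pipeline the abstract describes, clustering continuous features into discrete symbols and then scoring symbol sequences with a hidden Markov model, can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the feature values, model sizes, and all probabilities are invented for the example, and a plain 1-D K-Means with the forward algorithm stands in for the full K-Means/EM and HMM machinery.

```python
import math
import random

def kmeans(points, k, iters=20, seed=0):
    """Cluster 1-D feature values into k discrete symbols (Lloyd's algorithm)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assignment step: each point goes to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k), key=lambda i: (p - centers[i]) ** 2)
            clusters[idx].append(p)
        # Update step: move each center to the mean of its cluster
        # (keep the old center if a cluster ends up empty).
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    # Discretize: emit the cluster index (symbol) for each point.
    return [min(range(k), key=lambda i: (p - centers[i]) ** 2) for p in points]

def forward_loglik(obs, start, trans, emit):
    """Log-likelihood of a discrete observation sequence under an HMM
    (forward algorithm, no scaling; fine for short sequences)."""
    n = len(start)
    alpha = [start[s] * emit[s][obs[0]] for s in range(n)]
    for o in obs[1:]:
        alpha = [emit[j][o] * sum(alpha[i] * trans[i][j] for i in range(n))
                 for j in range(n)]
    return math.log(sum(alpha))

# Hypothetical hand-movement features: two clearly separated groups.
features = [0.1, 0.2, 0.15, 5.0, 5.2, 4.9]
symbols = kmeans(features, k=2)

# Toy 2-state HMM over the 2 symbols (illustrative numbers only).
start = [0.6, 0.4]
trans = [[0.7, 0.3], [0.4, 0.6]]
emit  = [[0.9, 0.1], [0.2, 0.8]]
score = forward_loglik(symbols, start, trans, emit)
# Classification would train one such HMM per gesture class and
# label a segment with the class whose model scores it highest.
```

In the full system one HMM would be trained per gesture, and the comparison of annotated versus automatically segmented input happens at this scoring stage.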
Item Type: Essay (Master)
Faculty: EEMCS: Electrical Engineering, Mathematics and Computer Science
Subject: 54 computer science
Programme: Human Media Interaction MSc (60030)
Link to this item: http://purl.utwente.nl/essays/56211