2012 IEEE Conference on Computer Vision and Pattern Recognition
DOI: 10.1109/cvpr.2012.6247951
Hidden Markov Models (HMMs) are among the most important and widely used techniques for dealing with sequential or temporal data. Their applications in computer vision range from action/gesture recognition and shape analysis to video surveillance. Although HMMs are often embedded in complex frameworks, this paper focuses on theoretical aspects of HMM learning. We propose a regularized algorithm for learning HMMs in the spectral framework, whose optimization involves no local minima. Compared with recently proposed spectral algorithms for HMMs, our method is guaranteed to produce probability values that are always physically meaningful and that, on synthetic mathematical models, closely approximate the true probability values. Furthermore, we place no restriction on the number of symbols or the number of states. On various pattern recognition data sets, our algorithm consistently outperforms classical HMMs in both accuracy and computational speed. This, together with the fact that HMMs are used in vision as building blocks for more powerful classification approaches, such as generative embedding approaches or more complex generative models, strongly supports spectral HMMs (SHMMs) as a new basic tool for pattern recognition.
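To make the spectral framework concrete, the following is a minimal sketch of spectral HMM learning in the style of the baseline observable-operator approach (Hsu, Kakade and Zhang, 2009) that this line of work builds on; it is an illustrative assumption, not the authors' regularized algorithm. Rather than sampling sequences, the low-order observation statistics (unigram, bigram, trigram) are computed exactly from a known synthetic HMM, so the recovered spectral parameters reproduce the true sequence probabilities without any local-minima-prone EM iterations.

```python
import numpy as np

def spectral_hmm_params(P1, P21, P3x1, m):
    """Build observable-operator parameters from the unigram vector P1,
    the bigram matrix P21[x2, x1], and the trigram slices P3x1[x][x3, x1];
    m is the number of hidden states."""
    U = np.linalg.svd(P21)[0][:, :m]           # top-m left singular vectors
    pinv_UP21 = np.linalg.pinv(U.T @ P21)
    b1 = U.T @ P1                              # initial vector
    binf = np.linalg.pinv(P21.T @ U) @ P1      # normalization vector
    B = [U.T @ P3x1[x] @ pinv_UP21 for x in range(len(P1))]  # one operator per symbol
    return b1, binf, B

def spectral_prob(seq, b1, binf, B):
    """Pr[x_1 .. x_t] = binf^T B_{x_t} ... B_{x_1} b1."""
    v = b1
    for x in seq:
        v = B[x] @ v
    return float(binf @ v)

def forward_prob(seq, pi, T, O):
    """Exact HMM probability via the forward recursion, for comparison."""
    alpha = O[seq[0]] * pi
    for x in seq[1:]:
        alpha = O[x] * (T @ alpha)
    return float(alpha.sum())

# Synthetic HMM (hypothetical parameters): 2 states, 3 symbols,
# with column-stochastic transition matrix T and emission matrix O.
pi = np.array([0.6, 0.4])
T = np.array([[0.7, 0.2],
              [0.3, 0.8]])
O = np.array([[0.5, 0.1],
              [0.3, 0.3],
              [0.2, 0.6]])

# Exact low-order statistics of the model.
P1 = O @ pi
P21 = O @ T @ np.diag(pi) @ O.T
P3x1 = [O @ T @ np.diag(O[x]) @ T @ np.diag(pi) @ O.T for x in range(3)]

b1, binf, B = spectral_hmm_params(P1, P21, P3x1, m=2)

seq = [0, 2, 1, 2]
print(abs(spectral_prob(seq, b1, binf, B) - forward_prob(seq, pi, T, O)))
```

With exact statistics and the usual rank conditions satisfied, the spectral estimate matches the forward-algorithm probability up to numerical precision; with statistics estimated from finite samples, the plain spectral method can emit negative "probabilities", which is the pathology the paper's regularized variant is designed to avoid.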